DOE Office of Scientific and Technical Information (OSTI.GOV)
Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.
Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigation are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.
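As a rough illustration of the kind of data-to-threat-model mapping described above, the sketch below checks which ATT&CK techniques are covered by an analyst's available data sources and where gaps remain. The technique IDs are real ATT&CK identifiers, but the data-source requirements and the available-data inventory are invented for illustration, not taken from the paper.

```python
# Hypothetical mapping of ATT&CK techniques to the data sources needed to
# investigate them; the inventory of available data is also hypothetical.
attack_data_needs = {
    "T1566 Phishing":                          {"email gateway logs", "proxy logs"},
    "T1078 Valid Accounts":                    {"authentication logs", "VPN logs"},
    "T1059 Command and Scripting Interpreter": {"endpoint process logs"},
    "T1041 Exfiltration Over C2 Channel":      {"netflow", "proxy logs"},
}
available_data = {"proxy logs", "authentication logs", "netflow"}

for technique, needed in attack_data_needs.items():
    missing = needed - available_data
    status = "covered" if not missing else "gap: missing " + ", ".join(sorted(missing))
    print(f"{technique:42s} {status}")
```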
Exploring the Potential of Predictive Analytics and Big Data in Emergency Care.
Janke, Alexander T; Overbeek, Daniel L; Kocher, Keith E; Levy, Phillip D
2016-02-01
Clinical research often focuses on resource-intensive causal inference, whereas the potential of predictive analytics with constantly increasing big data sources remains largely unexplored. Basic prediction, divorced from causal inference, is much easier with big data. Emergency care may benefit from this simpler application of big data. Historically, predictive analytics have played an important role in emergency care as simple heuristics for risk stratification. These tools generally follow a standard approach: parsimonious criteria, easy computability, and independent validation with distinct populations. Simplicity in a prediction tool is valuable, but technological advances make it no longer a necessity. Emergency care could benefit from clinical predictions built using data science tools with abundant potential input variables available in electronic medical records. Patients' risks could be stratified more precisely with large pools of data and lower resource requirements for comparing each clinical encounter to those that came before it, benefiting clinical decisionmaking and health systems operations. The largest value of predictive analytics comes early in the clinical encounter, in which diagnostic and prognostic uncertainty are high and resource-committing decisions need to be made. We propose an agenda for widening the application of predictive analytics in emergency care. Throughout, we express cautious optimism because there are myriad challenges related to database infrastructure, practitioner uptake, and patient acceptance. The quality of routinely compiled clinical data will remain an important limitation. Complementing big data sources with prospective data may be necessary if predictive analytics are to achieve their full potential to improve care quality in the emergency department. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier
2017-03-22
We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
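For context, the sketch below shows two of the standard disproportionality statistics used in this kind of ADE signal detection, the proportional reporting ratio (PRR) and the reporting odds ratio (ROR), computed from a 2x2 contingency table of drug/event co-occurrence counts. The counts are invented, and the paper's exact signal-score formula is not specified here; this is only the generic calculation.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio.
    a: reports mentioning the drug and the event
    b: reports mentioning the drug without the event
    c: reports mentioning the event without the drug
    d: reports mentioning neither"""
    return (a / (a + b)) / (c / (c + d))

def ror(a, b, c, d):
    """Reporting odds ratio for the same 2x2 table."""
    return (a * d) / (b * c)

# Hypothetical MEDLINE co-occurrence counts for one drug-event pair.
a, b, c, d = 42, 958, 310, 48690
print(f"PRR = {prr(a, b, c, d):.2f}, ROR = {ror(a, b, c, d):.2f}")
```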
ANALYTICAL TOOLS FOR GROUNDWATER POLLUTION ASSESSMENT
This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of groundwater buffer strips. The indices describe the leaching of solutes below the root zone (mass f...
This presentation will provide a conceptual preview of an Area of Review (AoR) tool being developed by EPA’s Office of Research and Development that applies analytic and semi-analytical mathematical solutions to elucidate potential risks associated with geologic sequestration of ...
Total analysis systems with Thermochromic Etching Discs technology.
Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel
2014-12-16
A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED comprises a number of attractive features such as track independency, selective irradiation, a high power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities to design new compact disc-based total analysis systems applicable in chemistry and life sciences. In this paper, TED analytical implementation is described and discussed, and their analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are herein addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is herein demonstrated, describing how to exploit this tool for developing truly integrated analytical systems that provide solutions within the point of care framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ragan, Eric D; Goodall, John R
2014-01-01
Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data
NASA Astrophysics Data System (ADS)
Jern, Mikael
Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels aimed at measuring and collaborating, through statistical indicators, on economic, social and environmental developments and at engaging both statisticians and the public in such activities. Given this global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use GeoAnalytics collaborative tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet. For example, dynamic web-enabled animation enables statisticians to explore temporal, spatial and multivariate demographics data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of expert domains and are precious in a creative analytics reasoning process. In this context, we introduce a demonstrator “OECD eXplorer”, a customized tool for interactively analyzing, and collaborating on, gained insights and discoveries based on a novel story mechanism that captures, re-uses and shares task-related explorative events.
Review: visual analytics of climate networks
NASA Astrophysics Data System (ADS)
Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.
2015-09-01
Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.
Review: visual analytics of climate networks
NASA Astrophysics Data System (ADS)
Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.
2015-04-01
Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.
Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon
2015-01-01
Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze aminoacylation of tRNAs during protein synthesis. Despite their housekeeping functions in protein synthesis, recently, ARSs and ARS-interacting multifunctional proteins (AIMPs) have been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, there is a lack of data resources and analytical tools that can be used to examine disease associations of ARS/AIMPs. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARS/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation and phosphorylation data of ARS/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploration of disease association of ARS/AIMPs, identification of disease-associated ARS/AIMP interactors and reconstruction of ARS-dependent disease-perturbed network models. Therefore, IDA provides both comprehensive data resources and analytical tools for understanding potential roles of ARS/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651
ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS (ATTILA) ARCVIEW EXTENTION
Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape metrics, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, waters...
Applying Pragmatics Principles for Interaction with Visual Analytics.
Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac
2018-01-01
Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.
Making Sense of Game-Based User Data: Learning Analytics in Applied Games
ERIC Educational Resources Information Center
Steiner, Christina M.; Kickmeier-Rus, Michael D.; Albert, Dietrich
2015-01-01
Digital learning games are useful educational tools with high motivational potential. With the application of games for instruction comes the need to acknowledge learning game experiences in the context of educational assessment as well. Learning analytics provides new opportunities for supporting assessment in and of educational games. We…
A results-based process for evaluation of diverse visual analytics tools
NASA Astrophysics Data System (ADS)
Rubin, Gary; Berger, David H.
2013-05-01
With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.
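To make the "rank-based probability-of-detection curve" concrete, the sketch below computes the fraction of true targets recovered when an analyst reviews only the top-k detections returned by a visual analytics algorithm. The scores and ground-truth labels are invented; the paper's actual MOEs and test data are not reproduced here.

```python
import numpy as np

scores = np.array([0.95, 0.91, 0.80, 0.76, 0.60, 0.55, 0.40, 0.30])  # detector confidence
labels = np.array([1,    0,    1,    1,    0,    0,    1,    0])      # 1 = true target

order = np.argsort(-scores)                  # review detections in confidence order
hits = np.cumsum(labels[order])
pd_at_rank = hits / labels.sum()             # probability of detection within top-k

for k, pd_k in enumerate(pd_at_rank, start=1):
    print(f"top-{k}: Pd = {pd_k:.2f}")
```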
The challenge of big data in public health: an opportunity for visual analytics.
Ola, Oluwakemi; Sedig, Kamran
2014-01-01
Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.
The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics
Ola, Oluwakemi; Sedig, Kamran
2014-01-01
Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376
Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery
NASA Astrophysics Data System (ADS)
Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.
2017-12-01
Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.
INTRODUCTION TO ATTILA (ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS): AN ARCVIEW EXTENSION
Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, wat...
Process monitoring and visualization solutions for hot-melt extrusion: a review.
Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas
2014-02-01
Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and were thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour and (2) monitoring and analysing critical product and process parameters for process control, making it possible to maintain a desired process state and guarantee the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and discusses techniques that have been used in polymer extrusion having potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.
Development of Multi-slice Analytical Tool to Support BIM-based Design Process
NASA Astrophysics Data System (ADS)
Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.
2017-03-01
This paper describes the on-going development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space are experienced as a dynamic entity whose spatial properties may vary from one part of the space to another; therefore, the representation of space through standard architectural drawings is sometimes not sufficient. The representation of space as a series of slices with certain properties in each slice becomes important, so that the different characteristics in each part of the space can inform the design process. The analytical tool is developed for use as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool would be useful for assisting the design development process that applies BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. The tool allows the identification of how spatial properties change dynamically throughout the space and allows the prediction of potential design problems. Integrating the multi-slice analytical tool in a BIM-based design process could thereby assist architects to generate better designs and to avoid unnecessary costs that are often caused by failure to identify problems during design development stages.
NASA Astrophysics Data System (ADS)
Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.
2016-11-01
The modelling of a curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, aims to determine the generating quality for an imposed kinematics of the relative motion between tool and blank. In this way, it is possible to determine the generating geometrical error as a component of the total error. The generation modelling allows highlighting the potential errors of the generating tool, so that its profile can be corrected before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of "relative generating trajectories". The analytical foundation is presented, together with applications to known models of rack-gear type tools used on Maag teething machines.
RacerGISOnline: Enhancing Learning in Marketing Classes with Web-Based Business GIS
ERIC Educational Resources Information Center
Miller, Fred L.; Mangold, W. Glynn; Roach, Joy; Brockway, Gary; Johnston, Timothy; Linnhoff, Stefan; McNeely, Sam; Smith, Kathy; Holmes, Terence
2014-01-01
Geographic Information Systems (GIS) offer geospatial analytical tools with great potential for applications in marketing decision making. However, for various reasons, the rate of adoption of these tools in academic marketing programs has lagged behind that of marketing practitioners. RacerGISOnline is an innovative approach to integrating these…
Harnessing Scientific Literature Reports for Pharmacovigilance
Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier
2017-01-01
Objectives: We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers’ capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods: A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results: All usability test participants cited the tool’s ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool’s automated literature search relative to a manual ‘all fields’ PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions: Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
Hunt, R.J.; Anderson, M.P.; Kelson, V.A.
1998-01-01
This paper demonstrates that analytic element models have potential as powerful screening tools that can facilitate or improve calibration of more complicated finite-difference and finite-element models. We demonstrate how a two-dimensional analytic element model was used to identify errors in a complex three-dimensional finite-difference model caused by incorrect specification of boundary conditions. An improved finite-difference model was developed using boundary conditions developed from a far-field analytic element model. Calibration of a revised finite-difference model was achieved using fewer zones of hydraulic conductivity and lake bed conductance than the original finite-difference model. Calibration statistics were also improved in that simulated base-flows were much closer to measured values. The improved calibration is due mainly to improved specification of the boundary conditions made possible by first solving the far-field problem with an analytic element model.
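A minimal sketch of the workflow the abstract describes, under simplifying assumptions (steady, confined flow; uniform flow plus extraction wells as the only analytic elements; invented aquifer properties and well data): a far-field analytic element solution is evaluated along the edge of a local grid to supply specified-head boundary conditions for a finite-difference model.

```python
import numpy as np

T = 500.0                                     # transmissivity [m^2/d] (assumed)
Qx0 = 0.8                                     # far-field uniform discharge [m^2/d] (assumed)
wells = [((250.0, 300.0), 1200.0),            # ((x, y) [m], pumping rate [m^3/d]) - hypothetical
         ((-400.0, -150.0), 800.0)]

def head(x, y):
    """Head from superposed analytic elements (datum arbitrary)."""
    phi = -Qx0 * x                            # discharge potential of the uniform flow
    for (xw, yw), Q in wells:
        r = np.hypot(x - xw, y - yw)
        phi += Q / (2.0 * np.pi) * np.log(r)  # extraction well element
    return phi / T                            # confined flow: head = potential / T

# Evaluate heads along the western edge of a local finite-difference grid
# and use them as specified-head boundary cells in that model.
ys = np.linspace(-500.0, 500.0, 21)
west_boundary_heads = [head(-500.0, y) for y in ys]
print([round(float(h), 3) for h in west_boundary_heads[:3]])
```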
A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.
Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy
2016-12-01
Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.
This is the first phase of a potentially multi-phase project aimed at identifying scientific methodologies that will lead to the development of innovative analytical tools supporting the analysis of control strategy effectiveness, namely, accountability. Significant reductions i...
Safety management of a complex R&D ground operating system
NASA Technical Reports Server (NTRS)
Connors, J.; Mauer, R. A.
1975-01-01
Report discusses safety program implementation for large R&D operating system. Analytical techniques are defined and suggested as tools for identifying potential hazards and determining means to effectively control or eliminate hazards.
Rabiei-Dastjerdi, Hamidreza; Matthews, Stephen A
2018-01-01
Recent interest in the social determinants of health (SDOH) and the effects of neighborhood contexts on individual health and well-being has grown exponentially. In this brief communication, we describe recent developments in both analytical perspectives and methods that have opened up new opportunities for researchers interested in exploring neighborhoods and health research within a SDOH framework. We focus specifically on recent advances in geographic information science, statistical methods, and spatial analytical tools. We close with a discussion of how these recent developments have the potential to enhance SDOH research in Iran.
Influence of Wake Models on Calculated Tiltrotor Aerodynamics
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2001-01-01
The tiltrotor aircraft configuration has the potential to revolutionize air transportation by providing an economical combination of vertical take-off and landing capability with efficient, high-speed cruise flight. To achieve this potential it is necessary to have validated analytical tools that will support future tiltrotor aircraft development. These analytical tools must calculate tiltrotor aeromechanical behavior, including performance, structural loads, vibration, and aeroelastic stability, with an accuracy established by correlation with measured tiltrotor data. The recent test of the Tilt Rotor Aeroacoustic Model (TRAM) with a single, 1/4-scale V-22 rotor in the German-Dutch Wind Tunnel (DNW) provides an extensive set of aeroacoustic, performance, and structural loads data. This paper will examine the influence of wake models on calculated tiltrotor aerodynamics, comparing calculations of performance and airloads with TRAM DNW measurements. The calculations will be performed using the comprehensive analysis CAMRAD II.
Bergues Pupo, Ana E; Reyes, Juan Bory; Bergues Cabrales, Luis E; Bergues Cabrales, Jesús M
2011-09-24
Electrotherapy is a relatively well established and efficient method of tumor treatment. In this paper we focus on analytical and numerical calculations of the potential and electric field distributions inside a tumor tissue in a two-dimensional model (2D-model) generated by means of electrode arrays with shapes of different conic sections (ellipse, parabola and hyperbola). Analytical calculations of the potential and electric field distributions based on 2D-models for different electrode arrays are performed by solving the Laplace equation, meanwhile the numerical solution is solved by means of finite element method in two dimensions. Both analytical and numerical solutions reveal significant differences between the electric field distributions generated by electrode arrays with shapes of circle and different conic sections (elliptic, parabolic and hyperbolic). Electrode arrays with circular, elliptical and hyperbolic shapes have the advantage of concentrating the electric field lines in the tumor. The mathematical approach presented in this study provides a useful tool for the design of electrode arrays with different shapes of conic sections by means of the use of the unifying principle. At the same time, we verify the good correspondence between the analytical and numerical solutions for the potential and electric field distributions generated by the electrode array with different conic sections.
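A minimal numerical sketch of the type of boundary-value problem the paper addresses: Laplace's equation relaxed on a 2-D grid with point electrodes fixed at ±V0 on an ellipse, and the electric field recovered as E = -∇φ. Grid size, voltage, and electrode layout are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

N, V0 = 101, 10.0
phi = np.zeros((N, N))
fixed = np.zeros((N, N), dtype=bool)

# Place 8 point electrodes on an ellipse, alternating polarity.
cx = cy = N // 2
a, b = 30, 20                                   # semi-axes in grid cells (assumed)
for k, theta in enumerate(np.linspace(0, 2 * np.pi, 8, endpoint=False)):
    i = int(round(cy + b * np.sin(theta)))
    j = int(round(cx + a * np.cos(theta)))
    phi[i, j] = V0 if k % 2 == 0 else -V0
    fixed[i, j] = True

for _ in range(5000):                           # Jacobi relaxation of Laplace's equation
    new = 0.25 * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
                  np.roll(phi, 1, 1) + np.roll(phi, -1, 1))
    new[fixed] = phi[fixed]                     # hold electrode potentials fixed
    new[0, :] = new[-1, :] = new[:, 0] = new[:, -1] = 0.0   # grounded outer boundary
    phi = new

Ey, Ex = np.gradient(-phi)                      # electric field E = -grad(phi)
print(np.hypot(Ex, Ey).max())
```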
Wearable physiological systems and technologies for metabolic monitoring.
Gao, Wei; Brooks, George A; Klonoff, David C
2018-03-01
Wearable sensors allow continuous monitoring of metabolites for diabetes, sports medicine, exercise science, and physiology research. These sensors can continuously detect target analytes in skin interstitial fluid (ISF), tears, saliva, and sweat. In this review, we will summarize developments on wearable devices and their potential applications in research, clinical practice, and recreational and sporting activities. Sampling skin ISF can require insertion of a needle into the skin, whereas sweat, tears, and saliva can be sampled by devices worn outside the body. The most widely sampled metabolite from a wearable device is glucose in skin ISF for monitoring diabetes patients. Continuous ISF glucose monitoring allows estimation of the glucose concentration in blood without the pain, inconvenience, and blood waste of fingerstick capillary blood glucose testing. This tool is currently used by diabetes patients to provide information for dosing insulin and determining a diet and exercise plan. Similar technologies for measuring concentrations of other analytes in skin ISF could be used to monitor athletes, emergency responders, warfighters, and others in states of extreme physiological stress. Sweat is a potentially useful substrate for sampling analytes for metabolic monitoring during exercise. Lactate, sodium, potassium, and hydrogen ions can be measured in sweat. Tools for converting the concentrations of these analytes sampled from sweat, tears, and saliva into blood concentrations are being developed. As an understanding of the relationships between the concentrations of analytes in blood and easily sampled body fluid increases, then the benefits of new wearable devices for metabolic monitoring will also increase.
Shouting in the Jungle - The SETI Transmission Debate
NASA Astrophysics Data System (ADS)
Schuch, H. P.; Almar, I.
The prudence of transmitting deliberate messages from Earth into interstellar space remains controversial. Reasoned risk-benefit analysis is needed, to inform policy recommendations by such bodies as the International Academy of Astronautics SETI Permanent Study Group. As a first step, at the 2005 International Astronautical Congress in Fukuoka, we discussed the San Marino Scale, a new analytical tool for assessing transmission risk. That Scale was updated, and a revised version presented at the 2006 IAC in Valencia. We are now in a position to recommend specific improvements to the scale we proposed for quantifying terrestrial transmissions. Our intent is to make this tool better reflect the detectability and potential impact of recent and proposed messages beamed from Earth. We believe the changes proposed herein strengthen the San Marino Scale as an analytical tool, and bring us closer to its eventual adoption.
Rüdt, Matthias; Briskot, Till; Hubbuch, Jürgen
2017-03-24
Process analytical technologies (PAT) for the manufacturing of biologics have drawn increased interest in the last decade. Besides being encouraged by the Food and Drug Administration's (FDA's) PAT initiative, PAT promises to improve process understanding, reduce overall production costs and help to implement continuous manufacturing. This article focuses on spectroscopic tools for PAT in downstream processing (DSP). Recent advances and future perspectives will be reviewed. In order to exploit the full potential of gathered data, chemometric tools are widely used for the evaluation of complex spectroscopic information. Thus, an introduction into the field will be given. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Platform for Automated Real-Time High Performance Analytics on Medical Image Data.
Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A
2018-03-01
Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.
MASS SPECTROMETRY-BASED METABOLOMICS
Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.
2007-01-01
This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry and specifically mass spectrometry has vast potential as a tool for this type of investigation. Metabolomics require special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475
Development and Calibration of Highway Safety Manual Equations for Florida Conditions
DOT National Transportation Integrated Search
2011-08-31
The Highway Safety Manual (HSM) provides statistically-valid analytical tools and techniques for quantifying the potential effects on crashes as a result of decisions made in planning, design, operations, and maintenance. Implementation of the new te...
Development and calibration of highway safety manual equations for Florida conditions.
DOT National Transportation Integrated Search
2011-08-31
The Highway Safety Manual (HSM) provides statistically-valid analytical tools and techniques for quantifying the potential effects on crashes as a result of decisions made in planning, design, operations, and maintenance. Implementation of the ne...
Realizing the Potential of Mobile Mental Health: New Methods for New Data in Psychiatry
Staples, Patrick; Onnela, Jukka-Pekka
2015-01-01
Smartphones are now ubiquitous and can be harnessed to offer psychiatry a wealth of real-time data regarding patient behavior, self-reported symptoms, and even physiology. The data collected from smartphones meet the three criteria of big data: velocity, volume, and variety. Although these data have tremendous potential, transforming them into clinically valid and useful information requires using new tools and methods as a part of assessment in psychiatry. In this paper, we introduce and explore numerous analytical methods and tools from the computational and statistical sciences that appear readily applicable to psychiatric data collected using smartphones. By matching smartphone data with appropriate statistical methods, psychiatry can better realize the potential of mobile mental health and empower both patients and providers with novel clinical tools. PMID:26073363
A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.
Płotka-Wasylka, J
2018-05-01
A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams can be used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, coloured from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
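A toy sketch of the step-wise colour coding the abstract describes: each stage of an analytical procedure is assigned a low/medium/high environmental-impact score and mapped to green/yellow/red. The step list and scores are invented, and the actual GAPI pictogram (five pentagrams) is not reproduced here.

```python
# Map assessed impact levels (1 = low, 2 = medium, 3 = high) to GAPI-style colours.
IMPACT_COLOR = {1: "green", 2: "yellow", 3: "red"}

procedure = {                      # hypothetical assessment of one methodology
    "sample collection":   1,
    "sample preservation": 2,
    "extraction":          3,
    "solvent/reagent use": 2,
    "final determination": 1,
}

for step, impact in procedure.items():
    print(f"{step:20s} -> {IMPACT_COLOR[impact]}")
```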
Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang
2016-10-01
Currently, near infrared spectroscopy (NIRS) has been considered as an efficient tool for achieving process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection was introduced. The design of the process analytical system was described in detail, including the selection of monitored processes and testing mode, and potential risks that should be avoided. Moreover, the development of relative technologies was also presented, which contained the establishment of the monitoring methods for the elution of polyamide resin and macroporous resin chromatography processes, as well as the rapid analysis method for finished products. Based on the authors' experience of research and work, several issues in the application of NIRS to process monitoring and control in TCM production were then raised, and some potential solutions were also discussed. The issues include building the technical team for the process analytical system, the design of the process analytical system in the manufacture of TCM products, standardization of the NIRS-based analytical methods, and improving the management of the process analytical system. Finally, the prospect for the application of NIRS in the TCM industry was put forward. Copyright© by the Chinese Pharmaceutical Association.
Integrability and chemical potential in the (3 + 1)-dimensional Skyrme model
NASA Astrophysics Data System (ADS)
Alvarez, P. D.; Canfora, F.; Dimakis, N.; Paliathanasis, A.
2017-10-01
Using a remarkable mapping from the original (3 + 1)-dimensional Skyrme model to the Sine-Gordon model, we construct the first analytic examples of Skyrmions as well as of Skyrmion-anti-Skyrmion bound states within a finite box in (3 + 1)-dimensional flat space-time. An analytic upper bound on the number of these Skyrmion-anti-Skyrmion bound states is derived. We compute the critical isospin chemical potential beyond which these Skyrmions cease to exist. With these tools, we also construct topologically protected time-crystals: time-periodic configurations whose time-dependence is protected by their non-trivial winding number. These are striking realizations of the ideas of Shapere and Wilczek. The critical isospin chemical potential for these time-crystals is determined.
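For reference, the Sine-Gordon model onto which the abstract says the Skyrme configurations are mapped is governed, in its standard (1 + 1)-dimensional form, by

```latex
\partial_t^2 \alpha \;-\; \partial_x^2 \alpha \;+\; \sin\alpha \;=\; 0 ,
```

shown here only as the generic Sine-Gordon equation; the specific ansatz and reduced equation used by the authors are not given in the abstract.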
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradel, Lauren; Endert, Alexander; Koch, Kristen
2013-08-01
Large, high-resolution vertical displays carry the potential to increase the accuracy of collaborative sensemaking, given correctly designed visual analytics tools. From an exploratory user study using a fictional textual intelligence analysis task, we investigated how users interact with the display to construct spatial schemas and externalize information, as well as how they establish shared and private territories. We investigated the space management strategies of users partitioned by type of tool philosophy followed (visualization- or text-centric). We classified the types of territorial behavior exhibited in terms of how the users interacted with information on the display (integrated or independent workspaces). Next, we examined how territorial behavior impacted the common ground between the pairs of users. Finally, we offer design suggestions for building future co-located collaborative visual analytics tools specifically for use on large, high-resolution vertical displays.
Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO
NASA Technical Reports Server (NTRS)
Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.
2016-01-01
A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
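The sketch below illustrates the general OpenMDAO pattern of supplying analytic partial derivatives for gradient-based optimization, using a deliberately toy combustion-temperature surrogate in place of the CEA-based thermodynamics. The quadratic temperature model, its coefficients, and the variable names are invented; only the component/driver API usage reflects the real framework, assuming a recent OpenMDAO version where unconnected inputs are handled automatically.

```python
import openmdao.api as om

class TempFromPhi(om.ExplicitComponent):
    """Toy combustion-temperature surrogate with an analytic derivative."""
    def setup(self):
        self.add_input('phi', val=1.0)       # air-fuel equivalence ratio (design variable)
        self.add_output('T4', val=0.0)       # surrogate combustion temperature [K]
        self.declare_partials('T4', 'phi')

    def compute(self, inputs, outputs):
        phi = inputs['phi']
        outputs['T4'] = 2400.0 - 900.0 * (phi - 1.05) ** 2   # invented surrogate, peaks near phi ~ 1.05

    def compute_partials(self, inputs, partials):
        phi = inputs['phi']
        partials['T4', 'phi'] = -1800.0 * (phi - 1.05)       # analytic derivative dT4/dphi

prob = om.Problem()
prob.model.add_subsystem('temp', TempFromPhi(), promotes=['*'])
prob.model.add_design_var('phi', lower=0.2, upper=2.0)
prob.model.add_objective('T4', scaler=-1.0)                  # negative scaler: maximize T4
prob.driver = om.ScipyOptimizeDriver(optimizer='SLSQP')
prob.setup()
prob.run_driver()
print(prob.get_val('phi'), prob.get_val('T4'))
```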
2011-01-01
Background Electrotherapy is a relatively well established and efficient method of tumor treatment. In this paper we focus on analytical and numerical calculations of the potential and electric field distributions inside a tumor tissue in a two-dimensional model (2D-model) generated by means of electrode arrays with shapes of different conic sections (ellipse, parabola and hyperbola). Methods Analytical calculations of the potential and electric field distributions based on 2D-models for different electrode arrays are performed by solving the Laplace equation, meanwhile the numerical solution is solved by means of finite element method in two dimensions. Results Both analytical and numerical solutions reveal significant differences between the electric field distributions generated by electrode arrays with shapes of circle and different conic sections (elliptic, parabolic and hyperbolic). Electrode arrays with circular, elliptical and hyperbolic shapes have the advantage of concentrating the electric field lines in the tumor. Conclusion The mathematical approach presented in this study provides a useful tool for the design of electrode arrays with different shapes of conic sections by means of the use of the unifying principle. At the same time, we verify the good correspondence between the analytical and numerical solutions for the potential and electric field distributions generated by the electrode array with different conic sections. PMID:21943385
Development and application of accurate analytical models for single active electron potentials
NASA Astrophysics Data System (ADS)
Miller, Michelle; Jaron-Becker, Agnieszka; Becker, Andreas
2015-05-01
The single active electron (SAE) approximation is a theoretical model frequently employed to study scenarios in which inner-shell electrons may productively be treated as frozen spectators to a physical process of interest, and accurate analytical approximations for these potentials are sought as a useful simulation tool. Density functional theory is often used to construct a SAE potential, requiring that a further approximation for the exchange correlation functional be enacted. In this study, we employ the Krieger, Li, and Iafrate (KLI) modification to the optimized-effective-potential (OEP) method to reduce the complexity of the problem to the straightforward solution of a system of linear equations through simple arguments regarding the behavior of the exchange-correlation potential in regions where a single orbital dominates. We employ this method for the solution of atomic and molecular potentials, and use the resultant curve to devise a systematic construction for highly accurate and useful analytical approximations for several systems. Supported by the U.S. Department of Energy (Grant No. DE-FG02-09ER16103), and the U.S. National Science Foundation (Graduate Research Fellowship, Grants No. PHY-1125844 and No. PHY-1068706).
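As an example of what such an analytical SAE approximation typically looks like, one widely used parameterized form (shown here as an assumed illustration in the style of the Tong-Lin model potential, not necessarily the form produced by the KLI/OEP construction described in this work) is

```latex
V_{\mathrm{SAE}}(r) \;=\; -\,\frac{Z_c \;+\; a_1 e^{-a_2 r} \;+\; a_3\, r\, e^{-a_4 r} \;+\; a_5 e^{-a_6 r}}{r},
```

where Z_c is the asymptotic charge seen by the active electron and a_1, ..., a_6 are parameters fitted so that the potential reproduces the computed orbital energies.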
Investigating Analytic Tools for e-Book Design in Early Literacy Learning
ERIC Educational Resources Information Center
Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah
2009-01-01
Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…
The Global War on Terrorism: Analytical Support, Tools and Metrics of Assessment. MORS Workshop
2005-08-11
is the matter of intelligence, as COL(P) Keller pointed out, we need to spend less time in the intelligence cycle on managing information and... models, decision aids: "named things" * Methodologies: potentially useful things * Resources: databases, people, books? * Meta-data on tools * Develop a... experience. Only one member (Mr. Garry Greco) had served on the Joint Intelligence Task Force for Counter Terrorism. Although Gary heavily participated
NASA Astrophysics Data System (ADS)
Qin, Ting; Liao, Congwei; Huang, Shengxiang; Yu, Tianbao; Deng, Lianwen
2018-01-01
An analytical drain current model based on the surface potential is proposed for amorphous indium gallium zinc oxide (a-InGaZnO) thin-film transistors (TFTs) with a synchronized symmetric dual-gate (DG) structure. Solving the electric field, surface potential (φS), and central potential (φ0) of the InGaZnO film using the Poisson equation with the Gaussian method and Lambert function is demonstrated in detail. The compact analytical model of current-voltage behavior, which consists of drift and diffusion components, is investigated by regional integration, and voltage-dependent effective mobility is taken into account. Comparison results demonstrate that the calculation results obtained using the derived models match well with the simulation results obtained using a technology computer-aided design (TCAD) tool. Furthermore, the proposed model is incorporated into SPICE simulations using Verilog-A to verify the feasibility of using DG InGaZnO TFTs for high-performance circuit designs.
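For orientation, a generic surface-potential (charge-sheet) formulation separates the channel current into the drift and diffusion components mentioned above; this is the textbook form, shown under the assumption of a single effective channel charge Q_ch per gate, not the specific dual-gate a-InGaZnO expression derived in the paper:

```latex
I_D \;=\; \mu_{\mathrm{eff}}\,\frac{W}{L}
\left[\,\int_{\varphi_{S,\mathrm{source}}}^{\varphi_{S,\mathrm{drain}}} Q_{\mathrm{ch}}(\varphi_S)\,\mathrm{d}\varphi_S
\;+\; V_{\mathrm{th}}\bigl(Q_{\mathrm{ch}}(\varphi_{S,\mathrm{source}}) - Q_{\mathrm{ch}}(\varphi_{S,\mathrm{drain}})\bigr)\right],
```

with the integral giving the drift component and the thermal-voltage term giving the diffusion component.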
Stakeholder perspectives on decision-analytic modeling frameworks to assess genetic services policy.
Guzauskas, Gregory F; Garrison, Louis P; Stock, Jacquie; Au, Sylvia; Doyle, Debra Lochner; Veenstra, David L
2013-01-01
Genetic services policymakers and insurers often make coverage decisions in the absence of complete evidence of clinical utility and under budget constraints. We evaluated genetic services stakeholder opinions on the potential usefulness of decision-analytic modeling to inform coverage decisions, and asked them to identify genetic tests for decision-analytic modeling studies. We presented an overview of decision-analytic modeling to members of the Western States Genetic Services Collaborative Reimbursement Work Group and state Medicaid representatives and conducted directed content analysis and an anonymous survey to gauge their attitudes toward decision-analytic modeling. Participants also identified and prioritized genetic services for prospective decision-analytic evaluation. Participants expressed dissatisfaction with current processes for evaluating insurance coverage of genetic services. Some participants expressed uncertainty about their comprehension of decision-analytic modeling techniques. All stakeholders reported openness to using decision-analytic modeling for genetic services assessments. Participants were most interested in application of decision-analytic concepts to multiple-disorder testing platforms, such as next-generation sequencing and chromosomal microarray. Decision-analytic modeling approaches may provide a useful decision tool to genetic services stakeholders and Medicaid decision-makers.
RECENT DEVELOPMENTS IN ANALYTICAL METHODS FOR FIBROUS AMPHIBOLE IN VERMICULITE ATTIC INSULATION
The U.S. Environmental Protection Agency has developed a test method for the analysis of fibrous amphibole in vermiculite attic insulation. This method was developed to provide the Agency with monitoring tools to study the occurrence and potential for exposure to fibrous amphibo...
Analytical Tools for Investigating and Modeling Agent-Based Systems
2005-06-01
of Black Holes Cluster 10: Juan M. Maldacena (1924), Journal of High Energy Physics; Field theory models for tachyon and gauge field string dynamics; Super-Poincare Invariant Superstring Field Theory; Level Four Approximation to the Tachyon Potential in Superstring Field Theory; SO(32) Spinors
Informetrics: Exploring Databases as Analytical Tools.
ERIC Educational Resources Information Center
Wormell, Irene
1998-01-01
Advanced online search facilities and information retrieval techniques have increased the potential of bibliometric research. Discusses three case studies carried out by the Centre for Informetric Studies at the Royal School of Library Science (Denmark) on the internationality of international journals, informetric analyses on the World Wide Web,…
33 CFR 385.33 - Revisions to models and analytical tools.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...
Developing Healthcare Data Analytics APPs with Open Data Science Tools.
Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong
2017-01-01
Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not mastered. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.
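As a minimal illustration of the notebook-to-APP idea (not the authors' actual toolchain), an analysis step in a Jupyter Notebook can be exposed as an interactive control with ipywidgets; the data frame, column names, and threshold below are hypothetical.

```python
# Minimal illustration (not the authors' pipeline): exposing an analysis step
# as an interactive control inside a Jupyter Notebook.
import pandas as pd
import ipywidgets as widgets
from ipywidgets import interact

# Hypothetical cohort extract; in practice this would come from an EHR export.
df = pd.DataFrame({"age": [34, 51, 62, 45, 70, 58],
                   "readmitted": [0, 1, 1, 0, 1, 0]})

@interact(min_age=widgets.IntSlider(min=20, max=80, value=50))
def readmission_rate(min_age):
    subset = df[df["age"] >= min_age]
    # Recomputed and redisplayed each time the slider moves
    print(f"n={len(subset)}, readmission rate={subset['readmitted'].mean():.2f}")
```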
Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas
2018-04-03
Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. The chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times for inducing changes in water content. Analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is inherently a most promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.
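As a rough sketch of the chemometric step (simulated data, not the authors' measurements or exact workflow), principal component analysis can be applied to the pixel spectra of a chemical image and the scores mapped back to image coordinates to reveal within-vial inhomogeneity.

```python
# Schematic sketch with simulated data (not the study's workflow): PCA on the
# pixel spectra of a NIR chemical image, with scores mapped back to the image.
import numpy as np
from sklearn.decomposition import PCA

rows, cols, bands = 64, 64, 120                  # hypothetical image dimensions
hypercube = np.random.rand(rows, cols, bands)    # stand-in for measured spectra

pixels = hypercube.reshape(-1, bands)            # one spectrum per pixel
scores = PCA(n_components=2).fit_transform(pixels)

# Spatial structure in the first score map would indicate inhomogeneity
# (e.g. in water content) across the freeze-dried cake.
pc1_map = scores[:, 0].reshape(rows, cols)
print(pc1_map.shape)
```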
Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Bonvini, Marco; Piette, Mary Ann
Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
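A minimal sketch of the hybrid idea, assuming a simple part-load efficiency curve and made-up measurements (not the tool described in the paper): measured chiller power is compared against a physics-based expectation, and large residuals are flagged as potential faults.

```python
# Illustrative hybrid FDD sketch (assumed curve and data, not the paper's tool):
# flag measurements that deviate from a physics-based part-load power estimate.
import numpy as np

def expected_power(load_fraction, rated_kw=300.0):
    # Hypothetical quadratic part-load curve, e.g. fitted to manufacturer data
    return rated_kw * (0.1 + 0.6 * load_fraction + 0.3 * load_fraction**2)

load = np.array([0.4, 0.6, 0.8, 0.9])
measured_kw = np.array([120.0, 175.0, 245.0, 330.0])

residual = measured_kw - expected_power(load)
faults = np.abs(residual) > 0.1 * expected_power(load)   # 10% deviation threshold
print(list(zip(load, residual.round(1), faults)))
```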
Generalization of analytical tools for helicopter-rotor airfoils
NASA Technical Reports Server (NTRS)
Gibbs, E. H.
1979-01-01
A state-of-the-art finite difference boundary-layer program incorporated into the NYU Transonic Analysis Program is described. Some possible treatments for the trailing edge region were investigated. Findings indicate that treatment of the trailing edge region, while remaining within the scope of an iterative potential flow/boundary-layer program, appears feasible.
NASA Technical Reports Server (NTRS)
Kanik, I.; Beegle, L. W.; Hill, H. H.
2001-01-01
The potential of the high-resolution Electrospray Ionization/Ion Mobility Spectrometry (ESI/IMS) technique as analytical separation tool in analyzing bio-molecular mixtures in the search for the chemical signatures of life is demonstrated. Additional information is contained in the original extended abstract.
About, for, in or through Entrepreneurship in Engineering Education
ERIC Educational Resources Information Center
Mäkimurto-Koivumaa, Soili; Belt, Pekka
2016-01-01
Engineering competences form a potential basis for entrepreneurship. There are pressures to find new approaches to entrepreneurship education (EE) in engineering education, as the traditional analytical logic of engineering does not match the modern view of entrepreneurship. Since the previous models do not give tangible enough tools on how to…
Engaging or Distracting: Children's Tablet Computer Use in Education
ERIC Educational Resources Information Center
McEwen, Rhonda N.; Dubé, Adam K.
2015-01-01
Communications studies and psychology offer analytical and methodological tools that when combined have the potential to bring novel perspectives on human interaction with technologies. In this study of children using simple and complex mathematics applications on tablet computers, cognitive load theory is used to answer the question: how…
Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study
ERIC Educational Resources Information Center
Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek
2013-01-01
Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williamson, Jim S; Greenwood Village, CO 80112
2007-03-31
Strategic Planning and Energy Options Analysis provides the Fort Peck Tribes with a tool to build analytical capabilities and local capacity to extract the natural and energy resource potential for the benefit of the tribal community. Each resource is identified irrespective of its development potential and is viewed as an absolute, resulting in a comprehensive resource assessment for Tribal energy planning.
Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek
2016-05-01
This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure concerning green analytical chemistry principles and analytical performance merits.
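The ranking rests on pairwise dominance between procedures described by multiple criteria; the toy sketch below (invented procedures and scores, not the study's dataset) shows the comparison that underlies a Hasse diagram.

```python
# Toy sketch of the partial-order comparison behind a Hasse diagram
# (invented scores, not the study's data). Lower is better on every criterion
# (e.g. solvent use, energy demand, waste generated).
procedures = {
    "Procedure A": [3, 2, 4],
    "Procedure B": [2, 2, 3],
    "Procedure C": [1, 3, 1],
}

def dominates(a, b):
    """a dominates b if it is at least as good everywhere and better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Dominance relations; incomparable pairs (neither dominates) stay unlinked,
# which is what produces the partial order drawn as a Hasse diagram.
edges = [(a, b) for a in procedures for b in procedures
         if a != b and dominates(procedures[a], procedures[b])]
print(edges)   # [('Procedure B', 'Procedure A')]
```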
Scalable Visual Analytics of Massive Textual Datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.
2007-04-01
This paper describes the first scalable implementation of a text processing engine used in Visual Analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, even those sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly devoid of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science data sources has become more prevalent, ushered in by the plethora of data generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tool/technique requirements that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Chemiluminescence microarrays in analytical chemistry: a critical review.
Seidel, Michael; Niessner, Reinhard
2014-09-01
Multi-analyte immunoassays on microarrays and on multiplex DNA microarrays have been described for quantitative analysis of small organic molecules (e.g., antibiotics, drugs of abuse, small molecule toxins), proteins (e.g., antibodies or protein toxins), and microorganisms, viruses, and eukaryotic cells. In analytical chemistry, multi-analyte detection by use of analytical microarrays has become an innovative research topic because of the possibility of generating several sets of quantitative data for different analyte classes in a short time. Chemiluminescence (CL) microarrays are powerful tools for rapid multiplex analysis of complex matrices. A wide range of applications for CL microarrays is described in the literature dealing with analytical microarrays. The motivation for this review is to summarize the current state of CL-based analytical microarrays. Combining analysis of different compound classes on CL microarrays reduces analysis time, cost of reagents, and use of laboratory space. Applications are discussed, with examples from food safety, water safety, environmental monitoring, diagnostics, forensics, toxicology, and biosecurity. The potential and limitations of research on multiplex analysis by use of CL microarrays are discussed in this review.
VisualUrText: A Text Analytics Tool for Unstructured Textual Data
NASA Astrophysics Data System (ADS)
Zainol, Zuraini; Jaymes, Mohd T. H.; Nohuddin, Puteri N. E.
2018-05-01
The growing amount of unstructured text over the Internet is tremendous. Text repositories come from Web 2.0, business intelligence, and social networking applications. It is also believed that 80-90% of future data growth will be in the form of unstructured text databases that may potentially contain interesting patterns and trends. Text Mining is a well-known technique for discovering interesting patterns and trends, which constitute non-trivial knowledge, from massive unstructured text data. Text Mining covers multidisciplinary fields involving information retrieval (IR), text analysis, natural language processing (NLP), data mining, machine learning, statistics, and computational linguistics. This paper discusses the development of a text analytics tool that is proficient in extracting, processing, and analyzing unstructured text data and visualizing the cleaned text data in multiple forms such as a Document Term Matrix (DTM), frequency graph, network analysis graph, word cloud, and dendrogram. This tool, VisualUrText, is developed to assist students and researchers in extracting interesting patterns and trends in document analyses.
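As a minimal sketch of the first of those outputs (not the VisualUrText implementation; the example documents are invented), a Document Term Matrix and corpus-wide term frequencies can be produced with scikit-learn:

```python
# Minimal sketch (not the VisualUrText implementation): building a Document
# Term Matrix and term frequencies, the inputs to a frequency graph or word cloud.
from sklearn.feature_extraction.text import CountVectorizer

docs = [                                     # invented example documents
    "Text mining discovers patterns in unstructured text",
    "Visual analytics tools help explore text data",
    "Unstructured text data keeps growing on the web",
]

vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(docs)         # sparse Document Term Matrix

freqs = dict(zip(vectorizer.get_feature_names_out(), dtm.sum(axis=0).A1))
print(sorted(freqs.items(), key=lambda kv: -kv[1])[:5])
```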
The generation of criteria for selecting analytical tools for landscape management
Marilyn Duffey-Armstrong
1979-01-01
This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...
Visual analytics for aviation safety: A collaborative approach to sensemaking
NASA Astrophysics Data System (ADS)
Wade, Andrew
Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TE) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, boeing, collaborative analysis.
Seeing is believing: on the use of image databases for visually exploring plant organelle dynamics.
Mano, Shoji; Miwa, Tomoki; Nishikawa, Shuh-ichi; Mimura, Tetsuro; Nishimura, Mikio
2009-12-01
Organelle dynamics vary dramatically depending on cell type, developmental stage and environmental stimuli, so that various parameters, such as size, number and behavior, are required for the description of the dynamics of each organelle. Imaging techniques are superior to other techniques for describing organelle dynamics because these parameters are visually exhibited. Therefore, as the results can be seen immediately, investigators can more easily grasp organelle dynamics. At present, imaging techniques are emerging as fundamental tools in plant organelle research, and the development of new methodologies to visualize organelles and the improvement of analytical tools and equipment have allowed the large-scale generation of image and movie data. Accordingly, image databases that accumulate information on organelle dynamics are an increasingly indispensable part of modern plant organelle research. In addition, image databases are potentially rich data sources for computational analyses, as image and movie data reposited in the databases contain valuable and significant information, such as size, number, length and velocity. Computational analytical tools support image-based data mining, such as segmentation, quantification and statistical analyses, to extract biologically meaningful information from each database and combine them to construct models. In this review, we outline the image databases that are dedicated to plant organelle research and present their potential as resources for image-based computational analyses.
NREL Partners with California's Santa Clara Valley Transportation Authority
Automotive Systems Technology Simulator (FASTSim) - a high-level simulation tool for estimating the impact of ..., manager of the Transportation Systems Group in NREL's Transportation and Hydrogen Systems Center. "NREL will provide modeling and analytics on the potential scope of energy services associated with VGI
A Monitoring and Assessment Plan for the Youth Employment and Demonstration Projects Act of 1977.
ERIC Educational Resources Information Center
Employment and Training Administration (DOL), Washington, DC.
Intended as a general blueprint for monitoring and assessing activities under the Youth Employment and Demonstration Projects Act of 1977, this document discusses the expected constraints, evaluation and assessment tools, the analytic framework, and monitoring and review schedule. Five problem areas are recognized as potential constraints in…
NASA Astrophysics Data System (ADS)
Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.
2017-12-01
Spatiotemporal (ST) analytics applied to major spatio-temporal data sources from major vendors such as USGS, NOAA, World Bank and World Health Organization have tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes on a local and global level. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at the Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 16000+ attributes covering 200+ countries for over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and inform how others may freely access the tool.
Cho, Il-Hoon; Ku, Seockmo
2017-09-30
The development of novel and high-tech solutions for rapid, accurate, and non-laborious microbial detection methods is imperative to improve the global food supply. Such solutions have begun to address the need for microbial detection that is faster and more sensitive than existing methodologies (e.g., classic culture enrichment methods). Multiple reviews report the technical functions and structures of conventional microbial detection tools. These tools, used to detect pathogens in food and food homogenates, were designed via qualitative analysis methods. The inherent disadvantage of these analytical methods is the necessity for specimen preparation, which is a time-consuming process. While some literature describes the challenges and opportunities to overcome the technical issues related to food industry legal guidelines, there is a lack of reviews of the current trials to overcome technological limitations related to sample preparation and microbial detection via nano and micro technologies. In this review, we primarily explore current analytical technologies, including metallic and magnetic nanomaterials, optics, electrochemistry, and spectroscopy. These techniques rely on the early detection of pathogens via enhanced analytical sensitivity and specificity. In order to introduce the potential combination and comparative analysis of various advanced methods, we also reference a novel sample preparation protocol that uses microbial concentration and recovery technologies. This technology has the potential to expedite the pre-enrichment step that precedes the detection process.
Visualization techniques for computer network defense
NASA Astrophysics Data System (ADS)
Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew
2011-06-01
Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.
Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update
Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.
2012-01-01
Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR is its accurate function with external calibration, the lack of any requirement for identical reference materials, a high precision and accuracy when properly validated, and an ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experiential evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
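For context, quantitation in qHNMR commonly rests on the proportionality between integrated signal area and the number of contributing protons; one widely used calibrant-based form (a textbook relation, not a formula reproduced from this review) is:

\[
P_a = \frac{I_a}{I_{cal}} \cdot \frac{N_{cal}}{N_a} \cdot \frac{M_a}{M_{cal}} \cdot \frac{m_{cal}}{m_a} \cdot P_{cal}
\]

where I is the integrated signal area, N the number of protons underlying the integrated signal, M the molar mass, m the weighed mass, and P the purity of the analyte (a) and calibrant (cal).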
Usage Patterns of a Mobile Palliative Care Application.
Zhang, Haipeng; Liu, David; Marks, Sean; Rickerson, Elizabeth M; Wright, Adam; Gordon, William J; Landman, Adam
2018-06-01
Fast Facts Mobile (FFM) was created to be a convenient way for clinicians to access the Fast Facts and Concepts database of palliative care articles on a smartphone or tablet device. We analyzed usage patterns of FFM through an integrated analytics platform on the mobile versions of the FFM application. The primary objective of this study was to evaluate the usage data from FFM as a way to better understand user behavior for FFM as a palliative care educational tool. This is an exploratory, retrospective analysis of de-identified analytics data collected through the iOS and Android versions of FFM captured from November 2015 to November 2016. FFM App download statistics from November 1, 2015, to November 1, 2016, were accessed from the Apple and Google development websites. Further FFM session data were obtained from the analytics platform built into FFM. FFM was downloaded 9409 times over the year with 201,383 articles accessed. The most searched-for terms in FFM include the following: nausea, methadone, and delirium. We compared frequent users of FFM to infrequent users of FFM and found that 13% of all users comprise 66% of all activity in the application. Demand for useful and scalable tools for both primary palliative care and specialty palliative care will likely continue to grow. Understanding the usage patterns for FFM has the potential to inform the development of future versions of Fast Facts. Further studies of mobile palliative care educational tools will be needed to further define the impact of these educational tools.
Enhance your team-based qualitative research.
Fernald, Douglas H; Duclos, Christine W
2005-01-01
Qualitative research projects often involve the collaborative efforts of a research team. Challenges inherent in teamwork include changes in membership and differences in analytical style, philosophy, training, experience, and skill. This article discusses teamwork issues and the tools and techniques used to improve team-based qualitative research. We drew on our experiences in working on numerous projects of varying size, duration, and purpose. Through trials of different tools and techniques, expert consultation, and review of the literature, we learned to improve how we build teams, manage information, and disseminate results. Attention given to team members and team processes is as important as choosing appropriate analytical tools and techniques. Attentive team leadership, commitment to early and regular team meetings, and discussion of roles, responsibilities, and expectations all help build more effective teams and establish clear norms. As data are collected and analyzed, it is important to anticipate potential problems from differing skills and styles, and to plan how information and files are managed. Discuss analytical preferences and biases and set clear guidelines and practices for how data will be analyzed and handled. As emerging ideas and findings disperse across team members, common tools (such as summary forms and data grids), coding conventions, intermediate goals or products, and regular documentation help capture essential ideas and insights. In a team setting, little should be left to chance. This article identifies ways to improve team-based qualitative research with a more considered and systematic approach. Qualitative researchers will benefit from further examination and discussion of effective, field-tested, team-based strategies.
Analytical Web Tool for CERES Products
NASA Astrophysics Data System (ADS)
Mitrescu, C.; Chu, C.; Doelling, D.
2012-12-01
The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year-long data set proves invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health, and environmental companies, has shown interest in CERES derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored for this wide diversity of users. Recognizing the potential that web-2.0 technologies can offer to both Quality Control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions that address the above. Displaying an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms to display the distribution of a data field and identify outliers, useful for QC purposes; ii) an "Anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can either identify potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with the on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, the execution is done on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open source applications, the multitude of CERES products, and the seamless transition from previous development. For the future, we plan to expand the analytical capabilities of the Ordering Tool and add/combine more CERES products to meet the growing data demand.
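The "Anomaly" map reduces to a subtraction of the climatological monthly mean from the current month's field; a minimal sketch with synthetic gridded data (not CERES code or data) is:

```python
# Minimal sketch of the "Anomaly" map computation with synthetic data
# (not CERES code): current month minus the climatological monthly mean.
import numpy as np

flux = np.random.rand(11, 12, 180, 360)   # hypothetical years x months x lat x lon

climatology = flux.mean(axis=0)           # long-term mean for each calendar month
current = flux[-1, 6]                     # e.g. July of the most recent year
anomaly = current - climatology[6]        # regional departures from the mean
print(anomaly.shape, float(anomaly.mean()))
```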
Optimization of turning process through the analytic flank wear modelling
NASA Astrophysics Data System (ADS)
Del Prete, A.; Franchi, R.; De Lorenzis, D.
2018-05-01
In the present work, the approach used for the optimization of process capabilities for Oil&Gas component machining is described. These components are machined by turning of stainless steel castings. For this purpose, a proper Design Of Experiments (DOE) plan has been designed and executed: as output of the experimentation, data about tool wear have been collected. The DOE has been designed starting from the cutting speed and feed values recommended by the tool manufacturer; the depth of cut has been kept constant. Wear data have been obtained by observation of the tool flank wear under an optical microscope, with data acquisition carried out at regular intervals of working time. Through statistical data and regression analysis, analytical models of the flank wear and the tool life have been obtained. The optimization approach used is a multi-objective optimization, which minimizes the production time and the number of cutting tools used, under a constraint on the allowed flank wear level. The technique used to solve the optimization problem is Multi-Objective Particle Swarm Optimization (MOPS). The optimization results, validated by the execution of a further experimental campaign, highlighted the reliability of the work and confirmed the usability of the optimized process parameters and the potential benefit for the company.
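As a rough sketch of the analytical-modelling step (an assumed extended Taylor tool-life form and invented DOE data, not the paper's fitted model), a regression in log space recovers the model constants from cutting speed, feed, and measured tool life:

```python
# Illustrative sketch (assumed model form and invented data, not the paper's
# results): fit an extended Taylor tool-life law  T = C * Vc**a * f**b
# by linear regression in log space.
import numpy as np

Vc = np.array([180., 220., 260., 180., 260.])   # cutting speed, m/min
f  = np.array([0.20, 0.20, 0.20, 0.30, 0.30])   # feed, mm/rev
T  = np.array([42.0, 28.0, 19.0, 33.0, 14.0])   # tool life to the wear limit, min

X = np.column_stack([np.ones_like(Vc), np.log(Vc), np.log(f)])
coeffs, *_ = np.linalg.lstsq(X, np.log(T), rcond=None)
logC, a, b = coeffs

print(f"C={np.exp(logC):.1f}, a={a:.2f}, b={b:.2f}")
print("predicted life at Vc=240, f=0.25:", np.exp(logC) * 240.0**a * 0.25**b)
```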
Laboratory, Field, and Analytical Procedures for Using ...
Regardless of the remedial technology invoked to address contaminated sediments in the environment, there is a critical need to have tools for assessing the effectiveness of the remedy. In the past, these tools have included chemical and biomonitoring of the water column and sediments, toxicity testing and bioaccumulation studies performed on site sediments, and application of partitioning, transport and fate modeling. All of these tools served as lines of evidence for making informed environmental management decisions at contaminated sediment sites. In the last ten years, a new tool for assessing remedial effectiveness has gained a great deal of attention. Passive sampling offers a tool capable of measuring the freely dissolved concentration (Cfree) of legacy contaminants in water and sediments. In addition to assessing the effectiveness of the remedy, passive sampling can be applied for a variety of other contaminated sediment site purposes involved with performing the preliminary assessment and site inspection, conducting the remedial investigation and feasibility study, preparing the remedial design, and assessing the potential for contaminant bioaccumulation. While there is a distinct need for using passive sampling at contaminated sediment sites and several previous documents and research articles have discussed various aspects of passive sampling, there has not been definitive guidance on the laboratory, field and analytical procedures for using passive sampling.
21st century toolkit for optimizing population health through precision nutrition.
O'Sullivan, Aifric; Henrick, Bethany; Dixon, Bonnie; Barile, Daniela; Zivkovic, Angela; Smilowitz, Jennifer; Lemay, Danielle; Martin, William; German, J Bruce; Schaefer, Sara Elizabeth
2017-07-05
Scientific, technological, and economic progress over the last 100 years all but eradicated problems of widespread food shortage and nutrient deficiency in developed nations. But now society is faced with a new set of nutrition problems related to energy imbalance and metabolic disease, which require new kinds of solutions. Recent developments in the area of new analytical tools enable us to systematically study large quantities of detailed and multidimensional metabolic and health data, providing the opportunity to address current nutrition problems through an approach called Precision Nutrition. This approach integrates different kinds of "big data" to expand our understanding of the complexity and diversity of human metabolism in response to diet. With these tools, we can more fully elucidate each individual's unique phenotype, or the current state of health, as determined by the interactions among biology, environment, and behavior. The tools of precision nutrition include genomics, metabolomics, microbiomics, phenotyping, high-throughput analytical chemistry techniques, longitudinal tracking with body sensors, informatics, data science, and sophisticated educational and behavioral interventions. These tools are enabling the development of more personalized and predictive dietary guidance and interventions that have the potential to transform how the public makes food choices and greatly improve population health.
NASA Astrophysics Data System (ADS)
Sarni, W.
2017-12-01
Water scarcity and poor quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. There is a need by the public and private sectors for vastly improved data management and visualization tools. This is the WetDATA opportunity - to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3) global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. ROADMAP * Portal (www.wetdata.org) to provide stakeholders with tools/resources to understand related water risks. * The initial activities will provide education, awareness and tools to stakeholders to support the implementation of the Colorado State Water Plan. * Leverage the Western States Water Council Water Data Exchange database. * Development of visualization, predictive analytics and AI tools to engage with stakeholders and provide actionable data and information. TOOLS Education: Provide information on water issues and risks at the local, state, national and global scale. Visualizations: Development of data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan. Predictive Analytics: Accessing publicly available water databases and using machine learning to develop water availability forecasting tools, and time-lapse images to support city/urban planning.
Analytical Tools in School Finance Reform.
ERIC Educational Resources Information Center
Johns, R. L.
This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…
Cumulative biological impacts framework for solar energy projects in the California Desert
Davis, Frank W.; Kreitler, Jason R.; Soong, Oliver; Stoms, David M.; Dashiell, Stephanie; Hannah, Lee; Wilkinson, Whitney; Dingman, John
2013-01-01
This project developed analytical approaches, tools and geospatial data to support conservation planning for renewable energy development in the California deserts. Research focused on geographical analysis to avoid, minimize and mitigate the cumulative biological effects of utility-scale solar energy development. A hierarchical logic model was created to map the compatibility of new solar energy projects with current biological conservation values. The research indicated that the extent of compatible areas is much greater than the estimated land area required to achieve 2040 greenhouse gas reduction goals. Species distribution models were produced for 65 animal and plant species that were of potential conservation significance to the Desert Renewable Energy Conservation Plan process. These models mapped historical and projected future habitat suitability using 270 meter resolution climate grids. The results were integrated into analytical frameworks to locate potential sites for offsetting project impacts and evaluating the cumulative effects of multiple solar energy projects. Examples applying these frameworks in the Western Mojave Desert ecoregion show the potential of these publicly-available tools to assist regional planning efforts. Results also highlight the necessity to explicitly consider projected land use change and climate change when prioritizing areas for conservation and mitigation offsets. Project data, software and model results are all available online.
Panesar, Sukhmeet S; Rao, Christopher; Vecht, Joshua A; Mirza, Saqeb B; Netuveli, Gopalakrishnan; Morris, Richard; Rosenthal, Joe; Darzi, Ara; Athanasiou, Thanos
2009-10-01
Meta-analyses may be prone to generating misleading results because of a paucity of experimental studies (especially in surgery); publication bias; and heterogeneity in study design, intervention and the patient population of included studies. When investigating a specific clinical or scientific question on which several relevant meta-analyses may have been published, value judgments must be applied to determine which analysis represents the most robust evidence. These value judgments should be specifically acknowledged. We designed the Veritas plot to explicitly explore important elements of quality and to facilitate decision-making by highlighting specific areas in which meta-analyses are found to be deficient. Furthermore, as a graphic tool, it may be more intuitive than when similar data are presented in a tabular or text format. The Veritas plot is an adaptation of the radar plot, a graphic tool for the description of multiattribute data. Key elements of meta-analytical quality such as heterogeneity, publication bias and study design are assessed. Existing qualitative methods such as the Assessment of Multiple Systematic Reviews (AMSTAR) tool have been incorporated, in addition to important considerations when interpreting surgical meta-analyses such as the year of publication and population characteristics. To demonstrate the potential of the Veritas plot to inform clinical practice, we apply the Veritas plot to the meta-analytical literature comparing the incidence of 30-day stroke in off-pump coronary artery bypass surgery and conventional coronary artery bypass surgery. We demonstrate that a visually stimulating and practical evidence-synthesis tool can direct the clinician and scientist to a particular meta-analytical study to inform clinical practice. The Veritas plot is also cumulative and allowed us to assess the quality of evidence over time. We have presented a practical graphic application for scientists and clinicians to identify and interpret variability in meta-analyses. Although further validation of the Veritas plot is required, it may have the potential to contribute to the implementation of evidence-based practice.
Big Data & Learning Analytics: A Potential Way to Optimize eLearning Technological Tools
ERIC Educational Resources Information Center
García, Olga Arranz; Secades, Vidal Alonso
2013-01-01
In the information age, one of the most influential institutions is education. The recent emergence of MOOCS [Massively Open Online Courses] is a sample of the new expectations that are offered to university students. Basing decisions on data and evidence seems obvious, and indeed, research indicates that data-driven decision-making improves…
Networks in Action: New Actors and Practices in Education Policy in Brazil
ERIC Educational Resources Information Center
Shiroma, Eneida Oto
2014-01-01
This paper focuses on the role of networks in the policy-making process in education and discusses the potential of network analysis as an analytical tool for education policy research. Drawing on publically available data from personal or institutional websites, this paper reports the findings from research carried out between 2005 and 2011.…
Code Pulse: Software Assurance (SWA) Visual Analytics for Dynamic Analysis of Code
2014-09-01
31 4.5.1 Market Analysis...competitive market analysis to assess the tool potential. The final transition targets were selected and expressed along with our research on the topic...public release milestones. Details of our testing methodology are in our Software Test Plan deliverable, CP-STP-0001. A summary of this approach is
Big Data Analytics in Healthcare
Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Beard, Daniel A.
2015-01-01
The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957
Aptamer based electrochemical sensors for emerging environmental pollutants
Hayat, Akhtar; Marty, Jean L.
2014-01-01
Environmental contaminants monitoring is one of the key issues in understanding and managing hazards to human health and ecosystems. In this context, aptamer based electrochemical sensors have achieved intense significance because of their capability to resolve a potentially large number of problems and challenges in environmental contamination. An aptasensor is a compact analytical device incorporating an aptamer (oligonulceotide) as the sensing element either integrated within or intimately associated with a physiochemical transducer surface. Nucleic acid is well known for the function of carrying and passing genetic information, however, it has found a key role in analytical monitoring during recent years. Aptamer based sensors represent a novelty in environmental analytical science and there are great expectations for their promising performance as alternative to conventional analytical tools. This review paper focuses on the recent advances in the development of aptamer based electrochemical sensors for environmental applications with special emphasis on emerging pollutants. PMID:25019067
Aptamer based electrochemical sensors for emerging environmental pollutants
NASA Astrophysics Data System (ADS)
Hayat, Akhtar; Marty, Jean Louis
2014-06-01
Environmental contaminants monitoring is one of the key issues in understanding and managing hazards to human health and ecosystems. In this context, aptamer based electrochemical sensors have achieved intense significance because of their capability to resolve a potentially large number of problems and challenges in environmental contamination. An aptasensor is a compact analytical device incorporating an aptamer (oligonulceotide) as the sensing element either integrated within or intimately associated with a physiochemical transducer surface. Nucleic acid is well known for the function of carrying and passing genetic information, however, it has found a key role in analytical monitoring during recent years. Aptamer based sensors represent a novelty in environmental analytical science and there are great expectations for their promising performance as alternative to conventional analytical tools. This review paper focuses on the recent advances in the development of aptamer based electrochemical sensors for environmental applications with special emphasis on emerging pollutants.
Big Data Analytics in Healthcare.
Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan
2015-01-01
The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.
Predicting adverse hemodynamic events in critically ill patients.
Yoon, Joo H; Pinsky, Michael R
2018-06-01
The art of predicting future hemodynamic instability in the critically ill has rapidly become a science with the advent of advanced analytical processes based on computer-driven machine learning techniques. How these methods have progressed beyond severity scoring systems to interface with decision support is summarized. Data mining of large multidimensional clinical time-series databases using a variety of machine learning tools has led to our ability to identify alert artifact and filter it from bedside alarms, display real-time risk stratification at the bedside to aid in clinical decision-making, and predict the subsequent development of cardiorespiratory insufficiency hours before these events occur. This fast-evolving field is primarily limited by the linkage of high-quality granular data to physiologic rationale across heterogeneous clinical care domains. Using advanced analytic tools to glean knowledge from clinical data streams is rapidly becoming a reality whose potential clinical impact is great.
Analytical techniques for characterization of cyclodextrin complexes in aqueous solution: a review.
Mura, Paola
2014-12-01
Cyclodextrins are cyclic oligosaccharides endowed with a hydrophilic outer surface and a hydrophobic inner cavity, able to form inclusion complexes with a wide variety of guest molecules, positively affecting their physicochemical properties. In particular, in the pharmaceutical field, cyclodextrin complexation is mainly used to increase the aqueous solubility and dissolution rate of poorly soluble drugs, and to enhance their bioavailability and stability. Analytical characterization of host-guest interactions is of fundamental importance for fully exploiting the potential benefits of complexation, helping in selection of the most appropriate cyclodextrin. The assessment of the actual formation of a drug-cyclodextrin inclusion complex and its full characterization is not a simple task and often requires the use of different analytical methods, whose results have to be combined and examined together. The purpose of the present review is to give, as much as possible, a general overview of the main analytical tools which can be employed for the characterization of drug-cyclodextrin inclusion complexes in solution, with emphasis on their respective potential merits, disadvantages and limits. Further, the applicability of each examined technique is illustrated and discussed by specific examples from literature. Copyright © 2014 Elsevier B.V. All rights reserved.
Erasmus, Sara Wilhelmina; Muller, Magdalena; van der Rijst, Marieta; Hoffman, Louwrens Christiaan
2016-02-01
Stable isotope ratios ((13)C/(12)C and (15)N/(14)N) of South African Dorper lambs from farms with different vegetation types were measured by isotope ratio mass spectrometry (IRMS), to evaluate it as a tool for the authentication of origin and feeding regime. Homogenised and defatted meat of the Longissimus lumborum (LL) muscle of lambs from seven different farms was assessed. The δ(13)C values were affected by the origin of the meat, mainly reflecting the diet. The Rûens and Free State farms had the lowest (p ⩽ 0.05) δ(15)N values, followed by the Northern Cape farms, with Hantam Karoo/Calvinia having the highest δ(15)N values. Discriminant analysis showed δ(13)C and δ(15)N differences as promising results for the use of IRMS as a reliable analytical tool for lamb meat authentication. The results suggest that diet, linked to origin, is an important factor to consider regarding region of origin classification for South African lamb. Copyright © 2015 Elsevier Ltd. All rights reserved.
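As a schematic sketch of the discriminant-analysis step (invented isotope values, not the study's measurements), region labels can be predicted from δ13C/δ15N pairs with a linear discriminant classifier:

```python
# Schematic sketch with invented values (not the study's data): linear
# discriminant analysis on (delta13C, delta15N) pairs to classify origin.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X = np.array([[-24.1, 6.2], [-23.8, 6.5],      # hypothetical Rûens lambs
              [-18.9, 9.8], [-19.3, 10.1],     # hypothetical Hantam Karoo lambs
              [-21.5, 7.9], [-21.1, 8.2]])     # hypothetical Northern Cape lambs
y = ["Rûens", "Rûens", "Hantam Karoo", "Hantam Karoo",
     "Northern Cape", "Northern Cape"]

lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.predict([[-20.8, 8.0]]))             # predicted region for a new sample
```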
NASA Astrophysics Data System (ADS)
Holman, Hoi-Ying N.; Goth-Goldstein, Regine; Blakely, Elanor A.; Bjornstad, Kathy; Martin, Michael C.; McKinney, Wayne R.
2000-05-01
Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining or labeling, SR-FTIR microscopy probes intact living cells, providing a composite view of all of the molecular responses and the ability to monitor the response over time in the same cell. Observed spectral changes include all types of lesions induced in that cell as well as cellular responses to external and internal stresses. These spectral changes, combined with other analytical tools, may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low doses of chemicals. In this study we used high spatial-resolution SR-FTIR vibrational spectromicroscopy as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in nucleic acids and proteins of cells treated with environmentally relevant concentrations of dioxin. This technique has the potential to distinguish changes from exogenous or endogenous oxidative processes. Future development of this technique will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, biocompatibility of implant materials, cellular repair mechanisms, self-assembly of cellular apparatus, cell differentiation and fetal development.
On Establishing Big Data Wave Breakwaters with Analytics (Invited)
NASA Astrophysics Data System (ADS)
Riedel, M.
2013-12-01
The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g., specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g., with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g., Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g., computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data, as opposed to the more traditional structured databases (e.g., relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offer different views of how approaches of correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.
A Progressive Approach to Teaching Analytics in the Marketing Curriculum
ERIC Educational Resources Information Center
Liu, Yiyuan; Levin, Michael A.
2018-01-01
With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…
Valdez, Rodolfo; Yoon, Paula W; Qureshi, Nadeem; Green, Ridgely Fisk; Khoury, Muin J
2010-01-01
Family history is a risk factor for many chronic diseases, including cancer, cardiovascular disease, and diabetes. Professional guidelines usually include family history to assess health risk, initiate interventions, and motivate behavioral changes. The advantages of family history over other genomic tools include a lower cost, greater acceptability, and a reflection of shared genetic and environmental factors. However, the utility of family history in public health has been poorly explored. To establish family history as a public health tool, it needs to be evaluated within the ACCE framework (analytical validity; clinical validity; clinical utility; and ethical, legal, and social issues). Currently, private and public organizations are developing tools to collect standardized family histories of many diseases. Their goal is to create family history tools that have decision support capabilities and are compatible with electronic health records. These advances will help realize the potential of family history as a public health tool.
ERIC Educational Resources Information Center
Lockhart, Naorah C.
2017-01-01
Group counselors commonly collaborate in interdisciplinary settings in health care, substance abuse, and juvenile justice. Social network analysis is a methodology rarely used in counseling research yet has potential to examine task group dynamics in new ways. This case study explores the scholarly relationships among 36 members of an…
Median of patient results as a tool for assessment of analytical stability.
Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György
2015-06-15
In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
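To make the median-monitoring idea above concrete, the following sketch computes monthly medians of synthetic patient potassium results and flags months whose deviation from a reference median exceeds an allowable-bias specification; the column names, the target value, and the 2.2% limit are illustrative assumptions rather than values from the study.

# Hypothetical sketch of monthly patient-median monitoring against an
# allowable-bias specification derived from biological variation.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
results = pd.DataFrame({
    "date": pd.date_range("2024-01-01", periods=180, freq="D"),
    "potassium_mmol_l": rng.normal(4.2, 0.5, 180),   # synthetic patient results
})

target = 4.2            # assumed stable reference median (mmol/L)
allowable_bias = 0.022  # assumed desirable bias specification (2.2%), illustrative

monthly = results.set_index("date")["potassium_mmol_l"].resample("MS").median()
deviation = (monthly - target) / target
out_of_spec = deviation.abs() > allowable_bias      # months breaching the limit

print(pd.DataFrame({"median": monthly, "deviation": deviation, "out_of_spec": out_of_spec}))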
PHLUX: Photographic Flux Tools for Solar Glare and Flux
DOE Office of Scientific and Technical Information (OSTI.GOV)
2010-12-02
A web-based tool to a) analytically and empirically quantify glare from reflected light and determine the potential impact (e.g., temporary flash blindness, retinal burn), and b) produce flux maps for central receivers. The tool accepts RAW digital photographs of the glare source (for hazard assessment) or the receiver (for flux mapping), as well as a photograph of the sun for intensity and size scaling. For glare hazard assessment, the tool determines the retinal irradiance (W/cm2) and subtended source angle for an observer and plots the glare source on a hazard spectrum (i.e., low potential for flash blindness impact, potential for flash blindness impact, retinal burn). For flux mapping, the tool provides a colored map of the receiver scaled by incident solar flux (W/m2) and unwraps the physical dimensions of the receiver while accounting for the perspective of the photographer (e.g., for a flux map of a cylindrical receiver, the horizontal axis denotes receiver angle in degrees and the vertical axis denotes vertical position in meters; for a flat panel receiver, the horizontal axis denotes horizontal position in meters and the vertical axis denotes vertical position in meters). The flux mapping capability also allows the user to specify transects along which the program plots incident solar flux on the receiver.
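As a rough illustration of the glare-hazard arithmetic described above, the sketch below computes a subtended source angle and a retinal irradiance from a corneal irradiance using standard small-angle optics; the eye-model constants, the hazard inputs, and the example numbers are generic assumptions and are not taken from the PHLUX tool itself.

# Rough glare-hazard arithmetic: subtended angle and retinal irradiance.
# Eye-model constants below are generic assumptions, not PHLUX internals.
import math

def glare_metrics(source_diameter_m, distance_m, corneal_irradiance_w_cm2,
                  pupil_diameter_m=0.002, eye_focal_length_m=0.017,
                  ocular_transmittance=0.5):
    # Subtended source angle (small-angle approximation, radians)
    omega = source_diameter_m / distance_m
    # Diameter of the source image on the retina
    retinal_image_diameter_m = eye_focal_length_m * omega
    # Power entering the pupil, spread over the retinal image area
    power_in_w = corneal_irradiance_w_cm2 * math.pi / 4 * (pupil_diameter_m * 100) ** 2
    image_area_cm2 = math.pi / 4 * (retinal_image_diameter_m * 100) ** 2
    retinal_irradiance_w_cm2 = ocular_transmittance * power_in_w / image_area_cm2
    return omega, retinal_irradiance_w_cm2

omega, e_retina = glare_metrics(source_diameter_m=1.0, distance_m=500.0,
                                corneal_irradiance_w_cm2=1e-3)
print(f"subtended angle = {omega*1e3:.2f} mrad, retinal irradiance = {e_retina:.3g} W/cm^2")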
Fluorescence Spectroscopy for the Monitoring of Food Processes.
Ahmad, Muhammad Haseeb; Sahar, Amna; Hitzmann, Bernd
Different analytical techniques have been used to examine the complexity of food samples. Among them, fluorescence spectroscopy cannot be ignored in developing rapid and non-invasive analytical methodologies. It is one of the most sensitive spectroscopic approaches employed in identification, classification, authentication, quantification, and optimization of different parameters during food handling, processing, and storage and uses different chemometric tools. Chemometrics helps to retrieve useful information from spectral data utilized in the characterization of food samples. This contribution discusses in detail the potential of fluorescence spectroscopy of different foods, such as dairy, meat, fish, eggs, edible oil, cereals, fruit, vegetables, etc., for qualitative and quantitative analysis with different chemometric approaches.
Visualization Techniques for Computer Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beaver, Justin M; Steed, Chad A; Patton, Robert M
2011-01-01
Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.
Experimental and analytical tools for evaluation of Stirling engine rod seal behavior
NASA Technical Reports Server (NTRS)
Krauter, A. I.; Cheng, H. S.
1979-01-01
The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.
Analytics for Cyber Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plantenga, Todd; Kolda, Tamara Gibson
2011-06-01
This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.
Solar Data and Tools: Resources for Researchers, Industry, and Developers
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-04-01
In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.
User-Driven Sampling Strategies in Image Exploitation
Harvey, Neal R.; Porter, Reid B.
2013-12-23
Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. We discovered that user-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. Furthermore, in preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.
User-driven sampling strategies in image exploitation
NASA Astrophysics Data System (ADS)
Harvey, Neal; Porter, Reid
2013-12-01
Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. User-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. In preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.
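The distinction drawn in the two records above, between the computer choosing which sample to label (active learning) and the user choosing (user-driven sampling), can be illustrated with a small sketch; the synthetic data, the logistic-regression learner, and the uncertainty rule are assumptions for illustration only, not the authors' image-analysis application.

# Contrast between uncertainty-driven (active learning) and user-driven sampling.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
# Seed the labeled pool with a few examples of each class
labeled = list(np.where(y == 0)[0][:5]) + list(np.where(y == 1)[0][:5])

def next_index(model, X, labeled, user_choice=None):
    unlabeled = [i for i in range(len(X)) if i not in labeled]
    if user_choice is not None:                 # user-driven: the analyst picks
        return user_choice
    proba = model.predict_proba(X[unlabeled])   # active learning: most uncertain point
    uncertainty = 1.0 - proba.max(axis=1)
    return unlabeled[int(np.argmax(uncertainty))]

model = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
for step in range(20):
    i = next_index(model, X, labeled)           # pass user_choice=i to emulate a user
    labeled.append(i)
    model.fit(X[labeled], y[labeled])
print("training-pool size:", len(labeled))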
Forcisi, Sara; Moritz, Franco; Kanawati, Basem; Tziotis, Dimitrios; Lehmann, Rainer; Schmitt-Kopplin, Philippe
2013-05-31
The present review gives an introduction to the concept of metabolomics and provides an overview of the analytical tools applied in non-targeted metabolomics with a focus on liquid chromatography (LC). LC is a powerful analytical tool in the study of complex sample matrices. A further development and configuration employing Ultra-High Pressure Liquid Chromatography (UHPLC) is optimized to provide the largest known liquid chromatographic resolution and peak capacity. Accordingly, UHPLC plays an important role in separation and consequent metabolite identification of complex molecular mixtures such as bio-fluids. The most sensitive detectors for these purposes are mass spectrometers. Almost any mass analyzer can be optimized to identify and quantify small pre-defined sets of targets; however, the number of analytes in metabolomics is far greater. Optimized protocols for quantification of large sets of targets may be rendered inapplicable. Results on small target set analyses on different sample matrices are easily comparable with each other. In non-targeted metabolomics there is almost no analytical method which is applicable to all different matrices due to limitations pertaining to mass analyzers and chromatographic tools. The specifications of the most important interfaces and mass analyzers are discussed. We additionally provide an exemplary application in order to demonstrate the level of complexity which remains intractable to date. The potential of coupling a high field Fourier Transform Ion Cyclotron Resonance Mass Spectrometer (ICR-FT/MS), the mass analyzer with the largest known mass resolving power, to UHPLC is given with an example of one human pre-treated plasma sample. This experimental example illustrates one way of overcoming the need for faster scanning rates when coupling with UHPLC. The experiment enabled the extraction of thousands of features (analytical signals). A small subset of this compositional space could be mapped into a mass difference network whose topology shows specificity toward putative metabolite classes and retention time. Copyright © 2013 Elsevier B.V. All rights reserved.
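The mass difference network mentioned at the end of the abstract can be sketched as follows: feature masses are linked whenever their pairwise difference matches a known chemical transformation within a ppm tolerance. The feature list and the tolerance below are invented examples; the transformation masses are standard monoisotopic values.

# Toy mass-difference network: connect features whose mass difference matches
# a known transformation (CH2, O, H2) within a ppm tolerance.
import itertools
import networkx as nx

transformations = {            # monoisotopic mass differences (Da)
    "CH2": 14.015650,
    "O":   15.994915,
    "H2":   2.015650,
}
features = [200.104, 214.120, 216.099, 216.135, 228.135]   # hypothetical neutral masses
ppm_tol = 5.0

g = nx.Graph()
g.add_nodes_from(features)
for a, b in itertools.combinations(features, 2):
    delta = abs(a - b)
    for name, mass in transformations.items():
        if abs(delta - mass) <= ppm_tol * 1e-6 * max(a, b):
            g.add_edge(a, b, transformation=name)

print(g.number_of_nodes(), "features,", g.number_of_edges(), "mass-difference edges")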
Electrostatic Discharge Issues in International Space Station Program EVAs
NASA Technical Reports Server (NTRS)
Bacon, John B.
2009-01-01
EVA activity in the ISS program encounters several dangerous ESD conditions. The ISS program has worked aggressively for many years to find ways to mitigate or eliminate the associated risks. Investments have included: (1) Major mods to EVA tools, suit connectors & analytical tools (2) Floating Potential Measurement Unit (3) Plasma Contactor Units (4) Certification of new ISS flight attitudes (5) Teraflops of computation (6) Thousands of hours of work by scores of specialists (7) Monthly management attention at the highest program levels. The risks are now mitigated to a level that is orders of magnitude safer than prior operations.
Mining Mathematics in Textbook Lessons
ERIC Educational Resources Information Center
Ronda, Erlina; Adler, Jill
2017-01-01
In this paper, we propose an analytic tool for describing the mathematics made available to learn in a "textbook lesson". The tool is an adaptation of the Mathematics Discourse in Instruction (MDI) analytic tool that we developed to analyze what is made available to learn in teachers' lessons. Our motivation to adapt the use of the MDI…
Fire behavior modeling-a decision tool
Jack Cohen; Bill Bradshaw
1986-01-01
The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...
Guidance for the Design and Adoption of Analytic Tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bandlow, Alisa
2015-12-01
The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.
Electrochemical detection for microscale analytical systems: a review.
Wang, Joseph
2002-02-11
As the field of chip-based microscale systems continues its rapid growth, there are urgent needs for developing compatible detection modes. Electrochemical detection offers considerable promise for such microfluidic systems, with features that include remarkable sensitivity, inherent miniaturization and portability, independence of optical path length or sample turbidity, low cost, low-power requirements and high compatibility with advanced micromachining and microfabrication technologies. This paper highlights recent advances, directions and key strategies in controlled-potential electrochemical detectors for miniaturized analytical systems. Subjects covered include the design and integration of the electrochemical detection system, its requirements and operational principles, common electrode materials, derivatization reactions, electrical-field decouplers, typical applications and future prospects. It is expected that electrochemical detection will become a powerful tool for microscale analytical systems and will facilitate the creation of truly portable (and possibly disposable) devices.
NASA Astrophysics Data System (ADS)
Abdolkader, Tarek M.; Shaker, Ahmed; Alahmadi, A. N. M.
2018-07-01
With the continuous miniaturization of electronic devices, quantum-mechanical effects such as tunneling become more effective in many device applications. In this paper, a numerical simulation tool is developed under a MATLAB environment to calculate the tunneling probability and current through an arbitrary potential barrier comparing three different numerical techniques: the finite difference method, transfer matrix method, and transmission line method. For benchmarking, the tool is applied to many case studies such as the rectangular single barrier, rectangular double barrier, and continuous bell-shaped potential barrier, each compared to analytical solutions and giving the dependence of the error on the number of mesh points. In addition, a thorough study of the J–V characteristics of MIM and MIIM diodes, used as rectifiers for rectenna solar cells, is presented and simulations are compared to experimental results showing satisfactory agreement. On the undergraduate level, the tool provides a deeper insight for students to compare numerical techniques used to solve various tunneling problems and helps students to choose a suitable technique for a certain application.
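As a sketch of one of the three techniques compared above, the transfer matrix method for a piecewise-constant potential can be checked against the textbook transmission formula for a rectangular barrier; the units (hbar = m = 1) and the barrier parameters are arbitrary choices, and this is a generic construction rather than the paper's MATLAB tool.

# Transfer-matrix transmission through a piecewise-constant barrier, checked
# against the analytical rectangular-barrier formula (E < V0). hbar = m = 1.
import numpy as np

HBAR = 1.0
MASS = 1.0

def transmission_tmm(energy, x_edges, potentials):
    """Piecewise-constant potential: len(potentials) = len(x_edges) + 1."""
    k = np.sqrt(2 * MASS * (energy - np.asarray(potentials, dtype=complex))) / HBAR
    m_total = np.eye(2, dtype=complex)
    for j, x in enumerate(x_edges):
        def basis(kj):
            # Columns: right- and left-going plane waves and their derivatives at x
            return np.array([[np.exp(1j * kj * x), np.exp(-1j * kj * x)],
                             [1j * kj * np.exp(1j * kj * x), -1j * kj * np.exp(-1j * kj * x)]])
        m_total = m_total @ np.linalg.solve(basis(k[j]), basis(k[j + 1]))
    t = 1.0 / m_total[0, 0]                       # transmitted amplitude for unit incidence
    return (k[-1].real / k[0].real) * abs(t) ** 2

def transmission_exact(energy, v0, width):
    kappa = np.sqrt(2 * MASS * (v0 - energy)) / HBAR      # valid for energy < v0
    return 1.0 / (1.0 + v0 ** 2 * np.sinh(kappa * width) ** 2 / (4 * energy * (v0 - energy)))

v0, width, energy = 5.0, 1.0, 2.0
print("TMM   :", transmission_tmm(energy, [0.0, width], [0.0, v0, 0.0]))
print("exact :", transmission_exact(energy, v0, width))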
SAM Radiochemical Methods Query
Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.
Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science
NASA Astrophysics Data System (ADS)
Riedel, Morris; Ramachandran, Rahul; Baumann, Peter
2014-05-01
The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques reach from hardware deployment models up to various different algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution will outline initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, which also provides insights into a few use cases in detail.
Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science
NASA Technical Reports Server (NTRS)
Riedel, Morris; Ramachandran, Rahul; Baumann, Peter
2014-01-01
The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques reach from hardware deployment models up to various different algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution will outline initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, which also provides insights into a few use cases in detail.
IUS solid rocket motor contamination prediction methods
NASA Technical Reports Server (NTRS)
Mullen, C. R.; Kearnes, J. H.
1980-01-01
A series of computer codes were developed to predict solid rocket motor produced contamination to spacecraft sensitive surfaces. Subscale and flight test data have confirmed some of the analytical results. Application of the analysis tools to a typical spacecraft has provided early identification of potential spacecraft contamination problems and provided insight into their solution; e.g., flight plan modifications, plume or outgassing shields and/or contamination covers.
NASA Technical Reports Server (NTRS)
Barber, Peter W.; Demerdash, Nabeel A. O.; Wang, R.; Hurysz, B.; Luo, Z.
1991-01-01
The goal is to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) develop analytical tools (models and computer programs); (2) conduct parameterization studies; (3) predict the global space station EMI environment; and (4) provide a basis for modification of EMI standards.
A review of recent advances in risk analysis for wildfire management
Carol Miller; Alan A. Ager
2012-01-01
Risk analysis evolved out of the need to make decisions concerning highly stochastic events, and is well suited to analyze the timing, location and potential effects of wildfires. Over the past 10 years, the application of risk analysis to wildland fire management has seen steady growth with new risk-based analytical tools that support a wide range of fire and fuels...
Tomaselli Muensterman, Elena; Tisdale, James E
2018-06-08
Prolongation of the heart rate-corrected QT (QTc) interval increases the risk for torsades de pointes (TdP), a potentially fatal arrhythmia. The likelihood of TdP is higher in patients with risk factors, which include female sex, older age, heart failure with reduced ejection fraction, hypokalemia, hypomagnesemia, concomitant administration of ≥ 2 QTc interval-prolonging medications, among others. Assessment and quantification of risk factors may facilitate prediction of patients at highest risk for developing QTc interval prolongation and TdP. Investigators have utilized the field of predictive analytics, which generates predictions using techniques including data mining, modeling, machine learning, and others, to develop methods of risk quantification and prediction of QTc interval prolongation. Predictive analytics have also been incorporated into clinical decision support (CDS) tools to alert clinicians regarding patients at increased risk of developing QTc interval prolongation. The objectives of this paper are to assess the effectiveness of predictive analytics for identification of patients at risk of drug-induced QTc interval prolongation, and to discuss the efficacy of incorporation of predictive analytics into CDS tools in clinical practice. A systematic review of English language articles (human subjects only) was performed, yielding 57 articles, with an additional 4 articles identified from other sources; a total of 10 articles were included in this review. Risk scores for QTc interval prolongation have been developed in various patient populations including those in cardiac intensive care units (ICUs) and in broader populations of hospitalized or health system patients. One group developed a risk score that includes information regarding genetic polymorphisms; this score significantly predicted TdP. Development of QTc interval prolongation risk prediction models and incorporation of these models into CDS tools reduces the risk of QTc interval prolongation in cardiac ICUs and identifies health-system patients at increased risk for mortality. The impact of these QTc interval prolongation predictive analytics on overall patient safety outcomes, such as TdP and sudden cardiac death relative to the cost of development and implementation, requires further study. This article is protected by copyright. All rights reserved.
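A purely illustrative additive score of the kind the abstract surveys is sketched below; the risk factors mirror those listed above, but the point values and the tier cutoffs are hypothetical placeholders, not a validated clinical instrument.

# Hypothetical additive QTc-prolongation risk score (weights and cutoffs invented).
RISK_POINTS = {
    "female_sex": 1,
    "age_ge_68": 1,
    "hf_reduced_ef": 3,
    "hypokalemia": 2,
    "hypomagnesemia": 2,
    "qtc_prolonging_drugs_ge_2": 3,
}

def qtc_risk_score(patient):
    """patient: dict of factor -> bool. Returns (score, tier)."""
    score = sum(points for factor, points in RISK_POINTS.items() if patient.get(factor))
    tier = "high" if score >= 7 else "moderate" if score >= 3 else "low"  # assumed cutoffs
    return score, tier

example = {"female_sex": True, "hypokalemia": True, "qtc_prolonging_drugs_ge_2": True}
print(qtc_risk_score(example))   # e.g. the kind of output a CDS alert might act on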
Bioinspired Methodology for Artificial Olfaction
Raman, Baranidharan; Hertz, Joshua L.; Benkstein, Kurt D.; Semancik, Steve
2008-01-01
Artificial olfaction is a potential tool for noninvasive chemical monitoring. Application of “electronic noses” typically involves recognition of “pretrained” chemicals, while long-term operation and generalization of training to allow chemical classification of “unknown” analytes remain challenges. The latter analytical capability is critically important, as it is unfeasible to pre-expose the sensor to every analyte it might encounter. Here, we demonstrate a biologically inspired approach where the recognition and generalization problems are decoupled and resolved in a hierarchical fashion. Analyte composition is refined in a progression from general (e.g., target is a hydrocarbon) to precise (e.g., target is ethane), using highly optimized response features for each step. We validate this approach using a MEMS-based chemiresistive microsensor array. We show that this approach, a unique departure from existing methodologies in artificial olfaction, allows the recognition module to better mitigate sensor-aging effects and to better classify unknowns, enhancing the utility of chemical sensors for real-world applications. PMID:18855409
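The general-to-precise hierarchy described above can be mocked up as a coarse classifier followed by family-specific classifiers; the synthetic features, the class names, and the random-forest choice are assumptions for illustration, not the paper's sensor-array pipeline.

# Coarse-to-fine (general-to-precise) classification sketch.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 16))                        # stand-in sensor-array features
family = rng.choice(["hydrocarbon", "alcohol"], 300)  # coarse labels
species = np.where(family == "hydrocarbon",
                   rng.choice(["methane", "ethane"], 300),
                   rng.choice(["methanol", "ethanol"], 300))

coarse = RandomForestClassifier(random_state=0).fit(X, family)
fine = {f: RandomForestClassifier(random_state=0).fit(X[family == f], species[family == f])
        for f in np.unique(family)}

def classify(x):
    fam = coarse.predict(x.reshape(1, -1))[0]             # general step
    return fam, fine[fam].predict(x.reshape(1, -1))[0]    # precise step

print(classify(X[0]))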
Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A
2010-01-01
A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage(©) guides users through the collection of their family health history by relative, generates a pedigree, completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine. Copyright © 2010 S. Karger AG, Basel.
Perspectives on bioanalytical mass spectrometry and automation in drug discovery.
Janiszewski, John S; Liston, Theodore E; Cole, Mark J
2008-11-01
The use of high speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process from early stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree. This includes both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skillsets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.
NASA Astrophysics Data System (ADS)
Stewart, R.; Piburn, J.; Sorokine, A.; Myers, A.; Moehl, J.; White, D.
2015-07-01
The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings.
Evidence-based pathology in its second decade: toward probabilistic cognitive computing.
Marchevsky, Alberto M; Walts, Ann E; Wick, Mark R
2017-03-01
Evidence-based pathology advocates using a combination of best available data ("evidence") from the literature and personal experience for the diagnosis, estimation of prognosis, and assessment of other variables that impact individual patient care. Evidence-based pathology relies on systematic reviews of the literature, evaluation of the quality of evidence as categorized by evidence levels and statistical tools such as meta-analyses, estimates of probabilities and odds, and others. However, it is well known that previously "statistically significant" information usually does not accurately forecast the future for individual patients. There is great interest in "cognitive computing" in which "data mining" is combined with "predictive analytics" designed to forecast future events and estimate the strength of those predictions. This study demonstrates the use of IBM Watson Analytics software to evaluate and predict the prognosis of 101 patients with typical and atypical pulmonary carcinoid tumors in which Ki-67 indices have been determined. The results obtained with this system are compared with those previously reported using "routine" statistical software and the help of a professional statistician. IBM Watson Analytics interactively provides statistical results that are comparable to those obtained with routine statistical tools but much more rapidly, with considerably less effort and with interactive graphics that are intuitively easy to apply. It also enables analysis of natural language variables and yields detailed survival predictions for patient subgroups selected by the user. Potential applications of this tool and basic concepts of cognitive computing are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.
Fock space, symbolic algebra, and analytical solutions for small stochastic systems.
Santos, Fernando A N; Gadêlha, Hermes; Gaffney, Eamonn A
2015-12-01
Randomness is ubiquitous in nature. From single-molecule biochemical reactions to macroscale biological systems, stochasticity permeates individual interactions and often regulates emergent properties of the system. While such systems are regularly studied from a modeling viewpoint using stochastic simulation algorithms, numerous potential analytical tools can be inherited from statistical and quantum physics, replacing randomness due to quantum fluctuations with low-copy-number stochasticity. Nevertheless, classical studies remained limited to the abstract level, demonstrating a more general applicability and equivalence between systems in physics and biology rather than exploiting the physics tools to study biological systems. Here the Fock space representation, used in quantum mechanics, is combined with the symbolic algebra of creation and annihilation operators to consider explicit solutions for the chemical master equations describing small, well-mixed, biochemical, or biological systems. This is illustrated with an exact solution for a Michaelis-Menten single enzyme interacting with limited substrate, including a consideration of very short time scales, which emphasizes when stiffness is present even for small copy numbers. Furthermore, we present a general matrix representation for Michaelis-Menten kinetics with an arbitrary number of enzymes and substrates that, following diagonalization, leads to the solution of this ubiquitous, nonlinear enzyme kinetics problem. For this, a flexible symbolic Maple code is provided, demonstrating the prospective advantages of this framework compared to stochastic simulation algorithms. This further highlights the possibilities for analytically based studies of stochastic systems in biology and chemistry using tools from theoretical quantum physics.
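Complementing the symbolic treatment above, the chemical master equation for a single-enzyme Michaelis-Menten system can also be solved directly by enumerating the small state space and exponentiating the generator matrix; the rate constants and copy numbers below are arbitrary example values, and this is a generic construction rather than the authors' Maple code.

# Finite-state chemical master equation for E + S <-> C -> E + P with one enzyme
# and a small substrate pool, solved by matrix exponentiation of the generator.
import numpy as np
from scipy.linalg import expm

k1, k2, k3 = 1.0, 0.5, 0.8      # binding, unbinding, catalysis rates (assumed)
S0 = 5                          # initial substrate copies

# States (s, c): s = free substrate, c in {0, 1} = enzyme-substrate complex
states = [(s, c) for s in range(S0 + 1) for c in (0, 1) if s + c <= S0]
index = {st: i for i, st in enumerate(states)}

Q = np.zeros((len(states), len(states)))
for (s, c), i in index.items():
    if s > 0 and c == 0:                      # E + S -> C
        Q[index[(s - 1, 1)], i] += k1 * s
        Q[i, i] -= k1 * s
    if c == 1:                                # C -> E + S
        Q[index[(s + 1, 0)], i] += k2
        Q[i, i] -= k2
    if c == 1:                                # C -> E + P
        Q[index[(s, 0)], i] += k3
        Q[i, i] -= k3

p0 = np.zeros(len(states)); p0[index[(S0, 0)]] = 1.0
p_t = expm(Q * 2.0) @ p0                      # probability distribution at t = 2
print("P(all substrate converted) =", p_t[index[(0, 0)]])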
A visual analytics approach for pattern-recognition in patient-generated data.
Feller, Daniel J; Burgermaster, Marissa; Levine, Matthew E; Smaldone, Arlene; Davidson, Patricia G; Albers, David J; Mamykina, Lena
2018-06-13
To develop and test a visual analytics tool to help clinicians identify systematic and clinically meaningful patterns in patient-generated data (PGD) while decreasing perceived information overload. Participatory design was used to develop Glucolyzer, an interactive tool featuring hierarchical clustering and a heatmap visualization to help registered dietitians (RDs) identify associative patterns between blood glucose levels and per-meal macronutrient composition for individuals with type 2 diabetes (T2DM). Ten RDs participated in a within-subjects experiment to compare Glucolyzer to a static logbook format. For each representation, participants had 25 minutes to examine 1 month of diabetes self-monitoring data captured by an individual with T2DM and identify clinically meaningful patterns. We compared the quality and accuracy of the observations generated using each representation. Participants generated 50% more observations when using Glucolyzer (98) than when using the logbook format (64) without any loss in accuracy (69% accuracy vs 62%, respectively, p = .17). Participants identified more observations that included ingredients other than carbohydrates using Glucolyzer (36% vs 16%, p = .027). Fewer RDs reported feelings of information overload using Glucolyzer compared to the logbook format. Study participants displayed variable acceptance of hierarchical clustering. Visual analytics have the potential to mitigate provider concerns about the volume of self-monitoring data. Glucolyzer helped dietitians identify meaningful patterns in self-monitoring data without incurring perceived information overload. Future studies should assess whether similar tools can support clinicians in personalizing behavioral interventions that improve patient outcomes.
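A minimal sketch of the clustering-plus-heatmap idea behind the tool described above is given below, with random per-meal macronutrient rows ordered by hierarchical clustering and shown next to a synthetic post-meal glucose response; all data, labels, and plotting choices are placeholders and do not reproduce Glucolyzer itself.

# Hierarchical clustering of per-meal macronutrient rows plus a heatmap and
# a bar panel for the post-meal glucose change (synthetic data throughout).
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, leaves_list

rng = np.random.default_rng(1)
meals = rng.uniform(0, 1, size=(30, 3))                    # carbs, protein, fat (normalized)
glucose_delta = 80 * meals[:, 0] + rng.normal(0, 10, 30)   # synthetic response

order = leaves_list(linkage(meals, method="ward"))         # row order from clustering

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(6, 5),
                               gridspec_kw={"width_ratios": [3, 1]})
ax1.imshow(meals[order], aspect="auto", cmap="viridis")
ax1.set_xticks(range(3))
ax1.set_xticklabels(["carbs", "protein", "fat"])
ax1.set_ylabel("meal (cluster order)")
ax2.barh(range(30), glucose_delta[order])
ax2.set_xlabel("post-meal glucose change")
plt.tight_layout()
plt.show()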
Charmaz, Kathy
2015-12-01
This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.
Code of Federal Regulations, 2010 CFR
2010-07-01
Section 102-80.120: What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life Safety), should be used to support the life safety equivalency evaluation.
Code of Federal Regulations, 2011 CFR
2011-01-01
Section 102-80.120: What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life Safety), should be used to support the life safety equivalency evaluation.
EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.
A graph algebra for scalable visual analytics.
Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V
2012-01-01
Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
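The two atomic operators named above, selection and aggregation, can be rendered in a toy form with networkx; the example graph, the attribute name, and the weighting rule are assumptions for illustration and do not reproduce the paper's formal algebra.

# Toy selection (induced subgraph by predicate) and aggregation (attribute
# contraction into weighted supernodes) operators on a small graph.
import networkx as nx

g = nx.Graph()
g.add_nodes_from([(1, {"dept": "A"}), (2, {"dept": "A"}),
                  (3, {"dept": "B"}), (4, {"dept": "B"})])
g.add_edges_from([(1, 2), (1, 3), (2, 4), (3, 4)])

def select(graph, predicate):
    """Selection: induced subgraph of nodes satisfying a predicate."""
    return graph.subgraph([n for n, d in graph.nodes(data=True) if predicate(d)]).copy()

def aggregate(graph, key):
    """Aggregation: collapse nodes sharing an attribute into weighted supernodes."""
    agg = nx.Graph()
    for u, v in graph.edges():
        a, b = graph.nodes[u][key], graph.nodes[v][key]
        if a != b:
            w = agg.get_edge_data(a, b, {"weight": 0})["weight"]
            agg.add_edge(a, b, weight=w + 1)
    return agg

print(select(g, lambda d: d["dept"] == "A").edges())
print(aggregate(g, "dept").edges(data=True))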
Can we use high precision metal isotope analysis to improve our understanding of cancer?
Larner, Fiona
2016-01-01
High precision natural isotope analyses are widely used in geosciences to trace elemental transport pathways. The use of this analytical tool is increasing in nutritional and disease-related research. In recent months, a number of groups have shown the potential this technique has in providing new observations for various cancers when applied to trace metal metabolism. The deconvolution of isotopic signatures, however, relies on mathematical models and geochemical data, which are not representative of the system under investigation. In addition to relevant biochemical studies of protein-metal isotopic interactions, technological development both in terms of sample throughput and detection sensitivity of these elements is now needed to translate this novel approach into a mainstream analytical tool. Following this, essential background healthy population studies must be performed, alongside observational, cross-sectional disease-based studies. Only then can the sensitivity and specificity of isotopic analyses be tested alongside currently employed methods, and important questions such as the influence of cancer heterogeneity and disease stage on isotopic signatures be addressed.
NASA Astrophysics Data System (ADS)
Reynerson, Charles Martin
This research has been performed to create concept design and economic feasibility data for space business parks. A space business park is a commercially run multi-use space station facility designed for use by a wide variety of customers. Both space hardware and crew are considered as revenue producing payloads. Examples of commercial markets may include biological and materials research, processing, and production, space tourism habitats, and satellite maintenance and resupply depots. This research develops a design methodology and an analytical tool to create feasible preliminary design information for space business parks. The design tool is validated against a number of real facility designs. Appropriate model variables are adjusted to ensure that statistical approximations are valid for subsequent analyses. The tool is used to analyze the effect of various payload requirements on the size, weight and power of the facility. The approach for the analytical tool was to input potential payloads as simple requirements, such as volume, weight, power, crew size, and endurance. In creating the theory, basic principles are used and combined with parametric estimation of data when necessary. Key system parameters are identified for overall system design. Typical ranges for these key parameters are identified based on real human spaceflight systems. To connect the economics to design, a life-cycle cost model is created based upon facility mass. This rough cost model estimates potential return on investments, initial investment requirements and number of years to return on the initial investment. Example cases are analyzed for both performance and cost driven requirements for space hotels, microgravity processing facilities, and multi-use facilities. In combining both engineering and economic models, a design-to-cost methodology is created for more accurately estimating the commercial viability for multiple space business park markets.
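The parametric sizing and life-cycle-cost arithmetic described above might look roughly like the sketch below; every coefficient (structural mass per unit volume, power-system specific mass, cost per kilogram, revenue and operations figures) is a made-up placeholder rather than a value from the research.

# Rough parametric sizing and payback sketch for a multi-use orbital facility.
def size_facility(volume_m3, payload_mass_kg, power_kw, crew):
    structure_kg = 65.0 * volume_m3          # assumed structural mass density
    power_kg = 30.0 * power_kw               # assumed power-system specific mass
    crew_systems_kg = 1500.0 * crew          # assumed life-support/habitation allocation
    return structure_kg + power_kg + crew_systems_kg + payload_mass_kg

def economics(facility_mass_kg, annual_revenue_usd, annual_ops_usd,
              cost_per_kg_usd=20_000.0):     # assumed build-plus-launch cost rate
    investment = cost_per_kg_usd * facility_mass_kg
    net_per_year = annual_revenue_usd - annual_ops_usd
    payback_years = investment / net_per_year if net_per_year > 0 else float("inf")
    return investment, payback_years

mass = size_facility(volume_m3=400, payload_mass_kg=8_000, power_kw=60, crew=4)
investment, payback = economics(mass, annual_revenue_usd=150e6, annual_ops_usd=60e6)
print(f"facility mass ~{mass:,.0f} kg, investment ~${investment/1e9:.2f}B, "
      f"payback ~{payback:.1f} years")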
Coastal On-line Assessment and Synthesis Tool 2.0
NASA Technical Reports Server (NTRS)
Brown, Richard; Navard, Andrew; Nguyen, Beth
2011-01-01
COAST (Coastal On-line Assessment and Synthesis Tool) is a 3D, open-source Earth data browser developed by leveraging and enhancing previous NASA open-source tools. These tools use satellite imagery and elevation data in a way that allows any user to zoom from orbit view down into any place on Earth, and enables the user to experience Earth terrain in a visually rich 3D view. The benefits associated with taking advantage of an open-source geo-browser are that it is free, extensible, and offers a worldwide developer community that is available to provide additional development and improvement potential. What makes COAST unique is that it simplifies the process of locating and accessing data sources, and allows a user to combine them into a multi-layered and/or multi-temporal visual analytical look into possible data interrelationships and coeffectors for coastal environment phenomenology. COAST provides users with new data visual analytic capabilities. COAST has been upgraded to maximize use of open-source data access, viewing, and data manipulation software tools. The COAST 2.0 toolset has been developed to increase access to a larger realm of the most commonly implemented data formats used by the coastal science community. New and enhanced functionalities that upgrade COAST to COAST 2.0 include the development of the Temporal Visualization Tool (TVT) plug-in, the Recursive Online Remote Data-Data Mapper (RECORD-DM) utility, the Import Data Tool (IDT), and the Add Points Tool (APT). With these improvements, users can integrate their own data with other data sources, and visualize the resulting layers of different data types (such as spatial and spectral, for simultaneous visual analysis), and visualize temporal changes in areas of interest.
Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham
2007-01-01
This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of the samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable optimized pre-analytical phase. The aim of the comparison was: (a) to identify those medical practices whose mean/median sample values significantly deviate from those of the control situation in the hospital laboratory due to the possible problems in the pre-analytical phase; (b) to aid these laboratories in the process of rectifying these problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above mentioned problems. It has been tested on serum potassium which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.
EPA’s Environmental Sampling and Analytical Methods (ESAM) is a website tool that supports the entire environmental characterization process from collection of samples all the way to their analyses.
ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...
Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest has been the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between these compounds and endocrine disrupting compounds (EDCs) and between PPCPs and EDCs and the more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown and MS/MS has become a detection technique of choice with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the bench mark approach of capillary GC, GC/MS and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry, modernized the range of tools appli
Biokinetics of Nanomaterials: the Role of Biopersistence.
Laux, Peter; Riebeling, Christian; Booth, Andy M; Brain, Joseph D; Brunner, Josephine; Cerrillo, Cristina; Creutzenberg, Otto; Estrela-Lopis, Irina; Gebel, Thomas; Johanson, Gunnar; Jungnickel, Harald; Kock, Heiko; Tentschert, Jutta; Tlili, Ahmed; Schäffer, Andreas; Sips, Adriënne J A M; Yokel, Robert A; Luch, Andreas
2017-04-01
Nanotechnology risk management strategies and environmental regulations continue to rely on hazard and exposure assessment protocols developed for bulk materials, including larger size particles, while commercial application of nanomaterials (NMs) increases. In order to support and corroborate risk assessment of NMs for workers, consumers, and the environment it is crucial to establish the impact of biopersistence of NMs at realistic doses. In the future, such data will allow a more refined future categorization of NMs. Despite many experiments on NM characterization and numerous in vitro and in vivo studies, several questions remain unanswered including the influence of biopersistence on the toxicity of NMs. It is unclear which criteria to apply to characterize a NM as biopersistent. Detection and quantification of NMs, especially determination of their state, i.e., dissolution, aggregation, and agglomeration within biological matrices and other environments are still challenging tasks; moreover mechanisms of nanoparticle (NP) translocation and persistence remain critical gaps. This review summarizes the current understanding of NM biokinetics focusing on determinants of biopersistence. Thorough particle characterization in different exposure scenarios and biological matrices requires use of suitable analytical methods and is a prerequisite to understand biopersistence and for the development of appropriate dosimetry. Analytical tools that potentially can facilitate elucidation of key NM characteristics, such as ion beam microscopy (IBM) and time-of-flight secondary ion mass spectrometry (ToF-SIMS), are discussed in relation to their potential to advance the understanding of biopersistent NM kinetics. We conclude that a major requirement for future nanosafety research is the development and application of analytical tools to characterize NPs in different exposure scenarios and biological matrices.
A history of development in rotordynamics: A manufacturer's perspective
NASA Technical Reports Server (NTRS)
Shemeld, David E.
1987-01-01
The subject of rotordynamics and instability problems in high performance turbomachinery has been a topic of considerable industry discussion and debate over the last 15 or so years. This paper reviews an original equipment manufacturer's history of development of concepts and equipment as applicable to multistage centrifugal compressors. The variety of industry user compression requirements and resultant problematical situations tends to confound many of the theories and analytical techniques set forth. The experiences and examples described herein support the conclusion that successfully addressing potential rotordynamics problems is best served by a fundamental knowledge of the specific equipment, in addition to having the appropriate analytical tools, and that the final proof is in the doing.
Towards a minimally invasive sampling tool for high resolution tissue analytical mapping
NASA Astrophysics Data System (ADS)
Gottardi, R.
2015-09-01
Multiple spatial mapping techniques of biological tissues have been proposed over the years, but all present limitations either in terms of resolution, analytical capacity or invasiveness. Ren et al (2015 Nanotechnology 26 284001) propose in their most recent work the use of a picosecond infrared laser (PIRL) under conditions of ultrafast desorption by impulsive vibrational excitation (DIVE) to extract small amounts of cellular and molecular components, conserving their viability, structure and activity. The PIRL DIVE technique would then work as a nanobiopsy with minimal damage to the surrounding tissues, which could potentially be applied for high resolution local structural characterization of tissues in health and disease with the spatial limit determined by the laser focus.
Richter, Janine; Fettig, Ina; Philipp, Rosemarie; Jakubowski, Norbert
2015-07-01
Tributyltin is listed as one of the priority substances in the European Water Framework Directive (WFD). Despite its decreasing input in the environment, it is still present and has to be monitored. In the European Metrology Research Programme project ENV08, a sensitive and reliable analytical method according to the WFD was developed to quantify this environmental pollutant at a very low limit of quantification. With the development of such a primary reference method for tributyltin, the project helped to improve the quality and comparability of monitoring data. An overview of project aims and potential analytical tools is given.
Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B
2018-05-30
Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
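The control-chart analysis referred to above can be illustrated with an individuals (XmR) chart on a monthly series; the length-of-stay numbers and the choice of baseline period are invented, while the 2.66 x mean-moving-range limits follow the standard XmR formula.

# Individuals (XmR) control chart on a synthetic monthly length-of-stay series.
import numpy as np

monthly_mean_los_min = np.array([182, 175, 178, 171, 160, 149, 151, 148, 150, 146, 144])

baseline = monthly_mean_los_min[:4]                 # assumed pre-change baseline period
center = baseline.mean()
mr_bar = np.abs(np.diff(baseline)).mean()           # mean moving range
ucl, lcl = center + 2.66 * mr_bar, center - 2.66 * mr_bar

signals = np.where(monthly_mean_los_min < lcl)[0]   # points beyond the lower limit
print(f"centre={center:.1f}, limits=({lcl:.1f}, {ucl:.1f}),"
      f" first signal at month index {signals[0] if signals.size else None}")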
Total Quality Management (TQM), an Overview
1991-09-01
This report provides an overview of Total Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring the quality to knowledge
Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth
NASA Technical Reports Server (NTRS)
Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)
2001-01-01
The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Elastic-plastic fracture analysis is a regime for which analytical tools are lacking; for the Space Shuttle External Tank (ET) it is currently handled empirically through simulated service testing of pre-cracked panels.
Horizon Missions Methodology - Using new paradigms to overcome conceptual blocks to innovation
NASA Technical Reports Server (NTRS)
Anderson, John L.
1993-01-01
The Horizon Mission Methodology was developed to provide a systematic analytical approach for evaluating and identifying technological requirements for breakthrough technology options (BTOs) and for assessing their potential to provide revolutionary capabilities for advanced space missions. Here, attention is given to the further use of the methodology as a new tool for a broader range of studies dealing with technology innovation and new technology paradigms.
Medical Applications of Laser Induced Breakdown Spectroscopy
NASA Astrophysics Data System (ADS)
Pathak, A. K.; Rai, N. K.; Singh, Ankita; Rai, A. K.; Rai, Pradeep K.; Rai, Pramod K.
2014-11-01
The sedentary lifestyle of human beings has resulted in various diseases, and in turn we require a potential tool that can address various issues related to human health. Laser-Induced Breakdown Spectroscopy (LIBS) is one such optical analytical tool that has become quite popular because of its distinctive features, including applicability to any type or phase of sample with almost no sample preparation. Several reports discuss the capabilities of LIBS for applications in different branches of science that cannot be addressed by traditional analytical methods, but only a few reports address the medical applications of LIBS. In the present work, LIBS has been used to understand the role of various elements in the formation of gallstones (formed under the empyema and mucocele states of the gallbladder), which were collected, along with patient histories, from the Purvanchal region of Uttar Pradesh, India. The occurrence statistics of gallstones in the present study reveal a higher occurrence of gallstones in female patients. Gallstone occurrence was more prevalent among male patients who habitually chewed tobacco, smoked, or drank alcohol. This work further reports an in-situ LIBS study of a deciduous tooth and an in-vivo LIBS study of human nails.
Gil, F; Hernández, A F
2015-06-01
Human biomonitoring has become an important tool for the assessment of internal doses of metallic and metalloid elements. These elements are of great significance because of their toxic properties and wide distribution in environmental compartments. Although blood and urine are the most used and accepted matrices for human biomonitoring, other non-conventional samples (saliva, placenta, meconium, hair, nails, teeth, breast milk) may have practical advantages and would provide additional information on health risk. Nevertheless, the analysis of these elements in biological matrices other than blood and urine has not yet been accepted as a useful tool for biomonitoring. The validation of analytical procedures is absolutely necessary for a proper implementation of non-conventional samples in biomonitoring programs. However, the lack of reliable and useful analytical methodologies to assess exposure to metallic elements, and the potential interference of external contamination and variation in biological features of non-conventional samples, are important limitations for setting health-based reference values. The influence of potential confounding factors on metallic element concentrations should always be considered. More research is needed to ascertain whether or not non-conventional matrices offer definitive advantages over the traditional samples and to broaden the available database for establishing worldwide accepted reference values in non-exposed populations. Copyright © 2015 Elsevier Ltd. All rights reserved.
Recent advances in applications of nanomaterials for sample preparation.
Xu, Linnan; Qi, Xiaoyue; Li, Xianjiang; Bai, Yu; Liu, Huwei
2016-01-01
Sample preparation is a key step for qualitative and quantitative analysis of trace analytes in complicated matrices. Along with the rapid development of nanotechnology in materials science, numerous nanomaterials have been developed with particularly useful applications in analytical chemistry. Benefiting from their high specific surface areas, increased surface activities, and unprecedented physical/chemical properties, the potential of nanomaterials for rapid and efficient sample preparation has been exploited extensively. In this review, recent progress on novel nanomaterials applied in sample preparation is summarized and discussed. Both nanoparticles and nanoporous materials are evaluated for their unusual performance in sample preparation. Various compositions and functionalizations have extended the applications of nanomaterials in sample preparation, and distinct size and shape selectivity arises from the diversified pore structures of nanoporous materials. Such variety makes nanomaterials versatile tools for the sample preparation of almost all categories of analytes. Copyright © 2015 Elsevier B.V. All rights reserved.
Analytical framework and tool kit for SEA follow-up
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran
2009-04-15
Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillen, David S.
Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, in which known patterns can be mined for. With a human in the loop, it can also bring in domain knowledge and subject matter expertise. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose to use the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.
Pilot testing of SHRP 2 reliability data and analytical products: Minnesota.
DOT National Transportation Integrated Search
2015-01-01
The Minnesota pilot site has undertaken an effort to test data and analytical tools developed through the Strategic Highway Research Program (SHRP) 2 Reliability focus area. The purpose of these tools is to facilitate the improvement of travel time r...
Yun Chen; Hui Yang
2014-01-01
The rapid advancements of biomedical instrumentation and healthcare technology have resulted in data-rich environments in hospitals. However, the meaningful information extracted from rich datasets is limited. There is a dire need to go beyond current medical practices and develop data-driven methods and tools that will enable and help (i) the handling of big data, (ii) the extraction of data-driven knowledge, and (iii) the exploitation of acquired knowledge for optimizing clinical decisions. The present study focuses on the prediction of mortality rates in Intensive Care Units (ICU) using patient-specific healthcare recordings. It is worth mentioning that postsurgical monitoring in the ICU leads to massive datasets with unique properties, e.g., variable heterogeneity, patient heterogeneity, and time asynchronization. To cope with the challenges in ICU datasets, we developed a postsurgical decision support system with a series of analytical tools, including data categorization, data pre-processing, feature extraction, feature selection, and predictive modeling. Experimental results show that the proposed data-driven methodology outperforms traditional approaches and yields better results based on the evaluation of real-world ICU data from 4000 subjects in the database. This research shows great potential for the use of data-driven analytics to improve the quality of healthcare services.
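As a rough illustration of the kind of pipeline described (preprocessing, feature selection, predictive modeling), the sketch below uses scikit-learn with synthetic placeholder data; it is not the authors' decision support system and the features are not their ICU variables:

    # Generic sketch of an ICU mortality prediction pipeline: imputation, scaling,
    # feature selection, and predictive modeling (not the authors' exact method).
    import numpy as np
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.impute import SimpleImputer
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(4000, 50))        # placeholder patient-level features
    y = rng.integers(0, 2, size=4000)      # placeholder mortality labels

    model = Pipeline([
        ("impute", SimpleImputer(strategy="median")),
        ("scale", StandardScaler()),
        ("select", SelectKBest(f_classif, k=20)),
        ("clf", LogisticRegression(max_iter=1000)),
    ])
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print("Cross-validated AUC:", auc.mean())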
Cunningham, Virginia L; D'Aco, Vincent J; Pfeiffer, Danielle; Anderson, Paul D; Buzby, Mary E; Hannah, Robert E; Jahnke, James; Parke, Neil J
2012-07-01
This article presents the capability expansion of the PhATE™ (pharmaceutical assessment and transport evaluation) model to predict concentrations of trace organics in sludges and biosolids from municipal wastewater treatment plants (WWTPs). PhATE was originally developed as an empirical model to estimate potential concentrations of active pharmaceutical ingredients (APIs) in US surface and drinking waters that could result from patient use of medicines. However, many compounds, including pharmaceuticals, are not completely transformed in WWTPs and remain in biosolids that may be applied to land as a soil amendment. This practice leads to concerns about potential exposures of people who may come into contact with amended soils and also about potential effects on plants and animals living in or contacting such soils. The model estimates the mass of API in WWTP influent based on the population served, the API per capita use, and the potential loss of the compound associated with human use (e.g., metabolism). The mass of API on the treated biosolids is then estimated based on partitioning to primary and secondary solids, potential loss due to biodegradation in secondary treatment (e.g., activated sludge), and potential loss during sludge treatment (e.g., aerobic digestion, anaerobic digestion, composting). Simulations using 2 surrogate compounds show that predicted environmental concentrations (PECs) generated by PhATE are in very good agreement with measured concentrations, i.e., well within 1 order of magnitude. Model simulations were then carried out for 18 APIs representing a broad range of chemical and use characteristics. These simulations yielded 4 categories of results: 1) PECs are in good agreement with measured data for 9 compounds with high analytical detection frequencies, 2) PECs are greater than measured data for 3 compounds with high analytical detection frequencies, possibly as a result of as yet unidentified depletion mechanisms, 3) PECs are less than analytical reporting limits for 5 compounds with low analytical detection frequencies, and 4) the PEC is greater than the analytical method reporting limit for 1 compound with a low analytical detection frequency, possibly again as a result of insufficient depletion data. Overall, these results demonstrate that PhATE has the potential to be a very useful tool in the evaluation of APIs in biosolids. Possible applications include: prioritizing APIs for assessment even in the absence of analytical methods; evaluating sludge processing scenarios to explore potential mitigation approaches; use in risk assessments; and developing realistic nationwide concentrations, because PECs can be represented as a cumulative probability distribution. Finally, comparison of PECs to measured concentrations can also be used to identify the need for fate studies of compounds of interest in biosolids. Copyright © 2011 SETAC.
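The influent and biosolids mass estimates described above follow a simple mass-balance logic; the sketch below illustrates that logic with entirely made-up parameter values (these are not PhATE's actual equations, partition coefficients, or loss fractions):

    # Back-of-envelope sketch of the mass-balance logic described for PhATE
    # (illustrative numbers only; not the model's actual parameters or equations).
    population = 100_000            # people served by the WWTP
    per_capita_use_mg_d = 1.5       # API use per person per day (mg)
    f_metabolized = 0.60            # fraction lost to human metabolism
    f_to_solids = 0.30              # fraction of influent mass partitioning to sludge solids
    f_biodegraded = 0.40            # fraction lost in secondary treatment
    f_sludge_treatment_loss = 0.20  # fraction lost during sludge digestion/composting

    influent_mg_d = population * per_capita_use_mg_d * (1 - f_metabolized)
    biosolids_mg_d = (influent_mg_d * f_to_solids
                      * (1 - f_biodegraded) * (1 - f_sludge_treatment_loss))

    sludge_production_kg_d = 2_500  # dry biosolids produced per day (kg)
    pec_mg_per_kg = biosolids_mg_d / sludge_production_kg_d
    print(f"Illustrative biosolids concentration ~ {pec_mg_per_kg:.3f} mg/kg dry weight")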
PLINK: A Tool Set for Whole-Genome Association and Population-Based Linkage Analyses
Purcell, Shaun; Neale, Benjamin; Todd-Brown, Kathe; Thomas, Lori; Ferreira, Manuel A. R.; Bender, David; Maller, Julian; Sklar, Pamela; de Bakker, Paul I. W.; Daly, Mark J.; Sham, Pak C.
2007-01-01
Whole-genome association studies (WGAS) bring new computational, as well as analytic, challenges to researchers. Many existing genetic-analysis tools are not designed to handle such large data sets in a convenient manner and do not necessarily exploit the new opportunities that whole-genome data bring. To address these issues, we developed PLINK, an open-source C/C++ WGAS tool set. With PLINK, large data sets comprising hundreds of thousands of markers genotyped for thousands of individuals can be rapidly manipulated and analyzed in their entirety. As well as providing tools to make the basic analytic steps computationally efficient, PLINK also supports some novel approaches to whole-genome data that take advantage of whole-genome coverage. We introduce PLINK and describe the five main domains of function: data management, summary statistics, population stratification, association analysis, and identity-by-descent estimation. In particular, we focus on the estimation and use of identity-by-state and identity-by-descent information in the context of population-based whole-genome studies. This information can be used to detect and correct for population stratification and to identify extended chromosomal segments that are shared identical by descent between very distantly related individuals. Analysis of the patterns of segmental sharing has the potential to map disease loci that contain multiple rare variants in a population-based linkage analysis. PMID:17701901
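As a conceptual illustration of the identity-by-state (IBS) sharing that underlies these analyses, the sketch below computes pairwise IBS proportions from a small random genotype matrix; this is not PLINK's implementation, and the data are placeholders:

    # Illustrative pairwise identity-by-state (IBS) sharing from a genotype matrix
    # coded 0/1/2 (minor-allele counts); conceptual sketch only, not PLINK's code.
    import numpy as np

    rng = np.random.default_rng(1)
    genotypes = rng.integers(0, 3, size=(6, 1000))   # 6 individuals x 1000 SNPs

    def ibs_proportion(gi, gj):
        # Each biallelic SNP contributes 2, 1, or 0 shared alleles.
        shared = 2 - np.abs(gi - gj)
        return shared.sum() / (2 * len(gi))

    n = genotypes.shape[0]
    ibs = np.array([[ibs_proportion(genotypes[i], genotypes[j]) for j in range(n)]
                    for i in range(n)])
    print(np.round(ibs, 3))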
Genomics Portals: integrative web-platform for mining genomics data.
Shinde, Kaustubh; Phatak, Mukta; Freudenberg, Johannes M; Chen, Jing; Li, Qian; Joshi, Vineet K; Hu, Zhen; Ghosh, Krishnendu; Meller, Jaroslaw; Medvedovic, Mario
2010-01-13
A large amount of experimental data generated by modern high-throughput technologies is available through various public repositories. Our knowledge about molecular interaction networks, functional biological pathways and transcriptional regulatory modules is rapidly expanding, and is being organized in lists of functionally related genes. Jointly, these two sources of information hold a tremendous potential for gaining new insights into functioning of living systems. Genomics Portals platform integrates access to an extensive knowledge base and a large database of human, mouse, and rat genomics data with basic analytical visualization tools. It provides the context for analyzing and interpreting new experimental data and the tool for effective mining of a large number of publicly available genomics datasets stored in the back-end databases. The uniqueness of this platform lies in the volume and the diversity of genomics data that can be accessed and analyzed (gene expression, ChIP-chip, ChIP-seq, epigenomics, computationally predicted binding sites, etc), and the integration with an extensive knowledge base that can be used in such analysis. The integrated access to primary genomics data, functional knowledge and analytical tools makes Genomics Portals platform a unique tool for interpreting results of new genomics experiments and for mining the vast amount of data stored in the Genomics Portals backend databases. Genomics Portals can be accessed and used freely at http://GenomicsPortals.org.
Genomics Portals: integrative web-platform for mining genomics data
2010-01-01
Background A large amount of experimental data generated by modern high-throughput technologies is available through various public repositories. Our knowledge about molecular interaction networks, functional biological pathways and transcriptional regulatory modules is rapidly expanding, and is being organized in lists of functionally related genes. Jointly, these two sources of information hold a tremendous potential for gaining new insights into functioning of living systems. Results Genomics Portals platform integrates access to an extensive knowledge base and a large database of human, mouse, and rat genomics data with basic analytical visualization tools. It provides the context for analyzing and interpreting new experimental data and the tool for effective mining of a large number of publicly available genomics datasets stored in the back-end databases. The uniqueness of this platform lies in the volume and the diversity of genomics data that can be accessed and analyzed (gene expression, ChIP-chip, ChIP-seq, epigenomics, computationally predicted binding sites, etc), and the integration with an extensive knowledge base that can be used in such analysis. Conclusion The integrated access to primary genomics data, functional knowledge and analytical tools makes Genomics Portals platform a unique tool for interpreting results of new genomics experiments and for mining the vast amount of data stored in the Genomics Portals backend databases. Genomics Portals can be accessed and used freely at http://GenomicsPortals.org. PMID:20070909
Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation
FPI: FM Success through Analytics
ERIC Educational Resources Information Center
Hickling, Duane
2013-01-01
The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…
Experiments with Analytic Centers: A confluence of data, tools and help in using them.
NASA Astrophysics Data System (ADS)
Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.
2017-12-01
Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged in which different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of a particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.
Waaijer, Cathelijn J F; Palmblad, Magnus
2015-01-01
In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.
Laboratories measuring target pathogen analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select pathogens.
The role of analytical chemistry in Niger Delta petroleum exploration: a review.
Akinlua, Akinsehinwa
2012-06-12
Petroleum and the organic matter from which it is derived are composed of organic compounds together with some trace elements. These compounds give insight into the origin, thermal maturity, and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Geochemical data are acquired mainly by analytical techniques. Owing to progress in the development of new analytical techniques, many long-standing petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed, as are the analytical techniques that have helped to understand the petroleum system of the basin. Recent and emerging analytical methodologies, including green analytical methods applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are also discussed. Analytical chemistry is an invaluable tool in finding Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.
Using Google Analytics to evaluate the impact of the CyberTraining project.
McGuckin, Conor; Crowley, Niall
2012-11-01
A focus on results and impact should be at the heart of every project's approach to research and dissemination. This article discusses the potential of Google Analytics (GA: http://google.com/analytics ) as an effective resource for measuring the impact of academic research output and understanding the geodemographics of users of specific Web 2.0 content (e.g., intervention and prevention materials, health promotion and advice). This article presents the results of GA analyses as a resource used in measuring the impact of the EU-funded CyberTraining project, which provided a well-grounded, research-based training manual on cyberbullying for trainers through the medium of a Web-based eBook ( www.cybertraining-project.org ). The training manual includes review information on cyberbullying, its nature and extent across Europe, analyses of current projects, and provides resources for trainers working with the target groups of pupils, parents, teachers, and other professionals. Results illustrate the promise of GA as an effective tool for measuring the impact of academic research and project output with real potential for tracking and understanding intra- and intercountry regional variations in the uptake of prevention and intervention materials, thus enabling precision focusing of attention to those regions.
Development of computer-based analytical tool for assessing physical protection system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mardhi, Alim, E-mail: alim-m@batan.go.id; Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330; Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com
Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are available for immediate use, but for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.
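A highly simplified sketch of the path-evaluation logic such tools embody is given below: each protection layer contributes a detection probability and a delay, and a detection only helps if enough delay remains for the response force to arrive. The layer values and response time are placeholders, and this is not the authors' algorithm or the EASI/SAPE models:

    # Simplified evaluation of one adversary path through protection layers
    # (illustrative logic only; values are placeholders).
    layers = [  # (name, detection probability, delay in seconds)
        ("fence", 0.5, 30),
        ("door", 0.7, 60),
        ("vault", 0.9, 300),
    ]
    response_time = 240  # seconds for responders to arrive (placeholder)

    def prob_interruption(path):
        p_no_detection_yet = 1.0
        remaining_delay = sum(delay for _, _, delay in path)
        p_int = 0.0
        for name, p_d, delay in path:
            if remaining_delay >= response_time:       # detection here is still timely
                p_int += p_no_detection_yet * p_d
            p_no_detection_yet *= (1 - p_d)
            remaining_delay -= delay
        return p_int

    print(f"Probability of interruption along path: {prob_interruption(layers):.2f}")

Repeating this over all paths in the network and taking the minimum identifies the most critical (weakest) path, which is the quantity the abstract describes as the system performance measure.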
Control/structure interaction conceptual design tool
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
1990-01-01
The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.
Development of computer-based analytical tool for assessing physical protection system
NASA Astrophysics Data System (ADS)
Mardhi, Alim; Pengvanich, Phongphaeth
2016-01-01
Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are available for immediate use, but for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.
Active controls: A look at analytical methods and associated tools
NASA Technical Reports Server (NTRS)
Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.
1984-01-01
A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.
Laboratories measuring target biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select biotoxins.
Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery
Using Learning Analytics to Support Engagement in Collaborative Writing
ERIC Educational Resources Information Center
Liu, Ming; Pardo, Abelardo; Liu, Li
2017-01-01
Online collaborative writing tools provide an efficient way to complete a writing task. However, existing tools only focus on technological affordances and ignore the importance of social affordances in a collaborative learning environment. This article describes a learning analytic system that analyzes writing behaviors, and creates…
Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis
ERIC Educational Resources Information Center
Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay
2018-01-01
Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…
From Ambiguities to Insights: Query-based Comparisons of High-Dimensional Data
NASA Astrophysics Data System (ADS)
Kowalski, Jeanne; Talbot, Conover; Tsai, Hua L.; Prasad, Nijaguna; Umbricht, Christopher; Zeiger, Martha A.
2007-11-01
Genomic technologies will revolutionize drug discovery and development; that much is universally agreed upon. The high dimension of data from such technologies has challenged available data analytic methods; that much is apparent. To date, large-scale data repositories have not been utilized in ways that permit their wealth of information to be efficiently processed for knowledge, presumably due in large part to inadequate analytical tools for addressing numerous comparisons of high-dimensional data. In candidate gene discovery, expression comparisons are often made between two features (e.g., cancerous versus normal), such that the enumeration of outcomes is manageable. With multiple features, the setting becomes more complex, in terms of comparing expression levels of tens of thousands of transcripts across hundreds of features. In this case, the number of outcomes, while enumerable, rapidly becomes large and unmanageable, and scientific inquiries become more abstract, such as "which one of these (compounds, stimuli, etc.) is not like the others?" We develop analytical tools that promote more extensive, efficient, and rigorous utilization of the public data resources generated by the massive support of genomic studies. Our work innovates by enabling access to such metadata with logically formulated scientific inquiries that define, compare, and integrate query-comparison pair relations for analysis. We demonstrate our computational tool's potential to address an outstanding biomedical informatics issue: identifying reliable molecular markers in thyroid cancer. Our proposed query-based comparison (QBC) facilitates access to and efficient utilization of metadata through logically formed inquiries expressed as query-based comparisons, by organizing and comparing results from biotechnologies to address applications in biomedicine.
NASA Astrophysics Data System (ADS)
Martin, Michael C.; Holman, Hoi-Ying N.; Blakely, Eleanor A.; Goth-Goldstein, Regine; McKinney, Wayne R.
2000-03-01
Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining or labeling, SR FTIR microscopy probes intact living cells providing a composite view of all of the molecular responses and the ability to monitor the responses over time in the same cell. Observed spectral changes include all types of lesions induced in that cell as well as cellular responses to external and internal stresses. These spectral changes combined with other analytical tools may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low-doses of radiation and chemicals. In this study we used high spatial-resolution SR FTIR vibrational spectromicroscopy at ALS Beamline 1.4.3 as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in nucleic acids and proteins of cells treated with environmentally relevant concentrations of oxidative stresses: bleomycin, hydrogen peroxide, and X-rays. We observe spectral changes that are unique to each exogenous stressor. This technique has the potential to distinguish changes from exogenous or endogenous oxidative processes. Future development of this technique will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, bio-compatibility of implant materials, cellular repair mechanisms, self assembly of cellular apparatus, cell differentiation and fetal development.
Cryogenic Propellant Feed System Analytical Tool Development
NASA Technical Reports Server (NTRS)
Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.
2011-01-01
The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its number-crunching power and its ability to directly access real-fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front end was implemented to provide convenient portability of PFSAT among a wide variety of potential users and to offer a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
Streamflow variability and optimal capacity of run-of-river hydropower plants
NASA Astrophysics Data System (ADS)
Basso, S.; Botter, G.
2012-10-01
The identification of the capacity of a run-of-river plant which allows for the optimal utilization of the available water resources is a challenging task, mainly because of the inherent temporal variability of river flows. This paper proposes an analytical framework to describe the energy production and the economic profitability of small run-of-river power plants on the basis of the underlying streamflow regime. We provide analytical expressions for the capacity which maximize the produced energy as a function of the underlying flow duration curve and minimum environmental flow requirements downstream of the plant intake. Similar analytical expressions are derived for the capacity which maximize the economic return deriving from construction and operation of a new plant. The analytical approach is applied to a minihydro plant recently proposed in a small Alpine catchment in northeastern Italy, evidencing the potential of the method as a flexible and simple design tool for practical application. The analytical model provides useful insight on the major hydrologic and economic controls (e.g., streamflow variability, energy price, costs) on the optimal plant capacity and helps in identifying policy strategies to reduce the current gap between the economic and energy optimizations of run-of-river plants.
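A numerical analogue of the trade-off treated analytically in the paper can be sketched as follows: given a flow duration curve, a minimum environmental flow, and a minimum technical flow for turbine operation, the energy produced is evaluated over candidate capacities. All curves and parameter values below are illustrative, not the paper's expressions or the Alpine case-study data:

    # Energy produced vs. plant capacity from a synthetic flow duration curve
    # (illustrative placeholders; not the paper's analytical expressions).
    import numpy as np

    p = np.linspace(0.001, 0.999, 1000)          # exceedance probabilities
    fdc = 10.0 * np.exp(-3.0 * p)                # synthetic flow duration curve (m^3/s)
    mef = 0.5                                    # minimum environmental flow (m^3/s)
    min_tech_frac = 0.2                          # turbines stop below 20% of capacity

    def annual_energy(capacity, head=50.0, eff=0.85, rho_g=9810.0):
        usable = np.clip(fdc - mef, 0.0, capacity)
        usable[usable < min_tech_frac * capacity] = 0.0
        mean_power_w = eff * rho_g * head * usable.mean()
        return mean_power_w * 8760 / 1e6         # MWh per year

    capacities = np.linspace(0.5, 8.0, 200)
    energies = [annual_energy(c) for c in capacities]
    best = capacities[int(np.argmax(energies))]
    print(f"Energy-maximizing capacity (illustrative): {best:.2f} m^3/s")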
ERIC Educational Resources Information Center
Kelly, Nick; Thompson, Kate; Yeoman, Pippa
2015-01-01
This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…
Kazmierczak, Steven C; Leen, Todd K; Erdogmus, Deniz; Carreira-Perpinan, Miguel A
2007-01-01
The clinical laboratory generates large amounts of patient-specific data. Detection of errors that arise during pre-analytical, analytical, and post-analytical processes is difficult. We performed a pilot study, utilizing a multidimensional data reduction technique, to assess the utility of this method for identifying errors in laboratory data. We evaluated 13,670 individual patient records collected over a 2-month period from hospital inpatients and outpatients. We utilized those patient records that contained a complete set of 14 different biochemical analytes. We used two-dimensional generative topographic mapping to project the 14-dimensional record to a two-dimensional space. The use of a two-dimensional generative topographic mapping technique to plot multi-analyte patient data as a two-dimensional graph allows for the rapid identification of potentially anomalous data. Although we performed a retrospective analysis, this technique has the benefit of being able to assess laboratory-generated data in real time, allowing for the rapid identification and correction of anomalous data before they are released to the physician. In addition, serial laboratory multi-analyte data for an individual patient can also be plotted as a two-dimensional plot. This tool might also be useful for assessing patient wellbeing and prognosis.
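The screening idea can be illustrated with a simple two-dimensional projection of multi-analyte records; the sketch below uses PCA as a stand-in for the generative topographic mapping the authors applied, on synthetic placeholder data:

    # Conceptual sketch: project 14-analyte laboratory records to two dimensions
    # and flag records far from the bulk of the map (PCA as a stand-in for GTM).
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(2)
    records = rng.normal(size=(5000, 14))        # 14 biochemical analytes per record
    records[:10] += 6.0                          # a few grossly shifted (anomalous) records

    embedded = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(records))
    dist = np.linalg.norm(embedded - embedded.mean(axis=0), axis=1)
    suspect = np.argsort(dist)[-10:]             # records farthest from the bulk
    print("Records flagged for review:", sorted(suspect.tolist()))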
NASA Technical Reports Server (NTRS)
Barber, Peter W.; Demerdash, Nabeel A. O.; Hurysz, B.; Luo, Z.; Denny, Hugh W.; Millard, David P.; Herkert, R.; Wang, R.
1992-01-01
The goal of this research project was to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) developing analytical tools (models and computer programs); (2) conducting parameterization (what if?) studies; (3) predicting the global space station EMI environment; and (4) providing a basis for modification of EMI standards.
Barbin, Douglas Fernandes; Valous, Nektarios A; Dias, Adriana Passos; Camisa, Jaqueline; Hirooka, Elisa Yoko; Yamashita, Fabio
2015-11-01
There is an increasing interest in the use of polysaccharides and proteins for the production of biodegradable films. Visible and near-infrared (VIS-NIR) spectroscopy is a reliable analytical tool for objective analyses of biological sample attributes. The objective is to investigate the potential of VIS-NIR spectroscopy as a process analytical technology for compositional characterization of biodegradable materials and correlation to their mechanical properties. Biofilms were produced by single-screw extrusion with different combinations of polybutylene adipate-co-terephthalate, whole oat flour, glycerol, magnesium stearate, and citric acid. Spectral data were recorded in the range of 400-2498 nm at 2 nm intervals. Partial least squares regression was used to investigate the correlation between spectral information and mechanical properties. Results show that spectral information is influenced by the major constituent components, as samples cluster according to polybutylene adipate-co-terephthalate content. Regression models using the spectral information as predictors of tensile properties achieved satisfactory results, with calibration coefficients of determination (R²C) of 0.83, 0.88, and 0.92 for elongation, tensile strength, and Young's modulus, respectively. The results corroborate the correlation of NIR spectra with tensile properties, showing that NIR spectroscopy has potential as a rapid analytical technology for non-destructive assessment of the mechanical properties of the films. Copyright © 2015 Elsevier B.V. All rights reserved.
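The chemometric step described, relating spectra to a tensile property by partial least squares regression, can be sketched as follows with synthetic spectra (the data and the number of latent variables are placeholders, not the study's measurements or model settings):

    # Partial least squares regression of synthetic VIS-NIR spectra against a
    # tensile property (illustrative only; not the study's data or calibration).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(3)
    n_samples, n_wavelengths = 120, 1050          # e.g. 400-2498 nm at 2 nm steps
    spectra = rng.normal(size=(n_samples, n_wavelengths))
    tensile = spectra[:, 100] * 2.0 + spectra[:, 800] + rng.normal(scale=0.1, size=n_samples)

    X_tr, X_te, y_tr, y_te = train_test_split(spectra, tensile, random_state=0)
    pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
    print("R2 on held-out samples:", round(r2_score(y_te, pls.predict(X_te).ravel()), 3))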
[Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].
Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Leiz; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou
2014-08-01
In the present paper, the apparatus and theory of surface analysis are introduced, and progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous, and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools, such as SEM/EDS (scanning electron microscopy/energy-dispersive spectroscopy), EPMA (electron probe microanalysis), and AES (Auger electron spectroscopy), its advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis, and high sensitivity. It is now a powerful complementary method to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analytical workers will use this powerful tool, and LA-ICP-MS may become as prominent in elemental analysis as LIBS (laser-induced breakdown spectroscopy).
Affinity-based biosensors as promising tools for gene doping detection.
Minunni, Maria; Scarano, Simona; Mascini, Marco
2008-05-01
Innovative bioanalytical approaches can be foreseen as interesting means for solving relevant emerging problems in anti-doping control. Sport authorities fear that the newer form of doping, so-called gene doping, based on a misuse of gene therapy, will be undetectable and thus much less preventable. The World Anti-Doping Agency has already asked scientists to assist in finding ways to prevent and detect this newest kind of doping. In this Opinion article we discuss the main aspects of gene doping, from the putative target analytes to suitable sampling strategies. Moreover, we discuss the potential application of affinity sensing in this field, which so far has been successfully applied to a variety of analytical problems, from clinical diagnostics to food and environmental analysis.
Agüera, Ana; Martínez Bueno, María Jesús; Fernández-Alba, Amadeo R
2013-06-01
Since the so-called emerging contaminants were established as a new group of pollutants of environmental concern, a great effort has been devoted to understanding their distribution, fate, and effects in the environment. After more than 20 years of work, knowledge about these contaminants has improved significantly, but there is still a large gap of information on the growing number of new potential contaminants that are appearing, and especially on their unpredictable transformation products. Although the environmental problem arising from emerging contaminants must be addressed from an interdisciplinary point of view, it is obvious that analytical chemistry plays an important role as the first step of the study, as it allows establishing the presence of chemicals in the environment, estimating their concentration levels, identifying sources, and determining their degradation pathways. These tasks involve serious difficulties requiring different analytical solutions adjusted to purpose. Thus, the complexity of the matrices requires highly selective analytical methods; the large number and variety of compounds potentially present in the samples demand the application of wide-scope methods; the low concentrations at which these contaminants are present in the samples require high detection sensitivity; and high demands on confirmation and structural information are needed for the characterisation of unknowns. New developments in analytical instrumentation have been applied to solve these difficulties. Furthermore, and not less important, has been the development of new specific software packages intended for data acquisition and, in particular, for post-run analysis. Thus, the use of sophisticated software tools has allowed successful screening analyses, determining several hundred analytes, and has assisted in the structural elucidation of unknown compounds in a timely manner.
NASA Astrophysics Data System (ADS)
Tan, Yimin; Lin, Kejian; Zu, Jean W.
2018-05-01
The Halbach permanent magnet (PM) array has attracted tremendous research attention in the development of electromagnetic generators because of its unique properties. This paper proposes a generalized analytical model for linear generators in which slotted stator pole-shifting and the implementation of a Halbach array are combined for the first time. Initially, the magnetization components of the Halbach array are determined using Fourier decomposition. Then, based on the magnetic scalar potential method, the magnetic field distribution is derived employing specially treated boundary conditions. FEM analysis has been conducted to verify the analytical model. A slotted linear PM generator with a Halbach PM array has been constructed to validate the model and further improved using piece-wise springs to trigger full-range reciprocating motion. A dynamic model has been developed to characterize the dynamic behavior of the slider. This analytical method provides an effective tool for the development and optimization of Halbach PM generators. The experimental results indicate that piece-wise springs can be employed to improve generator performance under low excitation frequency.
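The first step mentioned above, Fourier decomposition of the array magnetization, can be illustrated numerically for a generic four-segment-per-period Halbach pattern; the pitch and segment layout below are placeholders, not the paper's array geometry or its boundary-value solution:

    # Numerical Fourier decomposition of a generic segmented Halbach magnetization
    # over one spatial period (illustrative geometry, not the paper's model).
    import numpy as np

    pole_pitch = 0.04                         # spatial period tau (m), placeholder
    x = np.linspace(0.0, pole_pitch, 4000, endpoint=False)

    # Four segments per period: magnetization direction rotates 90 degrees per
    # segment; M_y is the normalized component normal to the stator surface.
    segment = np.floor(4 * x / pole_pitch).astype(int)
    M_y = np.select([segment == 0, segment == 1, segment == 2, segment == 3],
                    [1.0, 0.0, -1.0, 0.0])

    def fourier_coeffs(f, n):
        a = 2 * np.trapz(f * np.cos(2 * np.pi * n * x / pole_pitch), x) / pole_pitch
        b = 2 * np.trapz(f * np.sin(2 * np.pi * n * x / pole_pitch), x) / pole_pitch
        return a, b

    for n in (1, 3, 5):
        print(n, np.round(fourier_coeffs(M_y, n), 3))

The resulting harmonic amplitudes are the quantities that would feed a scalar-potential field solution of the kind the abstract describes.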
Micro-electromechanical sensors in the analytical field.
Zougagh, Mohammed; Ríos, Angel
2009-07-01
Micro- and nano-electromechanical systems (MEMS and NEMS) for use as sensors represent one of the most exciting new fields in analytical chemistry today. These systems are advantageous over currently available non-miniaturized sensors, such as quartz crystal microbalances, thickness shear mode resonators, and flexural plate wave oscillators, because of their high sensitivity, low cost and easy integration into automated systems. In this article, we present and discuss the evolution in the use of MEMS and NEMS, which are basically cantilever-type sensors, as good analytical tools for a wide variety of applications. We discuss the analytical features and the practical potential of micro(nano)-cantilever sensors, which combine the synergetic advantages of selectivity, provided by their functionalization, and the high sensitivity, which is attributed largely to the extremely small size of the sensing element. An insight is given into the different types of functionalization and detection strategies and a critical discussion is presented on the existing state of the art concerning the applications reported for mechanical microsensors. New developments and the possibilities for routine work in the near future are also covered.
Mischak, Harald; Vlahou, Antonia; Ioannidis, John P A
2013-04-01
Mass spectrometry platforms have attracted a lot of interest in the last 2 decades as profiling tools for native peptides and proteins with clinical potential. However, limitations associated with reproducibility and analytical robustness, especially pronounced with the initial SELDI systems, hindered the application of such platforms in biomarker qualification and clinical implementation. The scope of this article is to give a short overview on data available on performance and on analytical robustness of the different platforms for peptide profiling. Using the CE-MS platform as a paradigm, data on analytical performance are described including reproducibility (short-term and intermediate repeatability), stability, interference, quantification capabilities (limits of detection), and inter-laboratory variability. We discuss these issues by using as an example our experience with the development of a 273-peptide marker for chronic kidney disease. Finally, we discuss pros and cons and means for improvement and emphasize the need to test in terms of comparative clinical performance and impact, different platforms that pass reasonably well analytical validation tests. Copyright © 2012 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Gonzalez-Dominguez, Alvaro; Duran-Guerrero, Enrique; Fernandez-Recamales, Angeles; Lechuga-Sancho, Alfonso Maria; Sayago, Ana; Schwarz, Monica; Segundo, Carmen; Gonzalez-Dominguez, Raul
2017-01-01
The analytical bias introduced by most of the commonly used techniques in metabolomics considerably hinders the simultaneous detection of all metabolites present in complex biological samples. In order to overcome this limitation, the combination of complementary approaches has emerged in recent years as the most suitable strategy for maximizing metabolite coverage. This review article presents a general overview of the most important analytical techniques usually employed in metabolomics: nuclear magnetic resonance, mass spectrometry, and hybrid approaches. Furthermore, we emphasize the potential of integrating various tools in the form of metabolomic multi-platforms in order to obtain a deeper metabolome characterization, and we provide a revision of the existing literature in this field. This review is not intended to be exhaustive but, rather, to give readers not familiar with analytical chemistry a practical and concise guide to the considerations involved in properly selecting the technique to be used in a metabolomic experiment in biomedical research. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.
Rasheed, Waqas; Neoh, Yee Yik; Bin Hamid, Nor Hisham; Reza, Faruque; Idris, Zamzuri; Tang, Tong Boon
2017-10-01
Functional neuroimaging modalities play an important role in deciding the diagnosis and course of treatment of neuronal dysfunction and degeneration. This article presents an analytical tool with visualization that exploits the strengths of the MEG (magnetoencephalographic) neuroimaging technique. The tool automates MEG data import (in tSSS format), channel information extraction, time/frequency decomposition, and circular graph visualization (connectogram) for simple result inspection. For advanced users, the tool also provides magnitude squared coherence (MSC) values, allowing personalized threshold levels, and the computation of a default model from MEG data of a control population. The default model obtained from healthy population data serves as a useful benchmark to diagnose and monitor neuronal recovery during treatment. The proposed tool further provides optional labels with international 10-10 system nomenclature in order to facilitate comparison studies with EEG (electroencephalography) sensor space. Potential applications in epilepsy and traumatic brain injury studies are also discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
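The connectivity measure the tool exposes, magnitude squared coherence between channel pairs, can be computed with standard signal-processing routines; the sketch below uses synthetic signals rather than MEG recordings, and the frequency band and sampling rate are placeholders:

    # Magnitude squared coherence between two synthetic channels sharing a 10 Hz
    # rhythm (illustrative; not the authors' MEG pipeline or data).
    import numpy as np
    from scipy.signal import coherence

    fs = 1000.0                                  # sampling rate (Hz), placeholder
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(4)
    common = np.sin(2 * np.pi * 10 * t)          # shared 10 Hz rhythm
    ch1 = common + 0.5 * rng.normal(size=t.size)
    ch2 = common + 0.5 * rng.normal(size=t.size)

    freqs, msc = coherence(ch1, ch2, fs=fs, nperseg=1024)
    band = (freqs >= 8) & (freqs <= 12)
    print("Mean alpha-band MSC:", round(float(msc[band].mean()), 3))
    # Thresholding MSC values across all channel pairs yields the edges drawn
    # in a connectogram-style circular graph.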
Hamed, Maged; Effat, Waleed
2007-08-01
Urban transportation projects are essential to increasing the efficiency of moving people and goods within a city and between cities. Environmental impacts from such projects must be evaluated and mitigated, as applicable. Spatial modeling is a valuable tool for quantifying the potential level of environmental consequences within the context of an environmental impact assessment (EIA) study. This paper presents a GIS-based tool for the assessment of airborne noise and ground-borne vibration from public transit systems, and its application to an actual project. The tool is based on the US Federal Transit Administration's (FTA) approach, and incorporates spatial information, satellite imaging, geostatistical modeling, and software programming. The tool is applied to a case study of the initial environmental evaluation of a light rail transit project in an urban city in the Middle East, to evaluate alternative layouts. The tool readily allowed the alternative evaluation, and the results were used as input to a multi-criteria analytic framework.
Translating statistical species-habitat models to interactive decision support tools
Wszola, Lyndsie S.; Simonsen, Victoria L.; Stuber, Erica F.; Gillespie, Caitlyn R.; Messinger, Lindsey N.; Decker, Karie L.; Lusk, Jeffrey J.; Jorgensen, Christopher F.; Bishop, Andrew A.; Fontaine, Joseph J.
2017-01-01
Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences.
Eichbaum, Kathrin; Brinkmann, Markus; Buchinger, Sebastian; Reifferscheid, Georg; Hecker, Markus; Giesy, John P; Engwall, Magnus; van Bavel, Bert; Hollert, Henner
2014-07-15
The use of in vitro assays as screening tools to characterize contamination of a variety of environmental matrices has become an increasingly popular and powerful approach in the field of environmental toxicology. While bioassays cannot entirely substitute for analytical methods such as gas chromatography-mass spectrometry (GC-MS), the continuing improvement of cell lines and standardization of bioassay procedures enhance their utility as bioanalytical pre-screening tests prior to more targeted chemical analytical investigations. Dioxin-receptor-based assays provide a holistic characterization of exposure to dioxin-like compounds (DLCs) by integrating their overall toxic potential, including the potential of unknown DLCs not detectable via, e.g., GC-MS. Hence, they provide important additional information with respect to the environmental risk assessment of DLCs. This review summarizes different in vitro bioassay applications for the detection of DLCs and considers the comparability of bioassay-derived and chemical-analytically derived toxicity equivalents (TEQs) across different approaches and various matrices, ranging from complex samples such as sediments through single reference compounds to compound mixtures. A summary of bioassay-derived detection limits (LODs) showed a number of current bioassays to be as sensitive as chemical methodologies, but also revealed that most of the bioanalytical studies conducted to date did not report their LODs, which represents a limitation with regard to low-potency samples. Copyright © 2014 Elsevier B.V. All rights reserved.
Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin
2014-03-01
A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed by a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R² and Q². The accuracy and diagnostic capability of the batch model were then validated with the remaining batches. Assisted by high performance liquid chromatography (HPLC) determination, process faults were explained by the corresponding variable contributions. Furthermore, a batch-level model was developed to compare and assess the model performance. The present study has demonstrated that DART-MS is very promising for process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers a direct account of the effective constituents and can potentially be used to improve batch quality and process consistency for samples in complex matrices. Copyright © 2014 Elsevier B.V. All rights reserved.
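The kind of batch monitoring described above can be sketched with a much simpler latent-variable control chart. The sketch below uses PCA scores and a Hotelling T² statistic as a stand-in for the study's multi-way PLS model; the fingerprint data, the number of components, and the control limit are all synthetic assumptions, not the authors' pipeline.

```python
# Simplified stand-in for an MPLS batch model: flag abnormal batches with a
# Hotelling T^2 statistic computed on PCA scores of normal-operation fingerprints.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X_normal = rng.normal(size=(8, 200))          # 8 normal batches x 200 spectral variables (synthetic)

pca = PCA(n_components=3).fit(X_normal)

def hotelling_t2(X):
    scores = pca.transform(X)
    return np.sum(scores**2 / pca.explained_variance_, axis=1)

limit = np.percentile(hotelling_t2(X_normal), 95)   # crude empirical control limit

X_test = rng.normal(size=(5, 200))                  # 5 new batches, the last two deliberately shifted
X_test[3:] += 20 * pca.components_[0]               # simulate a fault along the dominant direction
print("batches flagged as faulty:", np.where(hotelling_t2(X_test) > limit)[0])
```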
Concurrence of big data analytics and healthcare: A systematic review.
Mehta, Nishita; Pandit, Anil
2018-06-01
The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of the literature aims to determine the scope of Big Data analytics in healthcare, including its applications and the challenges in its adoption in healthcare. It also intends to identify strategies to overcome the challenges. A systematic search of the articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore and Taylor & Francis. Articles on Big Data analytics in healthcare published in the English language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; and challenges and strategies to overcome the challenges in healthcare. A total of 58 articles were selected as per the inclusion criteria and analyzed. The analyses of these articles found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as external sources, including government, laboratories, pharmaceutical companies, data aggregators, and medical journals; (3) natural language processing (NLP) is the most widely used Big Data analytical technique for healthcare, and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds application in clinical decision support, optimization of clinical operations, and reduction of the cost of care; and (5) the major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review unveils a paucity of information on evidence of real-world use of Big Data analytics in healthcare. This is because the usability studies have considered only a qualitative approach, which describes potential benefits but does not include quantitative study. Also, the majority of the studies were from developed countries, which highlights the need to promote research on healthcare Big Data analytics in developing countries. Copyright © 2018 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ackermann, Mark R.; Hayden, Nancy Kay; Backus, George A.
Most national policy decisions are complex, with a variety of stakeholders, disparate interests, and the potential for unintended consequences. While a number of analytical tools exist to help decision makers sort through the mountains of data and myriad of options, decision support teams are increasingly turning to complexity science for improved analysis and better insight into the potential impact of policy decisions. While complexity science has great potential, it has only proven useful in limited cases and when properly applied. In advance of more widespread use, a national-level effort to refine complexity science and more rigorously establish its technical underpinnings is recommended.
Cullis, B R; Smith, A B; Beeck, C P; Cowling, W A
2010-11-01
Exploring and exploiting variety by environment (V × E) interaction is one of the major challenges facing plant breeders. In paper I of this series, we presented an approach to modelling V × E interaction in the analysis of complex multi-environment trials using factor analytic models. In this paper, we develop a range of statistical tools which explore V × E interaction in this context. These tools include graphical displays such as heat-maps of genetic correlation matrices as well as so-called E-scaled uniplots that are a more informative alternative to the classical biplot for large plant breeding multi-environment trials. We also present a new approach to prediction for multi-environment trials that include pedigree information. This approach allows meaningful selection indices to be formed either for potential new varieties or potential parents.
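As a rough illustration of the heat-map displays mentioned above, the sketch below builds a between-environment genetic correlation matrix from a factor analytic model and plots it. The loadings and specific variances are invented for the example; this is not the authors' software or data.

```python
# Illustrative sketch: genetic correlation heat-map from a factor analytic model,
# G = Lambda Lambda' + diag(psi), scaled to a correlation matrix.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
n_env, n_factors = 12, 2
Lam = rng.normal(size=(n_env, n_factors))        # hypothetical factor loadings per environment
psi = rng.uniform(0.1, 0.5, size=n_env)          # hypothetical specific variances

G = Lam @ Lam.T + np.diag(psi)                   # genetic covariance between environments
d = 1.0 / np.sqrt(np.diag(G))
corr = d[:, None] * G * d[None, :]               # genetic correlation matrix

plt.imshow(corr, vmin=-1, vmax=1, cmap="RdBu_r")
plt.colorbar(label="genetic correlation")
plt.title("Between-environment genetic correlations (synthetic)")
plt.show()
```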
UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.
Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel
2013-09-01
In this short communication, UV-Vis spectroscopy was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making it a time-saving analytical method that could be used as a Process Analytical Technology (PAT) tool in biorefineries utilizing steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
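The absorbance-concentration correlation reported above is, in essence, a Beer-Lambert style calibration at a fixed wavelength. The sketch below fits such a calibration line with synthetic numbers; the concentrations, absorbances, and the 280 nm wavelength are illustrative assumptions, not values from the study.

```python
# Illustrative calibration: lignin concentration vs. absorbance at a single wavelength.
import numpy as np
from scipy.stats import linregress

conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80])        # g/L standards (synthetic values)
absorbance = np.array([0.11, 0.21, 0.43, 0.86, 1.70])  # measured A280 (synthetic values)

fit = linregress(conc, absorbance)
print(f"slope = {fit.slope:.2f} L/g, R^2 = {fit.rvalue**2:.3f}")

unknown_abs = 0.55                                      # absorbance of an unknown sample
print(f"estimated lignin = {(unknown_abs - fit.intercept) / fit.slope:.3f} g/L")
```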
Duester, Lars; Fabricius, Anne-Lena; Jakobtorweihen, Sven; Philippe, Allan; Weigl, Florian; Wimmer, Andreas; Schuster, Michael; Nazar, Muhammad Faizan
2016-11-01
Coacervate-based techniques are intensively used in environmental analytical chemistry to enrich and extract different kinds of analytes. Most methods focus on the total content or the speciation of inorganic and organic substances. Size fractionation is less commonly addressed. Within coacervate-based techniques, cloud point extraction (CPE) is characterized by a phase separation of non-ionic surfactants dispersed in an aqueous solution when the respective cloud point temperature is exceeded. In this context, the feature article raises the following question: May CPE in future studies serve as a key tool (i) to enrich and extract nanoparticles (NPs) from complex environmental matrices prior to analyses and (ii) to preserve the colloidal status of unstable environmental samples? With respect to engineered NPs, a significant gap between environmental concentrations and size- and element-specific analytical capabilities is still visible. CPE may support efforts to overcome this "concentration gap" via analyte enrichment. In addition, most environmental colloidal systems are known to be unstable, dynamic, and sensitive to changes of the environmental conditions during sampling and sample preparation. This delivers a so far unsolved "sample preparation dilemma" in the analytical process. The authors are of the opinion that CPE-based methods have the potential to preserve the colloidal status of these unstable samples. Focusing on NPs, this feature article aims to support the discussion on the creation of a convention called the "CPE extractable fraction" by connecting current knowledge on CPE mechanisms and on available applications, via the uncertainties visible and modeling approaches available, with potential future benefits from CPE protocols.
Mungall, Christopher J.; McMurry, Julie A.; Köhler, Sebastian; Balhoff, James P.; Borromeo, Charles; Brush, Matthew; Carbon, Seth; Conlin, Tom; Dunn, Nathan; Engelstad, Mark; Foster, Erin; Gourdine, J.P.; Jacobsen, Julius O.B.; Keith, Dan; Laraway, Bryan; Lewis, Suzanna E.; NguyenXuan, Jeremy; Shefchek, Kent; Vasilevsky, Nicole; Yuan, Zhou; Washington, Nicole; Hochheiser, Harry; Groza, Tudor; Smedley, Damian; Robinson, Peter N.; Haendel, Melissa A.
2017-01-01
The correlation of phenotypic outcomes with genetic variation and environmental factors is a core pursuit in biology and biomedicine. Numerous challenges impede our progress: patient phenotypes may not match known diseases, candidate variants may be in genes that have not been characterized, model organisms may not recapitulate human or veterinary diseases, filling evolutionary gaps is difficult, and many resources must be queried to find potentially significant genotype–phenotype associations. Non-human organisms have proven instrumental in revealing biological mechanisms. Advanced informatics tools can identify phenotypically relevant disease models in research and diagnostic contexts. Large-scale integration of model organism and clinical research data can provide a breadth of knowledge not available from individual sources and can provide contextualization of data back to these sources. The Monarch Initiative (monarchinitiative.org) is a collaborative, open science effort that aims to semantically integrate genotype–phenotype data from many species and sources in order to support precision medicine, disease modeling, and mechanistic exploration. Our integrated knowledge graph, analytic tools, and web services enable diverse users to explore relationships between phenotypes and genotypes across species. PMID:27899636
Mattle, Eveline; Weiger, Markus; Schmidig, Daniel; Boesiger, Peter; Fey, Michael
2009-06-01
Hair care for humans is a major world industry with specialised tools, chemicals and techniques. Studying the effect of hair care products has become a considerable field of research, and besides mechanical and optical testing numerous advanced analytical techniques have been employed in this area. In the present work, another means of studying the properties of hair is added by demonstrating the feasibility of magnetic resonance imaging (MRI) of the human hair. Established dedicated nuclear magnetic resonance microscopy hardware (solenoidal radiofrequency microcoils and planar field gradients) and methods (constant time imaging) were adapted to the specific needs of hair MRI. Images were produced at a spatial resolution high enough to resolve the inner structure of the hair, showing contrast between cortex and medulla. Quantitative evaluation of a scan series with different echo times provided a T2* value of 2.6 ms for the cortex and a water content of about 90% for hairs saturated with water. The demonstration of the feasibility of hair MRI potentially adds a new tool to the large variety of analytical methods used nowadays in the development of hair care products.
Deriving Earth Science Data Analytics Requirements
NASA Technical Reports Server (NTRS)
Kempler, Steven J.
2015-01-01
Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tool/technique requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring
Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia
2010-01-01
The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted high interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have been demonstrated to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and applications carried out by the Spanish scientific community is presented. A brief explanation of theoretical aspects that highlight their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551
[Metabonomics-a useful tool for individualized cancer therapy].
Chai, Yanlan; Wang, Juan; Liu, Zi
2013-11-01
Metabonomics has developed rapidly in the post-genome era and has become a hot topic among the omics disciplines. The core idea of metabonomics is to determine the relatively low-molecular-weight metabolites in organisms or cells by a series of analytical methods such as nuclear magnetic resonance, chromatography and mass spectrometry, then to transform the metabolic pattern data into useful information using chemometric tools and pattern recognition software, and thereby to reveal the essence of the life activities of the body. With the advantages of high throughput, high sensitivity and high accuracy, metabonomics shows great potential and value in individualized cancer treatment. This paper introduces the concept, contents and methods of metabonomics and reviews its application in individualized cancer therapy.
Lipidomics from an analytical perspective.
Sandra, Koen; Sandra, Pat
2013-10-01
The global non-targeted analysis of various biomolecules in a variety of sample sources gained momentum in recent years. Defined as the study of the full lipid complement of cells, tissues and organisms, lipidomics is currently evolving out of the shadow of the more established omics sciences including genomics, transcriptomics, proteomics and metabolomics. In analogy to the latter, lipidomics has the potential to impact on biomarker discovery, drug discovery/development and system knowledge, amongst others. The tools developed by lipid researchers in the past, complemented with the enormous advancements made in recent years in mass spectrometry and chromatography, and the implementation of sophisticated (bio)-informatics tools form the basis of current lipidomics technologies. Copyright © 2013 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aziz, Azizan; Lasternas, Bertrand; Alschuler, Elena
The American Recovery and Reinvestment Act stimulus funding of 2009 for smart grid projects resulted in the tripling of smart meter deployments. In 2012, the Green Button initiative provided utility customers with access to their real-time energy usage. The availability of finely granular data provides enormous potential for energy data analytics and energy benchmarking. The sheer volume of time-series utility data from a large number of buildings also poses challenges in data collection, quality control, and database management for rigorous and meaningful analyses. In this paper, we describe a building portfolio-level data analytics tool for operational optimization, business investment, and policy assessment using utility data at intervals from 15 minutes to monthly. The analytics tool is developed on top of the U.S. Department of Energy's Standard Energy Efficiency Data (SEED) platform, an open source software application that manages energy performance data of large groups of buildings. To support the significantly large volume of granular interval data, we integrated a parallel time-series database with the existing relational database. The time-series database improves on the current utility data input, focusing on real-time data collection, storage, analytics, and data quality control. The fully integrated data platform supports APIs for utility app development by third-party software developers. These apps will provide actionable intelligence for building owners and facilities managers. Unlike commercial systems, this platform is open source, funded by the U.S. Government, and accessible to the public, researchers, and other developers, to support initiatives in reducing building energy consumption.
Parallel Aircraft Trajectory Optimization with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Falck, Robert D.; Gray, Justin S.; Naylor, Bret
2016-01-01
Trajectory optimization is an integral component of the design of aerospace vehicles, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables, which motivates the use of gradient-based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto based collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single- and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the non-linear analysis evaluations and the derivative computations themselves. The constraint aggregation results revealed a significant numerical challenge due to difficulty in achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation-based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.
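To make the ideas of collocation defects, analytic derivatives, and sparsity concrete, the sketch below uses trapezoidal collocation on a toy scalar ODE as a simplified stand-in for the Legendre-Gauss-Lobatto scheme, and writes the analytic Jacobian of the defect constraints by hand. It does not use OpenMDAO; the dynamics, step size, and node count are assumptions for illustration only.

```python
# Simplified stand-in for LGL collocation: trapezoidal defects for x' = f(x) = -x,
# with the analytic (sparse, banded) Jacobian of the defect constraints.
import numpy as np

def f(x):             # toy dynamics
    return -x

def dfdx(x):          # analytic partial derivative of the dynamics
    return -np.ones_like(x)

def defects(x, h):
    # d_i = x_{i+1} - x_i - h/2 * (f(x_i) + f(x_{i+1}))
    return x[1:] - x[:-1] - 0.5 * h * (f(x[:-1]) + f(x[1:]))

def defect_jacobian(x, h):
    n = x.size
    J = np.zeros((n - 1, n))
    idx = np.arange(n - 1)
    J[idx, idx] = -1.0 - 0.5 * h * dfdx(x[:-1])      # d d_i / d x_i
    J[idx, idx + 1] = 1.0 - 0.5 * h * dfdx(x[1:])    # d d_i / d x_{i+1}
    return J                                         # banded: each defect touches only 2 states

x = np.linspace(1.0, 0.1, 11)   # candidate state trajectory (11 nodes, assumed)
h = 0.1
print(defects(x, h)[:3])
print(defect_jacobian(x, h)[:2, :4])
```

The banded structure of this Jacobian is the kind of sparsity that makes it attractive to evaluate the defects and their derivatives in parallel, as the paper discusses.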
Code of Federal Regulations, 2014 CFR
2014-01-01
...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...
Code of Federal Regulations, 2013 CFR
2013-07-01
...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...
Code of Federal Regulations, 2012 CFR
2012-01-01
...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...
(Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research
ERIC Educational Resources Information Center
Quiñones, Sandra
2016-01-01
Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…
NASA Technical Reports Server (NTRS)
Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.
1973-01-01
Detailed drawings of the five year tape transport are presented. Analytical tools used in the various analyses are described. These analyses include: tape guidance, tape stress over crowned rollers, tape pack stress program, response (computer) program, and control system electronics description.
Challenges and Opportunities in Analysing Students Modelling
ERIC Educational Resources Information Center
Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín
2017-01-01
Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them--the model of modelling diagram (MMD)--as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the…
ERIC Educational Resources Information Center
Reinholz, Daniel L.; Shah, Niral
2018-01-01
Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…
Creating value in health care through big data: opportunities and policy implications.
Roski, Joachim; Bo-Linn, George W; Andrews, Timothy A
2014-07-01
Big data has the potential to create significant value in health care by improving outcomes while lowering costs. Big data's defining features include the ability to handle massive data volume and variety at high velocity. New, flexible, and easily expandable information technology (IT) infrastructure, including so-called data lakes and cloud data storage and management solutions, makes big-data analytics possible. However, most health IT systems still rely on data warehouse structures. Without the right IT infrastructure, analytic tools, visualization approaches, work flows, and interfaces, the insights provided by big data are likely to be limited. Big data's success in creating value in the health care sector may require changes in current policies to balance the potential societal benefits of big-data approaches and the protection of patients' confidentiality. Other policy implications of using big data are that many current practices and policies related to data use, access, sharing, privacy, and stewardship need to be revised. Project HOPE—The People-to-People Health Foundation, Inc.
Principles and applications of Raman spectroscopy in pharmaceutical drug discovery and development.
Gala, Urvi; Chauhan, Harsh
2015-02-01
In recent years, Raman spectroscopy has become increasingly important as an analytical technique in various scientific areas of research and development. This is partly due to technological advancements in Raman instrumentation and partly due to the detailed fingerprinting that can be derived from Raman spectra. Its versatile applications, rapid data collection and easy analysis have made Raman spectroscopy an attractive analytical tool. The following review describes Raman spectroscopy and its application within the pharmaceutical industry. The authors explain the theory of Raman scattering and its variations in Raman spectroscopy. The authors also highlight how Raman spectra are interpreted, providing examples. Raman spectroscopy has a number of potential applications within drug discovery and development. It can be used to estimate the molecular activity of drugs and to establish a drug's physicochemical properties such as its partition coefficient. It can also be used in compatibility studies during the drug formulation process. Raman spectroscopy's immense potential should be further investigated in the future.
NASA Astrophysics Data System (ADS)
Sarma, Rajkumar; Deka, Nabajit; Sarma, Kuldeep; Mondal, Pranab Kumar
2018-06-01
We present a mathematical model to study the electroosmotic flow of a viscoelastic fluid in a parallel plate microchannel with a high zeta potential, taking hydrodynamic slippage at the walls into account in the underlying analysis. We use the simplified Phan-Thien-Tanner (s-PTT) constitutive relationships to describe the rheological behavior of the viscoelastic fluid, while Navier's slip law is employed to model the interfacial hydrodynamic slip. Here, we derive analytical solutions for the potential distribution, flow velocity, and volumetric flow rate based on the complete Poisson-Boltzmann equation (without considering the frequently used Debye-Hückel linear approximation). For the underlying electrokinetic transport, this investigation primarily reveals the influence of fluid rheology, wall zeta potential as modulated by the interfacial electrochemistry and interfacial slip on the velocity distribution, volumetric flow rate, and fluid stress, as well as the apparent viscosity. We show that combined with the viscoelasticity of the fluid, a higher wall zeta potential and slip coefficient lead to a phenomenal enhancement in the volumetric flow rate. We believe that this analysis, besides providing a deep theoretical insight to interpret the transport process, will also serve as a fundamental design tool for microfluidic devices/systems under electrokinetic influence.
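For readers unfamiliar with the distinction drawn above, the block below writes the full Poisson-Boltzmann equation for the electric double layer potential next to the Debye-Hückel linearization that the authors avoid, for a symmetric z:z electrolyte; the notation is generic and not necessarily the paper's.

```latex
% Full Poisson-Boltzmann equation for the EDL potential psi(y) in a symmetric z:z electrolyte
\frac{d^{2}\psi}{dy^{2}} \;=\; \frac{2 n_{0} z e}{\varepsilon}\,
      \sinh\!\left(\frac{z e \psi}{k_{B} T}\right),
\qquad \psi\big|_{\text{wall}} = \zeta .

% Debye-Huckel linearization (valid only for small zeta potentials, not used in the paper)
\frac{d^{2}\psi}{dy^{2}} \;\approx\; \kappa^{2}\,\psi,
\qquad \kappa^{2} = \frac{2 n_{0} z^{2} e^{2}}{\varepsilon k_{B} T}.
```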
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melaina, Marc; Saur, Genevieve; Ramsden, Todd
2015-05-28
This presentation summarizes NREL's hydrogen and fuel cell analysis work in three areas: resource potential, greenhouse gas emissions and cost of delivered energy, and influence of auxiliary revenue streams. NREL's hydrogen and fuel cell analysis projects focus on low-carbon and economic transportation and stationary fuel cell applications. Analysis tools developed by the lab provide insight into the degree to which bridging markets can strengthen the business case for fuel cell applications.
Langemann, Timo; Mayr, Ulrike Beate; Meitz, Andrea; Lubitz, Werner; Herwig, Christoph
2016-01-01
Flow cytometry (FCM) is a tool for the analysis of single-cell properties in a cell suspension. In this contribution, we present an improved FCM method for the assessment of E-lysis in Enterobacteriaceae. The result of the E-lysis process is empty bacterial envelopes, called bacterial ghosts (BGs), which constitute potential products in the pharmaceutical field. BGs have reduced light scattering properties when compared with intact cells. In combination with viability information obtained from staining samples with the membrane potential-sensitive fluorescent dye bis-(1,3-dibutylbarbituric acid) trimethine oxonol (DiBAC4(3)), the presented method allows differentiation between populations of viable cells, dead cells, and BGs. Using a second fluorescent dye, RH414, as a membrane marker, non-cellular background was excluded from the data, which greatly improved the quality of the results. Using true volumetric absolute counting, the FCM data correlated well with cell count data obtained from colony-forming units (CFU) for viable populations. Applicability of the method to several Enterobacteriaceae (different Escherichia coli strains, Salmonella typhimurium, Shigella flexneri 2a) could be shown. The method was validated as a resilient process analytical technology (PAT) tool for the assessment of E-lysis and for particle counting during 20-l batch processes for the production of Escherichia coli Nissle 1917 BGs.
Just love in live organ donation.
Zeiler, Kristin
2009-08-01
Emotionally-related live organ donation is different from almost all other medical treatments in that a family member or, in some countries, a friend contributes with an organ or parts of an organ to the recipient. Furthermore, there is a long-acknowledged but not well-understood gender-imbalance in emotionally-related live kidney donation. This article argues for the benefit of the concept of just love as an analytic tool in the analysis of emotionally-related live organ donation where the potential donor(s) and the recipient are engaged in a love relation. The concept of just love is helpful in the analysis of these live organ donations even if no statistical gender-imbalance prevails. It is particularly helpful, however, in the analysis of the gender-imbalance in live kidney donations if these donations are seen as a specific kind of care-work, if care-work is experienced as a labour one should perform out of love and if women still experience stronger pressures to engage in care-work than do men. The aim of the article is to present arguments for the need of just love as an analytic tool in the analysis of emotionally-related live organ donation where the potential donor(s) and the recipient are engaged in a love relation. The aim is also to elaborate two criteria that need to be met in order for love to qualify as just and to highlight certain clinical implications.
Bio-analytical applications of mid-infrared spectroscopy using silver halide fiber-optic probes
NASA Astrophysics Data System (ADS)
Heise, H. M.; Küpper, L.; Butvina, L. N.
2002-10-01
Infrared spectroscopy has proved to be a powerful method for the study of various biomedical samples, in particular for in-vitro analysis in the clinical laboratory and for non-invasive diagnostics. In general, the analysis of biofluids such as whole blood, urine, microdialysates and bioreactor broth media takes advantage of the fact that a multitude of analytes can be quantified simultaneously and rapidly without the need for reagents. Progress in the quality of infrared silver halide fibers enabled us to construct several flexible fiber-optic probes of different geometries, which are particularly suitable for the measurement of small biosamples. Recent trends show that dry film measurements by mid-infrared spectroscopy could revolutionize analytical tools in the clinical chemistry laboratory, and an example is given. Infrared diagnostic tools show promising potential for patients, and minimally invasive blood glucose assays and skin tissue pathology in particular should not be overlooked as applications of mid-infrared fiber-based probes. Other applications include the measurement of skin samples, including penetration studies of vitamins and constituents of cosmetic cream formulations. A further field is the micro-domain analysis of biopsy samples from bog-mummified corpses, and recent results on the chemistry of dermis and hair samples are reported. Another field of application, for which results are reported, is food analysis and bioreactor monitoring.
Welch, Christopher J; Zawatzky, Kerstin; Makarov, Alexey A; Fujiwara, Satoshi; Matsumoto, Arimasa; Soai, Kenso
2016-12-20
An investigation is reported on the use of the autocatalytic enantioselective Soai reaction, known to be influenced by the presence of a wide variety of chiral materials, as a generic tool for measuring the enantiopurity and absolute configuration of any substance. Good generality for the reaction across a small group of test analytes was observed, consistent with literature reports suggesting a diversity of compound types that can influence the stereochemical outcome of this reaction. Some trends in the absolute sense of stereochemical enrichment were noted, suggesting the possible utility of the approach for assigning absolute configuration to unknown compounds, by analogy to closely related species with known outcomes. Considerable variation was observed in the triggering strength of different enantiopure materials, an undesirable characteristic when dealing with mixtures containing minor impurities with strong triggering strength in the presence of major components with weak triggering strength. A strong tendency of the reaction toward an 'all or none' type of behavior makes the reaction most sensitive for detecting enantioenrichment close to zero. Consequently, the ability to discern modest from excellent enantioselectivity was relatively poor. While these properties limit the ability to obtain precise enantiopurity measurements in a simple single addition experiment, prospects may exist for more complex experimental setups that may potentially offer improved performance.
Quality Indicators for Learning Analytics
ERIC Educational Resources Information Center
Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus
2014-01-01
This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a mean to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…
Anandakrishnan, Ramu; Scogland, Tom R. W.; Fenley, Andrew T.; Gordon, John C.; Feng, Wu-chun; Onufriev, Alexey V.
2010-01-01
Tools that compute and visualize biomolecular electrostatic surface potential have been used extensively for studying biomolecular function. However, determining the surface potential for large biomolecules on a typical desktop computer can take days or longer using currently available tools and methods. Two commonly used techniques to speed up these types of electrostatic computations are approximations based on multi-scale coarse-graining and parallelization across multiple processors. This paper demonstrates that for the computation of electrostatic surface potential, these two techniques can be combined to deliver significantly greater speed-up than either one separately, something that is in general not always possible. Specifically, the electrostatic potential computation, using an analytical linearized Poisson Boltzmann (ALPB) method, is approximated using the hierarchical charge partitioning (HCP) multiscale method, and parallelized on an ATI Radeon 4870 graphical processing unit (GPU). The implementation delivers a combined 934-fold speed-up for a 476,040 atom viral capsid, compared to an equivalent non-parallel implementation on an Intel E6550 CPU without the approximation. This speed-up is significantly greater than the 42-fold speed-up for the HCP approximation alone or the 182-fold speed-up for the GPU alone. PMID:20452792
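The core idea of the hierarchical charge partitioning approximation described above can be illustrated with a toy sketch: charges near an evaluation point are summed individually, while distant groups are collapsed to a single effective charge at their centroid. A bare Coulomb kernel stands in for the ALPB electrostatics model here, and the group sizes, cutoff, and coordinates are all invented for the example.

```python
# Toy illustration of the HCP idea: distant charge groups are replaced by one point
# charge at their centroid; nearby groups keep their full per-atom sum.
import numpy as np

rng = np.random.default_rng(2)
groups = [rng.normal(loc=c, scale=0.5, size=(50, 3))
          for c in ([0, 0, 0], [20, 0, 0], [0, 25, 0])]      # three charge groups (synthetic)
charges = [rng.uniform(-1, 1, size=50) for _ in groups]
surface_point = np.array([1.0, 1.0, 0.0])                    # point where potential is evaluated
cutoff = 10.0                                                 # groups farther than this are coarse-grained

phi = 0.0
for pos, q in zip(groups, charges):
    centroid = pos.mean(axis=0)
    if np.linalg.norm(surface_point - centroid) > cutoff:
        phi += q.sum() / np.linalg.norm(surface_point - centroid)   # far group: one effective charge
    else:
        phi += np.sum(q / np.linalg.norm(surface_point - pos, axis=1))  # near group: full sum
print(f"approximate potential at the surface point: {phi:.4f} (arbitrary units)")
```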
Challa, Shruthi; Potumarthi, Ravichandra
2013-01-01
Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products in order to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in the pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adopt PAT tools in pharmaceutical and biopharmaceutical environments, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
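A typical chemometric step of the kind discussed above is a partial least squares calibration relating PAT spectra to a reference analyte concentration. The sketch below is a minimal, self-contained example with synthetic spectra; the number of latent variables, the peak shape, and the noise level are assumptions, not tied to any particular PAT instrument or to this review.

```python
# Minimal chemometric sketch: PLS regression from spectra to analyte concentration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_samples, n_wavelengths = 60, 300
conc = rng.uniform(0.1, 1.0, size=n_samples)                   # reference concentrations (synthetic)
peak = np.exp(-0.5 * ((np.arange(n_wavelengths) - 150) / 10) ** 2)
spectra = conc[:, None] * peak[None, :] + 0.02 * rng.normal(size=(n_samples, n_wavelengths))

pls = PLSRegression(n_components=3)                            # 3 latent variables (assumed)
r2_cv = cross_val_score(pls, spectra, conc, cv=5)              # default score is R^2
print(f"cross-validated R^2: {r2_cv.mean():.3f}")
```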
Ghosal, Sutapa; Wagner, Jeff
2013-07-07
We present the correlated application of two micro-analytical techniques: scanning electron microscopy/energy dispersive X-ray spectroscopy (SEM/EDS) and Raman micro-spectroscopy (RMS) for the non-invasive characterization and molecular identification of flame retardants (FRs) in environmental dusts and consumer products. The SEM/EDS-RMS technique offers correlated morphological, molecular, spatial distribution and semi-quantitative elemental concentration information at the individual particle level with micrometer spatial resolution and minimal sample preparation. The presented methodology uses SEM/EDS analyses for rapid detection of particles containing FR-specific elements as potential indicators of FR presence in a sample, followed by correlated RMS analyses of the same particles for characterization of the FR sub-regions and surrounding matrices. The spatially resolved characterization enabled by this approach provides insights into the distributional heterogeneity as well as potential transfer and exposure mechanisms for FRs in the environment that are typically not available through traditional FR analysis. We have used this methodology to reveal a heterogeneous distribution of highly concentrated deca-BDE particles in environmental dust, sometimes in association with identifiable consumer materials. The observed coexistence of deca-BDE with consumer material in dust is strongly indicative of its release into the environment via weathering/abrasion of consumer products. Ingestion of such enriched FR particles in dust represents a potential for instantaneous exposure to high FR concentrations. Therefore, correlated SEM/RMS analysis offers a novel investigative tool for addressing an area of important environmental concern.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-06-08
T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data; it also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source, so that users can further analyze their data in other analytic tools.
Traceability of 'Limone di Siracusa PGI' by a multidisciplinary analytical and chemometric approach.
Amenta, M; Fabroni, S; Costa, C; Rapisarda, P
2016-11-15
Food traceability is increasingly relevant with respect to safety, quality and typicality issues. Lemon fruits grown in a typical lemon-growing area of southern Italy (Siracusa) have been awarded the PGI (Protected Geographical Indication) recognition as 'Limone di Siracusa'. Due to its peculiarity, consumers have an increasing interest in this product. The detection of potential fraud could be improved by using tools that link the composition of this production to its typical features. This study used a wide range of analytical techniques, including conventional techniques and analytical approaches such as spectral (NIR spectra), multi-elemental (Fe, Zn, Mn, Cu, Li, Sr) and isotopic (13C/12C, 18O/16O) marker investigations, combined with multivariate statistical analysis, such as PLS-DA (Partial Least Squares Discriminant Analysis) and LDA (Linear Discriminant Analysis), to implement a traceability system to verify the authenticity of the 'Limone di Siracusa' production. The results demonstrated a very good geographical discrimination rate. Copyright © 2016 Elsevier Ltd. All rights reserved.
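The classification step described above can be sketched with a simple linear discriminant analysis on multi-elemental and isotopic features. The feature values, group means, and sample counts below are entirely synthetic, and sklearn's LDA is used as a stand-in for the study's chemometric models.

```python
# Hedged sketch: LDA on multi-elemental/isotopic features to separate PGI lemons
# from non-PGI samples (all values are synthetic, for illustration only).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
# columns: Fe, Zn, Mn, Cu, Li, Sr, 13C/12C proxy, 18O/16O proxy (arbitrary units)
X_pgi = rng.normal(loc=[1.0, 0.8, 0.5, 0.2, 0.05, 2.0, -25.0, 3.0], scale=0.2, size=(40, 8))
X_other = rng.normal(loc=[1.3, 0.6, 0.7, 0.3, 0.08, 1.5, -27.0, 1.0], scale=0.2, size=(40, 8))
X = np.vstack([X_pgi, X_other])
y = np.array([1] * 40 + [0] * 40)          # 1 = 'Limone di Siracusa PGI'

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5)     # default score is classification accuracy
print(f"cross-validated discrimination rate: {acc.mean():.2f}")
```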
Decision exploration lab: a visual analytics solution for decision management.
Broeksema, Bertjan; Baudel, Thomas; Telea, Arthur G; Crisafulli, Paolo
2013-12-01
We present a visual analytics solution designed to address prevalent issues in the area of Operational Decision Management (ODM). In ODM, which has its roots in Artificial Intelligence (Expert Systems) and Management Science, it is increasingly important to align business decisions with business goals. In our work, we consider decision models (executable models of the business domain) as ontologies that describe the business domain, and production rules that describe the business logic of decisions to be made over this ontology. Executing a decision model produces an accumulation of decisions made over time for individual cases. We are interested, first, in gaining insight into the decision logic and the accumulated facts by themselves. Secondly, and more importantly, we want to see how the accumulated facts reveal potential divergences between the reality as captured by the decision model and the reality as captured by the executed decisions. We illustrate the motivation, the added value of visual analytics, and our proposed solution and tooling through a business case from the car insurance industry.
Iliuk, Anton B.; Arrington, Justine V.; Tao, Weiguo Andy
2014-01-01
Phosphoproteomics is the systematic, high-throughput study of one of the most common protein modifications, with the aim of providing detailed information on the control, response, and communication of biological systems in health and disease. Advances in analytical technologies and strategies, in particular the contributions of high-resolution mass spectrometers, efficient enrichment of phosphopeptides, and fast data acquisition and annotation, have catalyzed a dramatic expansion of signaling landscapes in multiple systems during the past decade. While phosphoproteomics is an essential inquiry to map high-resolution signaling networks and to find relevant events among the apparently ubiquitous and widespread modifications of the proteome, it presents tremendous challenges in the separation sciences to translate it from discovery to clinical practice. In this mini-review, we summarize the analytical tools currently utilized for phosphoproteomic analysis (with a focus on MS), progress made in deciphering clinically relevant kinase-substrate networks, uses of MS for biomarker discovery and validation, and the potential of phosphoproteomics for disease diagnostics and personalized medicine. PMID:24890697
Big Data Management in US Hospitals: Benefits and Barriers.
Schaeffer, Chad; Booton, Lawrence; Halleck, Jamey; Studeny, Jana; Coustasse, Alberto
Big data has been considered an effective tool for reducing health care costs by eliminating adverse events and reducing readmissions to hospitals. The purposes of this study were to examine the emergence of big data in the US health care industry, to evaluate a hospital's ability to effectively use complex information, and to predict the potential benefits that hospitals might realize if they are successful in using big data. The findings of the research suggest that there were a number of benefits expected by hospitals when using big data analytics, including cost savings and business intelligence. By using big data, many hospitals have recognized that there have been challenges, including lack of experience and the cost of developing the analytics. Many hospitals will need to invest in acquiring personnel with experience in big data analytics and data integration. The findings of this study suggest that the adoption, implementation, and utilization of big data technology will have a profound positive effect among health care providers.
ERIC Educational Resources Information Center
Davis, Gary Alan; Woratschek, Charles R.
2015-01-01
Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…
Karakülah, G.; Dicle, O.; Sökmen, S.; Çelikoğlu, C.C.
2015-01-01
Background: The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians' decision making. Objective: The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. Methods: The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priority of criteria, with a decision tree formed using these priorities, was updated and applied to data from 388 patients collected retrospectively. Later, a web-based decision support tool named corRECTreatment was developed. The agreement between the treatment recommendations of expert opinion and of the decision support tool was examined for consistency. Two surgeons were requested to recommend a treatment and an overall survival value for the treatment for 20 cases that we selected and turned into scenarios from among the most common and rare treatment options in the patient data set. Results: In the AHP analyses of the criteria, it was found that the matrices generated for both decision steps were consistent (consistency ratio < 0.1). Depending on the decisions of the experts, the consistency value for the most frequent cases was found to be 80% for the first decision step and 100% for the second decision step. Similarly, for rare cases, consistency was 50% for the first decision step and 80% for the second decision step. Conclusions: The decision model and corRECTreatment, developed by applying these methods to real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and facilitate them in making projections about treatment options. PMID:25848413
Suner, A; Karakülah, G; Dicle, O; Sökmen, S; Çelikoğlu, C C
2015-01-01
The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians' decision making. The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priority of criteria, with a decision tree formed using these priorities, was updated and applied to data from 388 patients collected retrospectively. Later, a web-based decision support tool named corRECTreatment was developed. The agreement between the treatment recommendations of expert opinion and of the decision support tool was examined for consistency. Two surgeons were requested to recommend a treatment and an overall survival value for the treatment for 20 cases that we selected and turned into scenarios from among the most common and rare treatment options in the patient data set. In the AHP analyses of the criteria, it was found that the matrices generated for both decision steps were consistent (consistency ratio < 0.1). Depending on the decisions of the experts, the consistency value for the most frequent cases was found to be 80% for the first decision step and 100% for the second decision step. Similarly, for rare cases, consistency was 50% for the first decision step and 80% for the second decision step. The decision model and corRECTreatment, developed by applying these methods to real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and facilitate them in making projections about treatment options.
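The AHP step behind the consistency ratios quoted above can be sketched in a few lines: priorities come from the principal eigenvector of a pairwise comparison matrix, and the consistency ratio compares the consistency index against Saaty's random index, with CR < 0.1 taken as acceptable. The 4x4 comparison matrix below is hypothetical and not the criteria matrix from this study.

```python
# Illustrative AHP step: priority weights from a pairwise comparison matrix and the
# consistency ratio (CR < 0.1 is the usual threshold for an acceptably consistent matrix).
import numpy as np

# Hypothetical 4-criterion pairwise comparison matrix (Saaty's 1-9 scale, reciprocal).
A = np.array([[1.0, 3.0, 5.0, 1.0],
              [1/3, 1.0, 3.0, 1/2],
              [1/5, 1/3, 1.0, 1/4],
              [1.0, 2.0, 4.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                           # criterion priorities

n = A.shape[0]
lam_max = eigvals.real[k]
CI = (lam_max - n) / (n - 1)                       # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}[n]   # Saaty's random index
CR = CI / RI
print("priorities:", np.round(weights, 3), " CR =", round(CR, 3))
```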
NASA Astrophysics Data System (ADS)
Bauer, J. R.; Rose, K.; Romeo, L.; Barkhurst, A.; Nelson, J.; Duran-Sesin, R.; Vielma, J.
2016-12-01
Efforts to prepare for and reduce the risk of hazards, from both natural and anthropogenic sources, which threaten our oceans and coasts require an understanding of the dynamics and interactions between the physical, ecological, and socio-economic systems. Understanding these coupled dynamics is essential as offshore oil & gas exploration and production continues to push into harsher, more extreme environments where risks and uncertainty increase. However, working with these large, complex data from various sources and scales to assess risks and potential impacts associated with offshore energy exploration and production poses several challenges to researchers. In order to address these challenges, an integrated assessment model (IAM) was developed at the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) that combines spatial data infrastructure and an online research platform to manage, process, analyze, and share these large, multidimensional datasets, research products, and the tools and models used to evaluate risk and reduce uncertainty for the entire offshore system, from the subsurface, through the water column, to coastal ecosystems and communities. Here, we will discuss the spatial data infrastructure and online research platform, NETL's Energy Data eXchange (EDX), that underpin the offshore IAM, providing information on how the framework combines multidimensional spatial data and spatio-temporal tools to evaluate risks to the complex matrix of potential environmental, social, and economic impacts stemming from modeled offshore hazard scenarios, such as oil spills or hurricanes. In addition, we will discuss the online analytics, tools, and visualization methods integrated into this framework that support availability and access to data, as well as allow for the rapid analysis and effective communication of analytical results to aid a range of decision-making needs.
Review of spectral imaging technology in biomedical engineering: achievements and challenges.
Li, Qingli; He, Xiaofu; Wang, Yiting; Liu, Hongying; Xu, Dongrong; Guo, Fangmin
2013-10-01
Spectral imaging is a technology that integrates conventional imaging and spectroscopy to get both spatial and spectral information from an object. Although this technology was originally developed for remote sensing, it has been extended to the biomedical engineering field as a powerful analytical tool for biological and biomedical research. This review introduces the basics of spectral imaging, imaging methods, current equipment, and recent advances in biomedical applications. The performance and analytical capabilities of spectral imaging systems for biological and biomedical imaging are discussed. In particular, the current achievements and limitations of this technology in biomedical engineering are presented. The benefits and development trends of biomedical spectral imaging are highlighted to provide the reader with an insight into the current technological advances and its potential for biomedical research.
Automated Deployment of Advanced Controls and Analytics in Buildings
NASA Astrophysics Data System (ADS)
Pritoni, Marco
Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the significant engineering time and customization required, due to significant differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems" I used models to discover or map relationships among building components, automatically gathering metadata (information about data points) necessary to run the applications. During the second phase: "Application Deployment and Commissioning", models automatically learn system parameters, used for advanced controls and analytics. In the third phase: "Continuous Monitoring and Verification" I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.
Llano, Daniel A; Devanarayan, Viswanath; Simon, Adam J
2013-01-01
Previous studies that have examined the potential for plasma markers to serve as biomarkers for Alzheimer disease (AD) have studied single analytes and focused on the amyloid-β and τ isoforms and have failed to yield conclusive results. In this study, we performed a multivariate analysis of 146 plasma analytes (the Human DiscoveryMAP v 1.0 from Rules-Based Medicine) in 527 subjects with AD, mild cognitive impairment (MCI), or cognitively normal elderly subjects from the Alzheimer's Disease Neuroimaging Initiative database. We identified 4 different proteomic signatures, each using 5 to 14 analytes, that differentiate AD from control patients with sensitivity and specificity ranging from 74% to 85%. Five analytes were common to all 4 signatures: apolipoprotein A-II, apolipoprotein E, serum glutamic oxaloacetic transaminase, α-1-microglobulin, and brain natriuretic peptide. None of the signatures adequately predicted progression from MCI to AD over a 12- and 24-month period. A new panel of analytes, optimized to predict MCI to AD conversion, was able to provide 55% to 60% predictive accuracy. These data suggest that a simple panel of plasma analytes may provide an adjunctive tool to differentiate AD from controls, may provide mechanistic insights to the etiology of AD, but cannot adequately predict MCI to AD conversion.
Earth Science Data Analytics: Preparing for Extracting Knowledge from Information
NASA Technical Reports Server (NTRS)
Kempler, Steven; Barbieri, Lindsay
2016-01-01
Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data analytics takes a practitioner's approach to applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that combine an understanding of the science domain with data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from an advancing-science point of view: on the continuum of ever-evolving data management systems, we need to understand and develop ways that allow the variety of data relationships to be examined, and information to be manipulated, such that knowledge can be enhanced to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA cluster is to facilitate the preparation of individuals to understand and apply the skills needed for Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover unobvious scientific relationships that were previously invisible to the science eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature (barring a few excellent sources).
Assessment of catchments' flooding potential: a physically-based analytical tool
NASA Astrophysics Data System (ADS)
Botter, G.; Basso, S.; Schirmer, M.
2016-12-01
The assessment of the flooding potential of river catchments is critical in many research and applied fields, ranging from river science and geomorphology to urban planning and the insurance industry. Predicting the magnitude and frequency of floods is key to preventing and mitigating the negative effects of high flows, and has therefore long been a focus of hydrologic research. Here, the recurrence intervals of seasonal flow maxima are estimated through a novel physically-based analytic approach, which links the extremal distribution of streamflows to the stochastic dynamics of daily discharge. An analytical expression of the seasonal flood-frequency curve is provided, whose parameters embody climate and landscape attributes of the contributing catchment and can be estimated from daily rainfall and streamflow data. Only one parameter, which expresses catchment saturation prior to rainfall events, needs to be calibrated on the observed maxima. The method has been tested in a set of catchments featuring heterogeneous daily flow regimes. The model is able to reproduce the characteristic shapes of flood-frequency curves emerging in erratic and persistent flow regimes and provides good estimates of seasonal flow maxima in different climatic regions. Performance remains steady when estimating the magnitude of events with return periods longer than the available sample size, which makes the approach especially valuable for regions affected by data scarcity.
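As a point of reference for the flood-frequency idea above, the sketch below shows a purely empirical recurrence-interval estimate from seasonal maxima of daily discharge using Weibull plotting positions; the streamflow values are hypothetical, and this is not the authors' physically based model, which instead derives the curve from catchment and climate parameters.

```python
import numpy as np

# Hypothetical seasonal maxima of daily discharge (m^3/s), one value per year.
seasonal_maxima = np.array([212., 145., 301., 187., 256., 174., 223.,
                            198., 342., 165., 278., 190., 240., 205.])

x = np.sort(seasonal_maxima)[::-1]                # largest first
rank = np.arange(1, x.size + 1)
exceedance_prob = rank / (x.size + 1.0)           # Weibull plotting position
return_period = 1.0 / exceedance_prob             # years (one season per year)

for q, T in zip(x, return_period):
    print("Q = %6.1f m^3/s  ->  T ~ %5.1f yr" % (q, T))
```

Such empirical estimates degrade quickly beyond the record length, which is exactly the regime where an analytical, parameter-based flood-frequency curve like the one described above is most useful.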
Analytical Protein Microarrays: Advancements Towards Clinical Applications
Sauer, Ursula
2017-01-01
Protein microarrays represent a powerful technology with the potential to serve as tools for the detection of a broad range of analytes in numerous applications such as diagnostics, drug development, food safety, and environmental monitoring. Key features of analytical protein microarrays include high throughput and relatively low costs due to minimal reagent consumption, multiplexing, fast kinetics and hence measurements, and the possibility of functional integration. So far, protein microarrays have mainly been used for fundamental studies in molecular and cell biology, while their potential for clinical, notably point-of-care, applications is not yet fully utilized. The question arises as to what features have to be implemented and what improvements have to be made in order to fully exploit the technology. In the past we have identified various obstacles that have to be overcome in order to promote protein microarray technology in the diagnostic field. Issues that need significant improvement to make the technology more attractive for the diagnostic market include insufficient sensitivity and reproducibility, inadequate analysis time, a lack of high-quality antibodies and validated reagents, a lack of automation and portable instruments, and the cost of instruments necessary for chip production and read-out. The scope of the paper at hand is to review approaches to solve these problems. PMID:28146048
IBM's Health Analytics and Clinical Decision Support.
Kohn, M S; Sun, J; Knoop, S; Shabo, A; Carmeli, B; Sow, D; Syed-Mahmood, T; Rapp, W
2014-08-15
This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidenced-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation.
The control of flexible structure vibrations using a cantilevered adaptive truss
NASA Technical Reports Server (NTRS)
Wynn, Robert H., Jr.; Robertshaw, Harry H.
1991-01-01
Analytical and experimental procedures and design tools are presented for the control of flexible structure vibrations using a cantilevered adaptive truss. Simulated and experimental data are examined for three types of structures: a slender beam, a single curved beam, and two curved beams. The adaptive truss is shown to produce a 6,000-percent increase in damping, demonstrating its potential in vibration control. Good agreement is obtained between the simulated and experimental data, thus validating the modeling methods.
Nanoparticle exposure biomonitoring: exposure/effect indicator development approaches
NASA Astrophysics Data System (ADS)
Marie-Desvergne, C.; Dubosson, M.; Lacombe, M.; Brun, V.; Mossuz, V.
2015-05-01
The use of engineered nanoparticles (NP) is more and more widespread in various industrial sectors. The inhalation route of exposure is a matter of concern (adverse effects of air pollution by ultrafine particles and asbestos). No NP biomonitoring recommendations or standards are available so far. The LBM laboratory is currently studying several approaches to develop bioindicators for occupational health applications. As regards exposure indicators, new tools are being implemented to assess potentially inhaled NP in non-invasive respiratory sampling (nasal sampling and exhaled breath condensates (EBC)). Diverse NP analytical characterization methods are used (ICP-MS, dynamic light scattering and electron microscopy coupled to energy-dispersive X-ray analysis). As regards effect indicators, a methodology has been developed to assess a range of 29 cytokines in EBCs (potential respiratory inflammation due to NP exposure). Secondly, collaboration between the LBM laboratory and the EDyp team has allowed the EBC proteome to be characterized by means of an LC-MS/MS process. These projects are expected to facilitate the development of individual NP exposure biomonitoring tools and the analysis of early potential impacts on health. Innovative techniques such as field-flow fractionation combined with ICP-MS and single particle-ICPMS are currently being explored. These tools are directly intended to assist occupational physicians in the identification of exposure situations.
A comparative assessment of tools for ecosystem services quantification and valuation
Bagstad, Kenneth J.; Semmens, Darius; Waage, Sissel; Winthrop, Robert
2013-01-01
To enter widespread use, ecosystem service assessments need to be quantifiable, replicable, credible, flexible, and affordable. With recent growth in the field of ecosystem services, a variety of decision-support tools has emerged to support more systematic ecosystem services assessment. Despite the growing complexity of the tool landscape, thorough reviews of tools for identifying, assessing, modeling, and in some cases monetarily valuing ecosystem services have generally been lacking. In this study, we describe 17 ecosystem services tools and rate their performance against eight evaluative criteria that gauge their readiness for widespread application in public- and private-sector decision making. We describe each of the tools' intended uses, services modeled, analytical approaches, data requirements, and outputs, as well as the time requirements to run seven tools in a first comparative concurrent application of multiple tools to a common location, the San Pedro River watershed in southeast Arizona, USA, and northern Sonora, Mexico. Based on this work, we offer conclusions about these tools' current readiness for widespread application within both public- and private-sector decision making processes. Finally, we describe potential pathways forward to reduce the resource requirements for running ecosystem services models, which are essential to facilitate their more widespread use in environmental decision making.
ERIC Educational Resources Information Center
Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin
2016-01-01
In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…
Adequacy of surface analytical tools for studying the tribology of ceramics
NASA Technical Reports Server (NTRS)
Sliney, H. E.
1986-01-01
Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less well-known techniques that may also prove useful.
ERIC Educational Resources Information Center
Kuby, Candace R.
2014-01-01
An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…
Analytical Tools for Affordability Analysis
2015-05-01
Fragments recovered from the report: unit cost as a function of learning and production rate (Womer); learning with forgetting (Benkard), in which learning depreciates over time. Prepared by David Tate, Cost Analysis and Research Division, Institute for Defense Analyses, 4850 Mark Center Drive, Alexandria, VA 22311-1882.
Operational Analysis of Time-Optimal Maneuvering for Imaging Spacecraft
2013-03-01
Excerpt: the operational analysis of time-optimal maneuvering for the Singapore-developed X-SAT imaging spacecraft is facilitated through the use of AGI's Systems Tool Kit (STK) software and an Analytic Hierarchy Process (AHP)-based approach.
ERIC Educational Resources Information Center
Kilpatrick, Sue; Field, John; Falk, Ian
The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…
ERIC Educational Resources Information Center
Paranosic, Nikola; Riveros, Augusto
2017-01-01
This paper reports the results of a study that examined the ways a group of department heads in Ontario, Canada, describe their role. Despite their ubiquity and importance, department heads have been seldom investigated in the educational leadership literature. The study uses the metaphor as an analytic tool to examine the ways participants talked…
Analytical Tools for Behavioral Influences Operations
2003-12-01
Fragments recovered from the report: NASIC's investment in analytical capabilities; study limitations; the project is envisioned as a foundation for future work by NASIC analysts, who will use the tools identified in this study; though the study took all three categories into account, most (90%) of the SRA team's effort focused on identifying and analyzing capabilities.
Progress in the development and integration of fluid flow control tools in paper microfluidics.
Fu, Elain; Downs, Corey
2017-02-14
Paper microfluidics is a rapidly growing subfield of microfluidics in which paper-like porous materials are used to create analytical devices. There is a need for higher performance field-use tests for many application domains including human disease diagnosis, environmental monitoring, and veterinary medicine. A key factor in creating high performance paper-based devices is the ability to manipulate fluid flow within the devices. This critical review is focused on the progress that has been made in (i) the development of fluid flow control tools and (ii) the integration of those tools into paper microfluidic devices. Further, we strive to be comprehensive in our presentation and provide historical context through discussion and performance comparisons, when possible, of both relevant earlier work and recent work. Finally, we discuss the major areas of focus for fluid flow methods development to advance the potential of paper microfluidics for high-performance field applications.
Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.
Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís
2016-01-08
In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.
How Can Visual Analytics Assist Investigative Analysis? Design Implications from an Evaluation.
Youn-Ah Kang; Görg, Carsten; Stasko, John
2011-05-01
Despite the growing number of systems providing visual analytic support for investigative analysis, few empirical studies of the potential benefits of such systems have been conducted, particularly controlled, comparative evaluations. Determining how such systems foster insight and sensemaking is important for their continued growth and study, however. Furthermore, studies that identify how people use such systems and why they benefit (or not) can help inform the design of new systems in this area. We conducted an evaluation of the visual analytics system Jigsaw employed in a small investigative sensemaking exercise, and compared its use to three other more traditional methods of analysis. Sixteen participants performed a simulated intelligence analysis task under one of the four conditions. Experimental results suggest that Jigsaw assisted participants to analyze the data and identify an embedded threat. We describe different analysis strategies used by study participants and how computational support (or the lack thereof) influenced the strategies. We then illustrate several characteristics of the sensemaking process identified in the study and provide design implications for investigative analysis tools based thereon. We conclude with recommendations on metrics and techniques for evaluating visual analytics systems for investigative analysis.
Microscale technology and biocatalytic processes: opportunities and challenges for synthesis.
Wohlgemuth, Roland; Plazl, Igor; Žnidaršič-Plazl, Polona; Gernaey, Krist V; Woodley, John M
2015-05-01
Despite the expanding presence of microscale technology in chemical synthesis and energy production as well as in biomedical devices and analytical and diagnostic tools, its potential in biocatalytic processes for pharmaceutical and fine chemicals, as well as related industries, has not yet been fully exploited. The aim of this review is to shed light on the strategic advantages of this promising technology for the development and realization of biocatalytic processes and subsequent product recovery steps, demonstrated with examples from the literature. Constraints, opportunities, and the future outlook for the implementation of these key green engineering methods and the role of supporting tools such as mathematical models to establish sustainable production processes are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.
CIMAROSTI, HELENA; HENLEY, JEREMY M.
2012-01-01
It is well established that brain ischemia can cause neuronal death via different signaling cascades. The relative importance and interrelationships between these pathways, however, remain poorly understood. Here is presented an overview of studies using oxygen-glucose deprivation of organotypic hippocampal slice cultures to investigate the molecular mechanisms involved in ischemia. The culturing techniques, setup of the oxygen-glucose deprivation model, and analytical tools are reviewed. The authors focus on SUMOylation, a posttranslational protein modification that has recently been implicated in ischemia from whole animal studies as an example of how these powerful tools can be applied and could be of interest to investigate the molecular pathways underlying ischemic cell death. PMID:19029060
Use of big data for drug development and for public and personal health and care.
Leyens, Lada; Reumann, Matthias; Malats, Nuria; Brand, Angela
2017-01-01
The use of data analytics across the entire healthcare value chain, from drug discovery and development through epidemiology to informed clinical decision for patients or policy making for public health, has seen an explosion in the recent years. The increase in quantity and variety of data available together with the improvement of storing capabilities and analytical tools offer numerous possibilities to all stakeholders (manufacturers, regulators, payers, healthcare providers, decision makers, researchers) but most importantly, it has the potential to improve general health outcomes if we learn how to exploit it in the right way. This article looks at the different sources of data and the importance of unstructured data. It goes on to summarize current and potential future uses in drug discovery, development, and monitoring as well as in public and personal healthcare; including examples of good practice and recent developments. Finally, we discuss the main practical and ethical challenges to unravel the full potential of big data in healthcare and conclude that all stakeholders need to work together towards the common goal of making sense of the available data for the common good. © 2016 WILEY PERIODICALS, INC.
Ruptured thought: rupture as a critical attitude to nursing research.
Beedholm, Kirsten; Lomborg, Kirsten; Frederiksen, Kirsten
2014-04-01
In this paper, we introduce the notion of ‘rupture’ from the French philosopher Michel Foucault, whose studies of discourse and governmentality have become prominent within nursing research during the last 25 years. We argue that a rupture perspective can be helpful for identifying and maintaining a critical potential within nursing research. The paper begins by introducing rupture as an inheritance from the French epistemological tradition. It then describes how rupture appears in Foucault's works, as both an overall philosophical approach and as an analytic tool in his historical studies. Two examples of analytical applications of rupture are elaborated. In the first example, rupture has inspired us to make an effort to seek alternatives to mainstream conceptions of the phenomenon under study. In the second example, inspired by Foucault's work on discontinuity, we construct a framework for historical epochs in nursing history. The paper concludes by discussing the potential of the notion of rupture as a response to the methodological concerns regarding the use of Foucault-inspired discourse analysis within nursing research. We agree with the critique of Cheek that the critical potential of discourse analysis is at risk of being undermined by research that tends to convert the approach into a fixed method.
BMDExpress Data Viewer: A Visualization Tool to Analyze ...
Regulatory agencies increasingly apply benchmark dose (BMD) modeling to determine points of departure in human risk assessments. BMDExpress applies BMD modeling to transcriptomics datasets and groups genes to biological processes and pathways for rapid assessment of doses at which biological perturbations occur. However, graphing and analytical capabilities within BMDExpress are limited, and the analysis of output files is challenging. We developed a web-based application, BMDExpress Data Viewer, for visualization and graphical analyses of BMDExpress output files. The software application consists of two main components: ‘Summary Visualization Tools’ and ‘Dataset Exploratory Tools’. We demonstrate through two case studies that the ‘Summary Visualization Tools’ can be used to examine and assess the distributions of probe and pathway BMD outputs, as well as derive a potential regulatory BMD through the modes or means of the distributions. The ‘Functional Enrichment Analysis’ tool presents biological processes in a two-dimensional bubble chart view. By applying filters of pathway enrichment p-value and minimum number of significant genes, we showed that the Functional Enrichment Analysis tool can be applied to select pathways that are potentially sensitive to chemical perturbations. The ‘Multiple Dataset Comparison’ tool enables comparison of BMDs across multiple experiments (e.g., across time points, tissues, or organisms, etc.). The ‘BMDL-BM
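A minimal sketch of the kind of pathway filtering described for the Functional Enrichment Analysis tool is given below; the column names and values are hypothetical placeholders, not the actual BMDExpress output format.

```python
import pandas as pd

# Hypothetical pathway-level BMD summary; real BMDExpress output columns
# may differ, so treat the names below as placeholders.
df = pd.DataFrame({
    "pathway":      ["oxidative stress", "DNA repair", "lipid metabolism"],
    "enrichment_p": [0.003, 0.20, 0.01],
    "n_sig_genes":  [12, 3, 8],
    "median_bmd":   [1.8, 4.2, 2.5],   # mg/kg-day
})

# Filters analogous to those described above: pathway enrichment p-value
# and a minimum number of significant genes per pathway.
filtered = df[(df["enrichment_p"] < 0.05) & (df["n_sig_genes"] >= 5)]
print(filtered.sort_values("median_bmd"))
```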
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
INVESTIGATION OF CE/LIF AS A TOOL IN THE ...
The investigation of emerging contaminant issues is a proactive effort in environmental analysis. As a part of this effort, sewage effluent is of current analytical interest because of the presence of pharmaceuticals and their metabolites and personal care products. The environmental impact of these components is still under investigation, but their constant perfusion into receiving waters and their potential effect on biota is of concern. This paper examines a tool for the characterization of sewage effluent using capillary electrophoresis/laser-induced fluorescence (CE/LIF) with a frequency-doubled laser operated in the ultraviolet (UV). Fluorescent acidic analytes are targeted because they present special problems for techniques such as gas chromatography/mass spectrometry (GC/MS) but are readily accessible to CE/LIF. As an example of the application of this tool, salicylic acid is determined near the 100 ng/L level in sewage effluent. Salicylic acid is a metabolite of various analgesics. Relatively stable in the environment, it is a common contaminant of municipal sewage systems. Salicylic acid was recovered from freshly collected samples of the effluent by liquid-liquid extraction as part of a broad characterization effort. Confirmation of identity was by electron ionization GC/MS after conversion of the salicylic acid to the methyl ester by means of trimethylsilyldiazomethane. CE/LIF in the UV has revealed more than 50 individual peaks in the extract and a bac
ERIC Educational Resources Information Center
Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie
2016-01-01
Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…
Tools for studying dry-cured ham processing by using computed tomography.
Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena
2012-01-11
An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w) in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
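The R(2) and RMSECV figures quoted above come from cross-validated calibration models. The sketch below illustrates that style of evaluation with a generic partial least squares regression on synthetic data; the features standing in for CT attenuation values and the salt contents are fabricated for illustration and do not reproduce the published models.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict, KFold
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)

# Synthetic stand-in: CT attenuation values (HU) from several regions of
# interest per ham (columns) vs. measured salt content (%).
X = rng.normal(60.0, 15.0, size=(40, 6))
y = 2.0 + 0.03 * X[:, 0] - 0.01 * X[:, 3] + rng.normal(0.0, 0.2, size=40)

model = PLSRegression(n_components=3)
y_cv = cross_val_predict(model, X, y,
                         cv=KFold(n_splits=5, shuffle=True, random_state=0)).ravel()

r2 = r2_score(y, y_cv)
rmsecv = np.sqrt(mean_squared_error(y, y_cv))
print("R2 = %.3f, RMSECV = %.3f %% salt" % (r2, rmsecv))
```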
NASA Astrophysics Data System (ADS)
Ki, Seo Jin; Ray, Chittaranjan
2015-03-01
A regional screening tool, which is useful in cases where few site-specific parameters are available for complex vadose zone models, assesses the leaching potential of pollutants to groundwater over large areas. In this study, the previous pesticide leaching tool used in Hawaii was revised to account for the release of new volatile organic compounds (VOCs) from the soil surface. The tool was modified to introduce expanded terms in the traditional pesticide ranking indices (i.e., retardation and attenuation factors), allowing the estimated leaching fraction of volatile chemicals to be updated based on recharge, soil, and chemical properties. Results showed that the previous tool significantly overestimated the mass fraction of VOCs leached through soils as the recharge rates increased above 0.001801 m/d. In contrast, the revised tool successfully delineated areas vulnerable to the selected VOCs based on two reference chemicals, a known leacher and a known non-leacher, which were determined under local conditions. The sensitivity analysis with the Latin-Hypercube-One-factor-At-a-Time method revealed that the new leaching tool was most sensitive to changes in the soil organic carbon sorption coefficient, fractional organic carbon content, and Henry's law constant, and least sensitive to parameters such as the bulk density, water content at field capacity, and particle density in soils. When the revised tool was compared to the analytical (STANMOD) and numerical (HYDRUS-1D) models as a susceptibility measure, it ranked particular VOCs (e.g., benzene, carbofuran, and toluene) in a manner consistent with the other two models under the given conditions. Therefore, the new leaching tool can be widely used to address intrinsic groundwater vulnerability to contamination by pesticides and VOCs, along with the DRASTIC method or similar Tier 1 models such as SCI-GROW and WIN-PST.
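The retardation and attenuation factor indices mentioned above have well-known closed forms in Tier 1 screening (after Rao et al.); a generic sketch with hypothetical soil and chemical parameters is shown below. The revised Hawaii tool's exact expanded terms for volatiles are not reproduced here.

```python
import math

# Generic retardation (RF) and attenuation (AF) factor indices used in
# Tier 1 leaching screening (after Rao et al.); all parameter values below
# are hypothetical and do not reproduce the revised Hawaii tool.
rho_b    = 1.4      # soil bulk density, g/cm^3
f_oc     = 0.02     # fractional organic carbon content
K_oc     = 60.0     # organic carbon sorption coefficient, cm^3/g
theta_fc = 0.30     # volumetric water content at field capacity
theta_a  = 0.15     # air-filled porosity
K_H      = 0.22     # dimensionless Henry's law constant (volatile compound)
d        = 300.0    # depth to groundwater, cm
q        = 0.05     # net recharge, cm/d
t_half   = 30.0     # degradation half-life, d

RF = 1.0 + (rho_b * f_oc * K_oc) / theta_fc + (theta_a * K_H) / theta_fc
AF = math.exp(-0.693 * d * RF * theta_fc / (q * t_half))  # fraction reaching depth d

print("RF = %.2f, AF = %.3g" % (RF, AF))
```

Ranking chemicals by AF under a common recharge scenario is what produces the leacher/non-leacher classification that the reference chemicals anchor.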
Data Standards for Flow Cytometry
SPIDLEN, JOSEF; GENTLEMAN, ROBERT C.; HAALAND, PERRY D.; LANGILLE, MORGAN; MEUR, NOLWENN LE; OCHS, MICHAEL F.; SCHMITT, CHARLES; SMITH, CLAYTON A.; TREISTER, ADAM S.; BRINKMAN, RYAN R.
2009-01-01
Flow cytometry (FCM) is an analytical tool widely used for cancer and HIV/AIDS research and treatment, stem cell manipulation, and the detection of microorganisms in environmental samples. Current data standards do not capture the full scope of FCM experiments and there is a demand for software tools that can assist in the exploration and analysis of large FCM datasets. We are implementing a standardized approach to capturing, analyzing, and disseminating FCM data that will facilitate both more complex analyses and analysis of datasets that could not previously be efficiently studied. Initial work has focused on developing a community-based guideline for recording and reporting the details of FCM experiments. Open source software tools that implement this standard are being created, with an emphasis on facilitating reproducible and extensible data analyses. As well, tools for electronic collaboration will assist the integrated access and comprehension of experiments to empower users to collaborate on FCM analyses. This coordinated, joint development of bioinformatics standards and software tools for FCM data analysis has the potential to greatly facilitate both basic and clinical research, impacting a notably diverse range of medical and environmental research areas. PMID:16901228
Sustainability Tools Inventory Initial Gap Analysis
This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...
New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP ® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
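One-group cross sections are useful precisely because the corresponding transport problems have closed-form answers. As an illustration of the kind of analytic benchmark such files enable (not an output of simple_ace.pl; the cross-section values are hypothetical), the snippet below computes the k-infinity of an infinite homogeneous medium, the value a Monte Carlo eigenvalue calculation should reproduce.

```python
# Closed-form benchmark that one-group cross sections make possible:
# k-infinity of an infinite homogeneous medium. Values are hypothetical.
nu      = 2.5     # neutrons emitted per fission
sigma_f = 0.05    # macroscopic fission cross section, 1/cm
sigma_c = 0.03    # macroscopic capture cross section, 1/cm
sigma_a = sigma_f + sigma_c

k_inf = nu * sigma_f / sigma_a
print("analytic k-infinity = %.4f" % k_inf)   # Monte Carlo result should converge to this
```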
Marek, Lukáš; Tuček, Pavel; Pászto, Vít
2015-01-28
Visual analytics aims to connect the processing power of information technologies and the user's ability of logical thinking and reasoning through complex visual interaction. Moreover, most data contain a spatial component, so the need for geovisual tools and methods arises. One can either develop one's own system, although the dissemination of findings and its usability might then be problematic, or utilize a widespread and well-known platform. The aim of this paper is to prove the applicability of Google Earth™ software as a tool for geovisual analytics that helps to understand the spatio-temporal patterns of the disease distribution. We combined complex joint spatio-temporal analysis with comprehensive visualisation. We analysed the spatio-temporal distribution of campylobacteriosis in the Czech Republic between 2008 and 2012. We applied three main approaches in the study: (1) geovisual analytics of the surveillance data, which were visualised in the form of a bubble chart; (2) geovisual analytics of the disease's weekly incidence surfaces computed by spatio-temporal kriging; and (3) spatio-temporal scan statistics, which were employed in order to identify clusters of affected municipalities with high or low rates. The final data are stored in Keyhole Markup Language files and visualised in Google Earth™ in order to apply geovisual analytics. Using geovisual analytics we were able to display and retrieve information from a complex dataset efficiently. Instead of searching for patterns in a series of static maps or using numerical statistics, we created a set of interactive visualisations in order to explore and communicate the results of analyses to a wider audience. The results of the geovisual analytics identified periodical patterns in the behaviour of the disease as well as fourteen spatio-temporal clusters of increased relative risk. We prove that Google Earth™ software is a usable tool for the geovisual analysis of the disease distribution. Google Earth™ has many indisputable advantages (it is widespread, freely available, and intuitive, with space-time visualisation capabilities and animations, and supports the communication of results); nevertheless, it still needs to be combined with pre-processing tools that prepare the data into a form suitable for the geovisual analytics itself.
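The workflow above ends with Keyhole Markup Language (KML) files loaded into Google Earth™. A minimal sketch of that export step is shown below; the municipality names, coordinates, dates, and case counts are hypothetical, and the real study's styling (bubble charts, incidence surfaces) is not reproduced.

```python
# Minimal sketch of exporting time-stamped incidence points to KML for
# viewing in Google Earth; all records below are hypothetical.
records = [
    {"name": "Municipality A", "lon": 17.25, "lat": 49.59,
     "week": "2010-03-01", "incidence": 12},
    {"name": "Municipality B", "lon": 16.61, "lat": 49.20,
     "week": "2010-03-01", "incidence": 4},
]

placemarks = []
for r in records:
    placemarks.append(
        "  <Placemark>\n"
        "    <name>{name} ({incidence} cases)</name>\n"
        "    <TimeStamp><when>{week}</when></TimeStamp>\n"
        "    <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
        "  </Placemark>".format(**r)
    )

kml = ("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
       "<kml xmlns=\"http://www.opengis.net/kml/2.2\">\n"
       "<Document>\n" + "\n".join(placemarks) + "\n</Document>\n</kml>\n")

with open("incidence.kml", "w") as f:
    f.write(kml)
```

The TimeStamp elements are what allow Google Earth's time slider to animate the weekly incidence, which is the space-time visualisation capability highlighted above.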
ERIC Educational Resources Information Center
Pei-Ling Tan, Jennifer; Koh, Elizabeth; Jonathan, Christin; Yang, Simon
2017-01-01
The affordances of learning analytics (LA) tools and solutions are being increasingly harnessed for enhancing 21st century pedagogical and learning strategies and outcomes. However, use cases and empirical understandings of students' experiences with LA tools and environments aimed at fostering 21st century literacies, especially in the K-12…
Visual Information for the Desktop, version 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2006-03-29
VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.
ERIC Educational Resources Information Center
Teplovs, Chris
2015-01-01
This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…
Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J
2011-11-01
This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique involving simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed on stationary phase that has not been previously used. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples like fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from the environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of the Middle Pomerania region in the northern part of Poland can be easily observed on the obtained micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that the micro-TLC-based analytical approach can be applied as an effective method for internal standard (IS) substance searches. Generally, the described methodology can be applied for fast fractionation or screening of a whole range of target substances as well as for chemo-taxonomic studies and fingerprinting of complex mixtures present in biological or environmental samples. Due to the low consumption of eluent (usually 0.3-1 mL/run), mainly composed of water-alcohol binary mixtures, this method can be considered an environmentally friendly, green-chemistry-focused analytical tool, supplementary to analytical protocols involving column chromatography or planar micro-fluidic devices. Copyright © 2011 Elsevier Ltd. All rights reserved.
Search Analytics: Automated Learning, Analysis, and Search with Open Source
NASA Astrophysics Data System (ADS)
Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.
2016-12-01
The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection users accuracy of 82-84%, which could have tangible, direct downstream implications for crop protection. Automatically assimilating this information expedites and supplements human analysis, and, ultimately, Search Analytics and its foundation of open source tools will result in more efficient scientific investment and research.
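A toy version of the relation extraction described above is sketched below: the text is a stand-in for content that tools such as Apache Tika would pull out of PDF or Office documents, and the regular expressions and proximity window are illustrative assumptions rather than the project's actual pipeline.

```python
import re

# Stand-in for text extracted from a publication; in the real workflow this
# would come from a PDF/Office parser such as Apache Tika.
text = ("In this study, hyperspectral images with high spatial resolution (1 m) "
        "were analyzed to detect cutleaf teasel in two areas. Classification of "
        "cutleaf teasel reached a users accuracy of 82 to 84%.")

resolution = re.search(r"spatial resolution \((\d+(?:\.\d+)?)\s*m\)", text)
accuracy = re.search(r"accuracy of (\d+)\s*to\s*(\d+)\s*%", text)

# Treat the two facts as related only if they occur close together in the text.
if resolution and accuracy and abs(resolution.start() - accuracy.start()) < 500:
    print("resolution: %s m -> users accuracy: %s-%s%%"
          % (resolution.group(1), accuracy.group(1), accuracy.group(2)))
```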
Analytical and Experimental Investigation of Process Loads on Incremental Severe Plastic Deformation
NASA Astrophysics Data System (ADS)
Okan Görtan, Mehmet
2017-05-01
From a processing point of view, friction is a major problem in severe plastic deformation (SPD) using the equal channel angular pressing (ECAP) process. Incremental ECAP can be used in order to optimize frictional effects during SPD. A new incremental ECAP process has been proposed recently. This new process, called equal channel angular swaging (ECAS), combines conventional ECAP and the incremental bulk metal forming method of rotary swaging. The ECAS tool system consists of two dies with an angled channel that contains two shear zones. During the ECAS process, two forming tool halves, which are concentrically arranged around the workpiece, perform high-frequency radial movements with short strokes, while samples are pushed through them. The oscillation direction nearly coincides with the shearing direction in the workpiece. The most important advantages in comparison to conventional ECAP are a significant reduction in the forces in the material feeding direction plus the potential to be extended to continuous processing. In the current study, the mechanics of the ECAS process is investigated using a slip line field approach. An analytical model is developed to predict process loads. The proposed model is validated using experiments and FE simulations.
Jeong, Dahye; Kim, Jinsik; Chae, Myung-Sic; Lee, Wonseok; Yang, Seung-Hoon; Kim, YoungSoo; Kim, Seung Min; Lee, Jin San; Lee, Jeong Hoon; Choi, Jungkyu; Yoon, Dae Sung; Hwang, Kyo Seon
2018-05-28
Determination of the conformation (monomer, oligomer, or fibril) of amyloid peptide aggregates in the human brain is essential for the diagnosis and treatment of Alzheimer's disease (AD). Accordingly, systematic investigation of amyloid conformation using analytical tools is essential for precisely quantifying the relative amounts of the three conformations of amyloid peptide. Here, we developed a reduced graphene oxide (rGO) based multiplexing biosensor that could be used to monitor the relative amounts of the three conformations of various amyloid-β 40 (Aβ40) fluids. The electrical rGO biosensor was composed of a multichannel sensor array capable of individual detection of monomers, oligomers, and fibrils in a single amyloid fluid sample. From the performance test of each sensor, we showed that this method had good analytical sensitivity (1 pg/mL) and a fairly wide dynamic range (1 pg/mL to 10 ng/mL) for each conformation of Aβ40. To verify whether the rGO biosensor could be used to evaluate the relative amounts of the three conformations, various amyloid solutions (monomeric Aβ40, aggregated Aβ40, and disaggregated Aβ40 solutions) were employed. Notably, different trends in the relative amounts of the three conformations were observed in each amyloid solution, indicating that this information could serve as an important parameter in the clinical setting. Accordingly, our analytical tool could precisely detect the relative amounts of the three conformations of Aβ40 and may have potential applications as a diagnostic system for AD.
Chapman, Benjamin P.; Weiss, Alexander; Duberstein, Paul
2016-01-01
Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in “big data” problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different than maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without undue complexity leading to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how three common SLT algorithms–Supervised Principal Components, Regularization, and Boosting—can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods, and as primary analytic tools in discovery phase research. We conclude that despite their differences from the classic null-hypothesis testing approach—or perhaps because of them–SLT methods may hold value as a statistically rigorous approach to exploratory regression. PMID:27454257
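A minimal sketch of the EPE-minimizing workflow described above is given below, using cross-validated L1-penalized logistic regression on a synthetic item pool; the data, penalty choice, and binary outcome are illustrative assumptions and do not reproduce the mortality analysis in the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegressionCV

rng = np.random.default_rng(1)

# Synthetic stand-in for a large personality item pool (200 items, 1000 people)
# and a binary outcome; only a few items carry true signal.
X = rng.normal(size=(1000, 200))
logit = 0.8 * X[:, 0] - 0.6 * X[:, 1] + 0.5 * X[:, 2] - 1.0
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# L1 (lasso) regularization path with the penalty strength chosen by
# cross-validation, i.e., by minimizing estimated out-of-sample error (EPE)
# rather than the within-sample likelihood.
model = LogisticRegressionCV(Cs=10, cv=5, penalty="l1", solver="saga",
                             scoring="neg_log_loss", max_iter=5000).fit(X, y)

selected = np.flatnonzero(model.coef_[0])
print("items retained in the criterion-keyed scale:", selected)
```

The items with nonzero coefficients form the criterion-keyed scale; complexity (how many items survive) is governed by the cross-validated penalty, which is the practical embodiment of the bias-variance trade-off discussed above.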
ERIC Educational Resources Information Center
Chen, Bodong
2015-01-01
In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…
Meyer, Adrian; Green, Laura; Faulk, Ciearro; Galla, Stephen; Meyer, Anne-Marie
2016-01-01
Introduction: Large amounts of health data generated by a wide range of health care applications across a variety of systems have the potential to offer valuable insight into populations and health care systems, but robust and secure computing and analytic systems are required to leverage this information. Framework: We discuss our experiences deploying a Secure Data Analysis Platform (SeDAP), and provide a framework to plan, build, and deploy a virtual desktop infrastructure (VDI) that enables innovation and collaboration and operates within academic funding structures. It outlines six core components: Security, Ease of Access, Performance, Cost, Tools, and Training. Conclusion: A platform like SeDAP is not successful simply through technical excellence and performance. Its adoption depends on a collaborative environment in which researchers and users plan and evaluate the requirements of all aspects. PMID:27683665
NASA Astrophysics Data System (ADS)
Jumper, William D.
2012-03-01
Many high school and introductory college physics courses make use of mousetrap car projects and competitions as a way of providing an engaging hands-on learning experience incorporating Newton's laws, conversion of potential to kinetic energy, dissipative forces, and rotational mechanics. Presented here is a simple analytical and finite element spreadsheet model for a typical mousetrap car, as shown in Fig. 1. It is hoped that the model will provide students with a tool for designing or modifying the designs of their cars, provide instructors with a means to ensure students close the loop between physical principles and an understanding of their car's speed and distance performance, and stimulate in students at an early stage an appreciation for the merits of computer modeling as an aid in understanding and tackling otherwise analytically intractable problems so common in today's professional world.
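A much simpler back-of-the-envelope calculation than the article's spreadsheet model (all input values below are assumptions, not figures from the paper) can still illustrate the core energy argument: the stored spring energy that reaches the wheels is spent overcoming rolling resistance.

```python
# Illustrative estimate only: distance a mousetrap car travels if the usable
# spring energy is dissipated by rolling resistance. Every number is assumed.
g = 9.81              # m/s^2
spring_energy = 1.0   # J, stored energy of the mousetrap spring (assumed)
mass = 0.15           # kg, car mass (assumed)
mu_rolling = 0.01     # rolling-resistance coefficient (assumed)
efficiency = 0.5      # fraction of spring energy reaching the wheels (assumed)

# Energy balance: efficiency * E_spring = mu * m * g * d  =>  solve for d
distance = efficiency * spring_energy / (mu_rolling * mass * g)
print(f"Estimated travel distance: {distance:.1f} m")
```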
Yan, Shuai; Cui, Sishan; Ke, Kun; Zhao, Bixing; Liu, Xiaolong; Yue, Shuhua; Wang, Ping
2018-06-05
Lipid metabolism is dysregulated in human cancers. Analytical tools that can identify and quantitatively map metabolites in unprocessed human tissues with submicrometer resolution are highly desired. Here, we implemented analytical hyperspectral stimulated Raman scattering microscopy to map the lipid metabolites in situ in normal and cancerous liver tissues from 24 patients. In contrast to the conventional wisdom that unsaturated lipid accumulation enhances tumor cell survival and proliferation, we unexpectedly visualized a substantial amount of saturated fat accumulated in cancerous liver tissues, which was not seen in the majority of their adjacent normal tissues. Further analysis by mass spectrometry confirmed significantly elevated levels of glyceryl tripalmitate specifically in cancerous liver. These findings suggest that the aberrantly accumulated saturated fat may have great potential to be a metabolic biomarker for liver cancer.
Microgenetic Learning Analytics Methods: Workshop Report
ERIC Educational Resources Information Center
Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin
2016-01-01
Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…
IBM’s Health Analytics and Clinical Decision Support
Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.
2014-01-01
Summary: Objectives: This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods: Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results: There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion: Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation. PMID:25123736
Rooftop solar photovoltaic potential in cities: how scalable are assessment approaches?
NASA Astrophysics Data System (ADS)
Castellanos, Sergio; Sunter, Deborah A.; Kammen, Daniel M.
2017-12-01
Distributed photovoltaics (PV) have played a critical role in the deployment of solar energy, currently making up roughly half of the global PV installed capacity. However, there remains significant unused economically beneficial potential. Estimates of the total technical potential for rooftop PV systems in the United States calculate a generation comparable to approximately 40% of the 2016 total national electric-sector sales. To best take advantage of the rooftop PV potential, effective analytic tools that support deployment strategies and aggressive local, state, and national policies to reduce the soft cost of solar energy are vital. A key step is the low-cost automation of data analysis and business case presentation for structure-integrated solar energy. In this paper, the scalability and resolution of various methods to assess the urban rooftop PV potential are compared, concluding with suggestions for future work in bridging methodologies to better assist policy makers.
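As a hedged aside, a first-order rooftop PV yield estimate clarifies the kind of arithmetic that scalable assessment methods automate at city scale; the sketch below is not the paper's methodology, and every input value is an assumption chosen for demonstration.

```python
# Illustrative first-order rooftop PV generation estimate (all values assumed).
roof_area_m2 = 80.0           # usable, unshaded roof area (assumed)
insolation_kwh_m2_yr = 1700   # annual plane-of-array insolation (assumed)
module_efficiency = 0.20      # module conversion efficiency (assumed)
performance_ratio = 0.80      # inverter, soiling, wiring losses (assumed)

annual_kwh = roof_area_m2 * insolation_kwh_m2_yr * module_efficiency * performance_ratio
print(f"Estimated annual generation: {annual_kwh:,.0f} kWh")
```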
Arroyo, Adrian; Matsuzawa, Tetsuro; de la Torre, Ignacio
2015-01-01
Stone tool use by wild chimpanzees of West Africa offers a unique opportunity to explore the evolutionary roots of technology during human evolution. However, detailed analyses of chimpanzee stone artifacts are still lacking, thus precluding a comparison with the earliest archaeological record. This paper presents the first systematic study of stone tools used by wild chimpanzees to crack open nuts in Bossou (Guinea-Conakry), and applies pioneering analytical techniques to such artifacts. Automatic morphometric GIS classification enabled the creation of maps of use wear over the stone tools (anvils, hammers, and hammers/anvils), which were blind tested with GIS spatial analysis of damage patterns identified visually. Our analysis shows that chimpanzee stone tool use wear can be systematized and specific damage patterns discerned, allowing discrimination between active and passive pounders in lithic assemblages. In summary, our results demonstrate the heuristic potential of combined suites of GIS techniques for the analysis of battered artifacts, and have enabled the creation of a referential framework of analysis in which wild chimpanzee battered tools can for the first time be directly compared to the early archaeological record. PMID:25793642
Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit
2016-03-01
Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human diseases database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.
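For readers unfamiliar with gene set enrichment in general, the sketch below shows the generic over-representation idea (a hypergeometric test); it is emphatically not GeneAnalytics' proprietary scoring algorithm, and all counts are hypothetical.

```python
# Generic gene-set over-representation test (hypergeometric); all numbers assumed.
from scipy.stats import hypergeom

genome_size = 20000     # background genes (assumed)
pathway_size = 150      # genes annotated to the pathway (assumed)
query_size = 300        # differentially expressed genes submitted (assumed)
overlap = 12            # query genes that fall in the pathway (assumed)

# P(overlap >= 12) under random sampling without replacement
p_value = hypergeom.sf(overlap - 1, genome_size, pathway_size, query_size)
print(f"Enrichment p-value: {p_value:.3g}")
```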
Olawoyin, Richard
2016-10-01
The backpropagation (BP) artificial neural network (ANN) is a renowned and extensively used mathematical tool for time-series predictions and approximations, and it can also model non-linear functions. ANNs are vital tools in the prediction of toxicant levels, such as polycyclic aromatic hydrocarbons (PAH) potentially derived from anthropogenic activities in the microenvironment. In the present work, a BP ANN was used as a prediction tool to study the potential toxicity of PAH carcinogens (PAHcarc) in soils. Soil samples (16 × 4 = 64) were collected from locations in South-southern Nigeria. The concentration of PAHcarc in laboratory-cultivated white melilot (Melilotus alba) roots grown on treated soils was predicted using ANN model training. Results indicated the Levenberg-Marquardt back-propagation training algorithm converged in 2.5E+04 epochs at an average RMSE value of 1.06E-06. The averaged R2 comparison between the measured and predicted outputs was 0.9994. It may be deduced from this study that analytical processes involving environmental risk assessment, as used here, can successfully provide prompt prediction and source identification of major soil toxicants. Copyright © 2016 Elsevier Ltd. All rights reserved.
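A minimal regression-network sketch, standing in for the study's BP ANN, is shown below. Note that scikit-learn's MLPRegressor does not offer the Levenberg-Marquardt solver used in the paper, so a generic solver is used here, and the soil features and targets are synthetic placeholders.

```python
# Hedged sketch of ANN regression for toxicant prediction (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(1)
X = rng.uniform(size=(64, 5))                                   # 64 soil samples, 5 assumed features
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.05, size=64)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X_tr, y_tr)
pred = net.predict(X_te)
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5, " R2:", r2_score(y_te, pred))
```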
Anandakrishnan, Ramu; Scogland, Tom R W; Fenley, Andrew T; Gordon, John C; Feng, Wu-chun; Onufriev, Alexey V
2010-06-01
Tools that compute and visualize biomolecular electrostatic surface potential have been used extensively for studying biomolecular function. However, determining the surface potential for large biomolecules on a typical desktop computer can take days or longer using currently available tools and methods. Two commonly used techniques to speed-up these types of electrostatic computations are approximations based on multi-scale coarse-graining and parallelization across multiple processors. This paper demonstrates that for the computation of electrostatic surface potential, these two techniques can be combined to deliver significantly greater speed-up than either one separately, something that is in general not always possible. Specifically, the electrostatic potential computation, using an analytical linearized Poisson-Boltzmann (ALPB) method, is approximated using the hierarchical charge partitioning (HCP) multi-scale method, and parallelized on an ATI Radeon 4870 graphical processing unit (GPU). The implementation delivers a combined 934-fold speed-up for a 476,040 atom viral capsid, compared to an equivalent non-parallel implementation on an Intel E6550 CPU without the approximation. This speed-up is significantly greater than the 42-fold speed-up for the HCP approximation alone or the 182-fold speed-up for the GPU alone. Copyright (c) 2010 Elsevier Inc. All rights reserved.
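To make the underlying computation concrete, the sketch below evaluates a screened-Coulomb (Debye-Hückel-style) potential at surface sample points by direct pairwise summation. It is a deliberate simplification for illustration, not the ALPB method, the HCP approximation, or the GPU implementation described above, and all values are arbitrary.

```python
# Greatly simplified surface-potential illustration (direct O(N*M) summation).
import numpy as np

rng = np.random.default_rng(2)
charges = rng.choice([-1.0, 1.0], size=1000)     # atomic partial charges (assumed)
atom_xyz = rng.normal(size=(1000, 3))            # atom coordinates (assumed)
surf_xyz = rng.normal(size=(200, 3)) * 1.5       # surface sample points (assumed)
kappa = 0.1                                      # inverse screening length (assumed)

# phi(r) = sum_i q_i * exp(-kappa * d_i) / d_i for each surface point
d = np.linalg.norm(surf_xyz[:, None, :] - atom_xyz[None, :, :], axis=-1)
phi = (charges[None, :] * np.exp(-kappa * d) / d).sum(axis=1)
print("Potential at first 3 surface points:", phi[:3])
```

Multi-scale approximations and GPU parallelization, as the abstract notes, attack exactly this pairwise cost from two different directions, which is why their combined speed-up (934-fold) exceeds either alone (42-fold and 182-fold) without being simply multiplicative.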
Prabhu, Gurpur Rakesh D; Witek, Henryk A; Urban, Pawel L
2018-05-31
Most analytical methods are based on "analogue" inputs from sensors of light, electric potentials, or currents. The signals obtained by such sensors are processed using certain calibration functions to determine concentrations of the target analytes. The signal readouts are normally done after an optimised and fixed time period, during which an assay mixture is incubated. This minireview covers another, somewhat unusual, analytical strategy, which relies on measuring the time interval between the occurrences of two distinguishable states in the assay reaction. These states manifest themselves via abrupt changes in the properties of the assay mixture (e.g. change of colour, appearance or disappearance of luminescence, change in pH, variations in optical activity or mechanical properties). In some cases, a correlation between the time of appearance/disappearance of a given property and the analyte concentration can also be observed. An example of an assay based on time measurement is an oscillating reaction, in which the period of oscillations is linked to the concentration of the target analyte. A number of chemo-chronometric assays, relying on existing (bio)transformations or artificially designed reactions, have been disclosed in the past few years. They are very attractive from the fundamental point of view but, so far, only a few of them have been validated and used to address real-world problems. Can chemo-chronometric assays, then, become a practical tool for chemical analysis? Is there a need for further development of such assays? We aim to answer these questions.
Data Intensive Computing on Amazon Web Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Magana-Zook, S. A.
The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as “big data” tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and many more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster (“Cluster A”) was set up as a collaboration between GMP and Livermore Computing (LC).
Kristensen, Anne F; Kristensen, Søren R; Falkmer, Ursula; Münster, Anna-Marie B; Pedersen, Shona
2018-05-01
The Calibrated Automated Thrombography (CAT) is an in vitro thrombin generation (TG) assay that holds promise as a valuable tool within clinical diagnostics. However, the technique has considerable analytical variation, and we therefore systematically investigated the analytical and between-subject variation of CAT. Moreover, we assessed the application of an internal standard for normalization to diminish variation. Twenty healthy volunteers each donated one blood sample, which was subsequently centrifuged, aliquoted and stored at -80 °C prior to analysis. The analytical variation was determined on eight runs, where plasma from the same seven volunteers was processed in triplicate, and for the between-subject variation, TG analysis was performed on plasma from all 20 volunteers. The trigger reagents used for the TG assays included both PPP reagent containing 5 pM tissue factor (TF) and PPPlow with 1 pM TF. Plasma drawn from a single donor was applied to all plates as an internal standard for each TG analysis, which was subsequently used for normalization. The total analytical variation for TG analysis performed with PPPlow reagent is 3-14% and 9-13% for PPP reagent. This variation can be reduced slightly by using an internal standard, but mainly for ETP (endogenous thrombin potential). The between-subject variation is higher when using PPPlow than PPP, and this variation is considerably higher than the analytical variation. TG has a rather high inherent analytical variation, but it is considerably lower than the between-subject variation when using PPPlow as reagent.
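The variation bookkeeping described above amounts to computing coefficients of variation across replicate runs and rescaling each plate by its internal-standard result. The sketch below illustrates that arithmetic with invented numbers; it is not the study's data or software.

```python
# Hedged sketch: CV across replicates and internal-standard normalisation.
import numpy as np

# ETP for one volunteer, measured in triplicate on three runs (invented values)
etp_runs = np.array([[1450., 1490., 1470.],
                     [1380., 1420., 1400.],
                     [1510., 1530., 1495.]])
internal_standard = np.array([1500., 1440., 1560.])   # per-run IS plasma ETP (invented)

cv_raw = etp_runs.std(ddof=1) / etp_runs.mean() * 100
normalised = etp_runs / internal_standard[:, None] * internal_standard.mean()
cv_norm = normalised.std(ddof=1) / normalised.mean() * 100
print(f"CV before normalisation: {cv_raw:.1f}%  after: {cv_norm:.1f}%")
```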
Himes, Sarah K; Scheidweiler, Karl B; Tassiopoulos, Katherine; Kacanek, Deborah; Hazra, Rohan; Rich, Kenneth; Huestis, Marilyn A
2013-02-05
A novel method for the simultaneous quantification of 16 antiretroviral (ARV) drugs and 4 metabolites in meconium was developed and validated. Quantification of 6 nucleoside/nucleotide reverse transcriptase inhibitors, 2 non-nucleoside reverse transcriptase inhibitors, 7 protease inhibitors, and 1 integrase inhibitor was achieved in 0.25 g of meconium. Specimen preparation included methanol homogenization and solid-phase extraction. Separate positive and negative polarity multiple reaction monitoring mode injections were required to achieve sufficient sensitivity. Linearity ranged from 10 to 75 ng/g up to 2500 ng/g for most analytes and 100-500 ng/g up to 25,000 ng/g for some; all correlation coefficients were ≥0.99. Extraction efficiencies from meconium were 32.8-119.5% with analytical recovery of 80.3-108.3% and total imprecision of 2.2-11.0% for all quantitative analytes. Two analytes with analytical recovery (70.0-138.5%) falling outside the 80-120% criteria range were considered semiquantitative. Matrix effects ranged from -98.3% to 47.0% for analytes and from -98.0% to 67.2% for internal standards. Analytes were stable (>75%) at room temperature for 24 h, 4 °C for 3 days, -20 °C for 3 freeze-thaw cycles over 3 days, and on the autosampler. Method applicability was demonstrated by analyzing meconium from HIV-uninfected infants born to HIV-positive mothers on ARV therapy. This method can be used as a tool to investigate the potential effects of in utero ARV exposure on childhood health and neurodevelopmental outcomes.
Graphical Descriptives: A Way to Improve Data Transparency and Methodological Rigor in Psychology.
Tay, Louis; Parrigon, Scott; Huang, Qiming; LeBreton, James M
2016-09-01
Several calls have recently been issued to the social sciences for enhanced transparency of research processes and enhanced rigor in the methodological treatment of data and data analytics. We propose the use of graphical descriptives (GDs) as one mechanism for responding to both of these calls. GDs provide a way to visually examine data. They serve as quick and efficient tools for checking data distributions, variable relations, and the potential appropriateness of different statistical analyses (e.g., do data meet the minimum assumptions for a particular analytic method). Consequently, we believe that GDs can promote increased transparency in the journal review process, encourage best practices for data analysis, and promote a more inductive approach to understanding psychological data. We illustrate the value of potentially including GDs as a step in the peer-review process and provide a user-friendly online resource (www.graphicaldescriptives.org) for researchers interested in including data visualizations in their research. We conclude with suggestions on how GDs can be expanded and developed to enhance transparency. © The Author(s) 2016.
NASA Astrophysics Data System (ADS)
De Laurentis, Mariafelicia; De Martino, Ivan; Lazkoz, Ruth
2018-05-01
Alternative theories of gravity may serve to overcome several shortcomings of the standard cosmological model but, in their weak field limit, general relativity must be recovered so as to match the tight constraints at the Solar System scale. Therefore, testing such alternative models at scales of stellar systems could give a unique opportunity to confirm or rule them out. One of the most straightforward modifications is represented by analytical f(R)-gravity models that introduce a Yukawa-like modification to the Newtonian potential, thus modifying the dynamics of particles. Using the geodesics equations, we have illustrated the amplitude of these modifications. First, we have integrated numerically the equations of motion showing the orbital precession of a particle around a massive object. Second, we have computed an analytic expression for the periastron advance of systems having their semimajor axis much shorter than the Yukawa-scale length. Finally, we have extended our results to the case of a binary system composed of two massive objects. Our analysis provides a powerful tool to obtain constraints on the underlying theory of gravity using current and forthcoming data sets.
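For reference, a commonly quoted Yukawa-corrected form of the Newtonian potential in analytic f(R) models (the notation here is an assumption and may differ from the paper's conventions) is

\Phi(r) = -\frac{GM}{(1+\delta)\,r}\left(1 + \delta\, e^{-r/\lambda}\right),

where \delta sets the strength of the correction and \lambda is the Yukawa scale length; for r \ll \lambda the Newtonian potential is recovered, while for r \gg \lambda the effective gravitational coupling is rescaled by 1/(1+\delta), which is what perturbs orbital dynamics and the periastron advance.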
Haze Gray Paint and the U.S. Navy: A Procurement Process Review
2017-12-01
The research encompasses both qualitative and quantitative analytical tools utilizing historical demand data for Silicone Alkyd … inventory level of 1K Polysiloxane in support of the fleet. As discussed in the Summary section, this research used a qualitative and a quantitative approach to analyze the Polysiloxane …
Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools
2014-01-14
Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools. Final Technical Report SERC-2014-TR-041-1, January 14, 2014. … by the U.S. Department of Defense through the Systems Engineering Research Center (SERC) under Contract H98230-08-D-0171 (Task Order 0026, RT 51). SERC is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology.
Visualization and Analytics Software Tools for Peregrine System
Learn about the visualization and analytics software tools available for the Peregrine system. R is a language and environment for statistical computing and graphics; go to the R web site for more information. Remote visualization is available for OpenGL-based applications; for more information, please go to the FastX page. ParaView: an open …
2006-07-27
The goal of this project was to develop analytical and computational tools to make vision a viable sensor for … sensors. We have proposed the framework of stereoscopic segmentation, in which multiple images of the same objects are jointly processed to extract geometry. (vision.ucla.edu)
Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel
2011-03-28
Study Team Working Paper 3: Research Methods Discussion for the Study Team. Generating Empirical Materials: the discussion draws on grounded theory and on research conducted using these methods. Cited works include a survey and case study published by FFI (Kjeller, Norway) and Glaser, B. G. & Strauss, A. L. (1967), The Discovery of Grounded Theory.
Demonstrating Success: Web Analytics and Continuous Improvement
ERIC Educational Resources Information Center
Loftus, Wayne
2012-01-01
As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…
Understanding Education Involving Geovisual Analytics
ERIC Educational Resources Information Center
Stenliden, Linnea
2013-01-01
Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barber, F.H.; Borek, T.T.; Christopher, J.Z.
1997-12-01
Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than had development at the ACMUs. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).
Distributed Generation Interconnection Collaborative | NREL
… reduce paperwork, and improve customer service. Analytical Methods for Interconnection: Many utilities and jurisdictions are seeking the right screening and analytical methods and tools to meet their reliability …
Atkinson, Jo-An; Page, Andrew; Wells, Robert; Milat, Andrew; Wilson, Andrew
2015-03-03
In the design of public health policy, a broader understanding of risk factors for disease across the life course, and an increasing awareness of the social determinants of health, has led to the development of more comprehensive, cross-sectoral strategies to tackle complex problems. However, comprehensive strategies may not represent the most efficient or effective approach to reducing disease burden at the population level. Rather, they may act to spread finite resources less intensively over a greater number of programs and initiatives, diluting the potential impact of the investment. While analytic tools are available that use research evidence to help identify and prioritise disease risk factors for public health action, they are inadequate to support more targeted and effective policy responses for complex public health problems. This paper discusses the limitations of analytic tools that are commonly used to support evidence-informed policy decisions for complex problems. It proposes an alternative policy analysis tool which can integrate diverse evidence sources and provide a platform for virtual testing of policy alternatives in order to design solutions that are efficient, effective, and equitable. The case of suicide prevention in Australia is presented to demonstrate the limitations of current tools to adequately inform prevention policy and discusses the utility of the new policy analysis tool. In contrast to popular belief, a systems approach takes a step beyond comprehensive thinking and seeks to identify where best to target public health action and resources for optimal impact. It is concerned primarily with what can be reasonably left out of strategies for prevention and can be used to explore where disinvestment may occur without adversely affecting population health (or equity). Simulation modelling used for policy analysis offers promise in being able to better operationalise research evidence to support decision making for complex problems, improve targeting of public health policy, and offers a foundation for strengthening relationships between policy makers, stakeholders, and researchers.
On the Application of Euler Deconvolution to the Analytic Signal
NASA Astrophysics Data System (ADS)
Fedi, M.; Florio, G.; Pasteka, R.
2005-05-01
In recent years, papers on Euler deconvolution (ED) have used formulations that account for the unknown background field, allowing the structural index (N) to be treated as an unknown to be solved for together with the source coordinates. Among them, Hsu (2002) and Fedi and Florio (2002) independently pointed out that using an adequate m-order derivative of the field, instead of the field itself, allows solving for both N and the source position. For the same reason, Keating and Pilkington (2004) proposed the ED of the analytic signal. A function being analyzed by ED must be homogeneous but also harmonic, because it must be possible to compute its vertical derivative, as is well known from potential field theory. Huang et al. (1995) demonstrated that the analytic signal is a homogeneous function, but, for instance, it is rather obvious that the magnetic field modulus (corresponding to the analytic signal of a gravity field) is not a harmonic function (e.g., Grant & West, 1965). Thus, a straightforward application of ED to the analytic signal is not possible, because the vertical derivative of this function cannot be computed correctly using standard potential field analysis tools. In this note we theoretically and empirically check what kinds of errors are caused in ED by this incorrect assumption about the harmonicity of the analytic signal. We discuss results on profile and map synthetic data, and use a simple method to compute the vertical derivative of non-harmonic functions measured on a horizontal plane. Our main conclusions are: 1. To approximate a correct evaluation of the vertical derivative of a non-harmonic function, it is useful to compute it by finite differences, using upward continuation. 2. We found that the errors in the vertical derivative computed as if the analytic signal were harmonic reflect mainly on the structural index estimate; these errors can mislead an interpretation even though the depth estimates are almost correct. 3. Consistent estimates of depth and S.I. are instead obtained by using a finite-difference vertical derivative of the analytic signal. 4. Analysis of a case history confirms the strong error in the estimation of the structural index if the analytic signal is treated as a harmonic function.
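For context, the homogeneity relation underlying Euler deconvolution (written here in an assumed standard notation, not quoted from the abstract) is

(x - x_0)\,\frac{\partial T}{\partial x} + (y - y_0)\,\frac{\partial T}{\partial y} + (z - z_0)\,\frac{\partial T}{\partial z} = N\,(B - T),

where (x_0, y_0, z_0) is the source position, T the observed field, B the regional background, and N the structural index. Applying this relation to a field derivative or to the analytic signal presumes that the chosen function is homogeneous and that its vertical derivative can be computed, which is precisely the issue examined above.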
Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis
2017-03-01
A new analytical method was developed for the quantification of olive oil and palm oil in blends with other edible vegetable oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean), using normal-phase liquid chromatography and chemometric tools. The procedure for obtaining the chromatographic fingerprint of the methyl-transesterified fraction of each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated by several parameters, such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). Notably, the chromatographic analysis of the proposed method takes only eight minutes, and the results showed the potential of this method, allowing quantification of mixtures of olive oil and palm oil with other vegetable oils. Copyright © 2016 Elsevier B.V. All rights reserved.
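The multivariate calibration step can be illustrated with a hedged PLS regression sketch on synthetic fingerprints; the data below are placeholders and do not reproduce the paper's chromatograms or validation figures.

```python
# Hedged sketch: PLS calibration of % olive oil from chromatographic fingerprints.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
X = rng.normal(size=(40, 300))                   # 40 blends x 300 chromatogram points (synthetic)
y = rng.uniform(0, 100, size=40)                 # % olive oil in each blend (synthetic)
X += np.outer(y, rng.normal(size=300)) * 0.01    # embed a composition-related signal

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
rmsev = mean_squared_error(y, y_cv) ** 0.5       # cross-validated RMSE, analogous to RMSEV
print(f"RMSEV: {rmsev:.2f} % olive oil")
```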
State-of-the-art of bone marrow analysis in forensic toxicology: a review.
Cartiser, Nathalie; Bévalot, Fabien; Fanton, Laurent; Gaillard, Yvan; Guitton, Jérôme
2011-03-01
Although blood is the reference medium in the field of forensic toxicology, alternative matrices are required in case of limited, unavailable or unusable blood samples. The present review investigated the suitability of bone marrow (BM) as an alternative matrix to characterize xenobiotic consumption and its influence on the occurrence of death. Basic data on BM physiology are reported in order to highlight the specificities of this matrix and their analytical and toxicokinetic consequences. A review of case reports, animal and human studies involving BM sample analysis focuses on the various parameters of interpretation of toxicological results: analytic limits, sampling location, pharmacokinetics, blood/BM concentration correlation, stability and postmortem redistribution. Tables summarizing the analytical conditions and quantification of 45 compounds from BM samples provide a useful tool for toxicologists. A specific section devoted to ethanol shows that, despite successful quantification, interpretation is highly dependent on postmortem interval. In conclusion, BM is an interesting alternative matrix, and further experimental data and validated assays are required to confirm its great potential relevance in forensic toxicology.
Phosphorescent nanosensors for in vivo tracking of histamine levels.
Cash, Kevin J; Clark, Heather A
2013-07-02
Continuously tracking bioanalytes in vivo will enable clinicians and researchers to profile normal physiology and monitor diseased states. Current in vivo monitoring system designs are limited by invasive implantation procedures and biofouling, limiting the utility of these tools for obtaining physiologic data. In this work, we demonstrate the first success in optically tracking histamine levels in vivo using a modular, injectable sensing platform based on diamine oxidase and a phosphorescent oxygen nanosensor. Our new approach increases the range of measurable analytes by combining an enzymatic recognition element with a reversible nanosensor capable of measuring the effects of enzymatic activity. We use these enzyme nanosensors (EnzNS) to monitor the in vivo histamine dynamics as the concentration rapidly increases and decreases due to administration and clearance. The EnzNS system measured kinetics that match those reported from ex vivo measurements. This work establishes a modular approach to in vivo nanosensor design for measuring a broad range of potential target analytes. Simply replacing the recognition enzyme, or both the enzyme and nanosensor, can produce a new sensor system capable of measuring a wide range of specific analytical targets in vivo.
Tobiszewski, Marek; Orłowski, Aleksander
2015-03-27
The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives for the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction- and liquid-phase microextraction-based procedures high, while liquid-liquid extraction-, solid-phase extraction- and stir-bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. Conversely, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
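To make the outranking idea concrete, the sketch below computes PROMETHEE II net flows with a simple "usual" preference function; the criteria scores and weights are invented and do not reproduce the study's data or its questionnaire-derived weights.

```python
# Hedged PROMETHEE II sketch: net outranking flows for candidate procedures.
import numpy as np

# rows = candidate analytical procedures, cols = criteria (higher = better, assumed)
scores = np.array([[0.9, 0.4, 0.8],
                   [0.6, 0.9, 0.5],
                   [0.3, 0.7, 0.9]])
weights = np.array([0.5, 0.3, 0.2])       # assumed criterion weights

n = scores.shape[0]
pi = np.zeros((n, n))                     # aggregated preference of a over b
for a in range(n):
    for b in range(n):
        if a != b:
            pref = (scores[a] > scores[b]).astype(float)   # "usual" preference function
            pi[a, b] = np.dot(weights, pref)

phi_plus = pi.sum(axis=1) / (n - 1)       # leaving (positive) flow
phi_minus = pi.sum(axis=0) / (n - 1)      # entering (negative) flow
net_flow = phi_plus - phi_minus
print("Ranking (best first):", np.argsort(-net_flow))
```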
Marchand, Jérémy; Martineau, Estelle; Guitton, Yann; Dervilly-Pinel, Gaud; Giraudeau, Patrick
2017-02-01
Multi-dimensional NMR is an appealing approach for dealing with the challenging complexity of biological samples in metabolomics. This article describes how spectroscopists have recently challenged their imagination in order to make 2D NMR a powerful tool for quantitative metabolomics, based on innovative pulse sequences combined with meticulous analytical chemistry approaches. Clever time-saving strategies have also been explored to make 2D NMR a high-throughput tool for metabolomics, relying on alternative data acquisition schemes such as ultrafast NMR. Currently, much work is aimed at drastically boosting the NMR sensitivity thanks to hyperpolarisation techniques, which have been used in combination with fast acquisition methods and could greatly expand the application potential of NMR metabolomics. Copyright © 2016 Elsevier Ltd. All rights reserved.
Grady, Julia
2010-01-01
Teaching by inquiry is touted for its potential to encourage students to reason scientifically. Yet, even when inquiry teaching is practiced, complexity of students' reasoning may be limited or unbalanced. We describe an analytic tool for recognizing when students are engaged in complex reasoning during inquiry teaching. Using classrooms that represented “best case scenarios” for inquiry teaching, we adapted and applied a matrix to categorize the complexity of students' reasoning. Our results revealed points when students' reasoning was quite complex and occasions when their reasoning was limited by the curriculum, instructional choices, or students' unprompted prescription. We propose that teachers use the matrix as a springboard for reflection and discussion that takes a sustained, critical view of inquiry teaching practice. PMID:21113314
A Noise and Emissions Assessment of the N3-X Transport
NASA Technical Reports Server (NTRS)
Berton, Jeffrey J.; Haller, William J.
2014-01-01
Analytical predictions of certification noise and exhaust emissions for NASA's N3-X - a notional, hybrid wingbody airplane - are presented in this paper. The N3-X is a 300-passenger concept transport propelled by an array of fans distributed spanwise near the trailing edge of the wingbody. These fans are driven by electric motors deriving power from twin generators driven by turboshaft engines. Turboelectric distributed hybrid propulsion has the potential to dramatically increase the propulsive efficiency of aircraft. The noise and exhaust emission estimates presented here are generated using NASA's conceptual design systems analysis tools with several key modifications to accommodate this unconventional architecture. These tools predict certification noise and the emissions of oxides of nitrogen by leveraging data generated from a recent analysis of the N3-X propulsion system.
Space Shuttle/TDRSS communication and tracking systems analysis
NASA Astrophysics Data System (ADS)
Lindsey, W. C.; Chie, C. M.; Cideciyan, R.; Dessouky, K.; Su, Y. T.; Tsang, C. S.
1986-04-01
In order to evaluate the technical and operational problem areas and provide a recommendation, the enhancements to the Tracking and Data Relay Satellite System (TDRSS) and Shuttle must be evaluated through simulation and analysis. These enhancement techniques must first be characterized, then modeled mathematically, and finally updated into LinCsim (analytical simulation package). The LinCsim package can then be used as an evaluation tool. Three areas of potential enhancements were identified: shuttle payload accommodations, TDRSS SSA and KSA services, and shuttle tracking system and navigation sensors. Recommendations for each area were discussed.
Space Shuttle/TDRSS communication and tracking systems analysis
NASA Technical Reports Server (NTRS)
Lindsey, W. C.; Chie, C. M.; Cideciyan, R.; Dessouky, K.; Su, Y. T.; Tsang, C. S.
1986-01-01
In order to evaluate the technical and operational problem areas and provide a recommendation, the enhancements to the Tracking and Data Relay Satellite System (TDRSS) and Shuttle must be evaluated through simulation and analysis. These enhancement techniques must first be characterized, then modeled mathematically, and finally updated into LinCsim (analytical simulation package). The LinCsim package can then be used as an evaluation tool. Three areas of potential enhancements were identified: shuttle payload accommodations, TDRSS SSA and KSA services, and shuttle tracking system and navigation sensors. Recommendations for each area were discussed.
Optofluidic two-dimensional grating volume refractive index sensor.
Sarkar, Anirban; Shivakiran Bhaktha, B N; Khastgir, Sugata Pratik
2016-09-10
We present an optofluidic reservoir with a two-dimensional grating for a lab-on-a-chip volume refractive index sensor. The observed diffraction pattern from the device resembles the analytically obtained fringe pattern. The change in the diffraction pattern has been monitored in the far-field for fluids with different refractive indices. Reliable measurements of refractive index variations, with an accuracy of 6×10⁻³ refractive index units, for different fluids establishes the optofluidic device as a potential on-chip tool for monitoring dynamic refractive index changes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Emma M.; Hendrix, Val; Chertkov, Michael
This white paper introduces the application of advanced data analytics to the modernized grid. In particular, we consider the field of machine learning and where it is both useful, and not useful, for the particular field of the distribution grid and buildings interface. While analytics, in general, is a growing field of interest, and often seen as the golden goose in the burgeoning distribution grid industry, its application is often limited by communications infrastructure, or lack of a focused technical application. Overall, the linkage of analytics to purposeful application in the grid space has been limited. In this paper we consider the field of machine learning as a subset of analytical techniques, and discuss its ability and limitations to enable the future distribution grid and the building-to-grid interface. To that end, we also consider the potential for mixing distributed and centralized analytics and the pros and cons of these approaches. Machine learning is a subfield of computer science that studies and constructs algorithms that can learn from data and make predictions and improve forecasts. Incorporation of machine learning in grid monitoring and analysis tools may have the potential to solve data and operational challenges that result from increasing penetration of distributed and behind-the-meter energy resources. There is an exponentially expanding volume of measured data being generated on the distribution grid, which, with appropriate application of analytics, may be transformed into intelligible, actionable information that can be provided to the right actors – such as grid and building operators, at the appropriate time to enhance grid or building resilience, efficiency, and operations against various metrics or goals – such as total carbon reduction or other economic benefit to customers. While some basic analysis into these data streams can provide a wealth of information, computational and human boundaries on performing the analysis are becoming significant, with more data and multi-objective concerns. Efficient applications of analysis and the machine learning field are being considered in the loop.
NASA Astrophysics Data System (ADS)
Tan, Samantha H.; Chen, Ning; Liu, Shi; Wang, Kefei
2003-09-01
As part of the semiconductor industry "contamination-free manufacturing" effort, significant emphasis has been placed on reducing potential sources of contamination from process equipment and process equipment components. Process tools contain process chambers and components that are exposed to the process environment or process chemistry and in some cases are in direct contact with production wafers. Any contamination from these sources must be controlled or eliminated in order to maintain high process yields, device performance, and device reliability. This paper discusses new nondestructive analytical methods for quantitative measurement of the cleanliness of metal, quartz, polysilicon and ceramic components that are used in process equipment tools. The goal of these new procedures is to measure the effectiveness of cleaning procedures and to verify whether a tool component part is sufficiently clean for installation and subsequent routine use in the manufacturing line. These procedures provide a reliable "qualification method" for tool component certification and also provide a routine quality control method for reliable operation of cleaning facilities. Cost advantages to wafer manufacturing include higher yields due to improved process cleanliness and elimination of yield loss and downtime resulting from the installation of "bad" components in process tools. We also discuss a representative example of wafer contamination having been linked to a specific process tool component.
2014-10-20
A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery (AFRL-RH-WP-TR-2014-0131). If the attribute Strain contains three possibilities (AKR, B6, and BALB_B) and MUP Protein contains two (Intact and Denatured), then a plot of the Strain … can be viewed.
Modeling of the Global Water Cycle - Analytical Models
Yongqiang Liu; Roni Avissar
2005-01-01
Both numerical and analytical models of coupled atmosphere and its underlying ground components (land, ocean, ice) are useful tools for modeling the global and regional water cycle. Unlike complex three-dimensional climate models, which need very large computing resources and involve a large number of complicated interactions often difficult to interpret, analytical...
NASA Astrophysics Data System (ADS)
Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan
2016-02-01
Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.
Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan
2016-01-01
Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students’ visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students’ successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules. PMID:26877625
Plastid: nucleotide-resolution analysis of next-generation sequencing and genomics data.
Dunn, Joshua G; Weissman, Jonathan S
2016-11-22
Next-generation sequencing (NGS) informs many biological questions with unprecedented depth and nucleotide resolution. These assays have created a need for analytical tools that enable users to manipulate data nucleotide-by-nucleotide robustly and easily. Furthermore, because many NGS assays encode information jointly within multiple properties of read alignments - for example, in ribosome profiling, the locations of ribosomes are jointly encoded in alignment coordinates and length - analytical tools are often required to extract the biological meaning from the alignments before analysis. Many assay-specific pipelines exist for this purpose, but there remains a need for user-friendly, generalized, nucleotide-resolution tools that are not limited to specific experimental regimes or analytical workflows. Plastid is a Python library designed specifically for nucleotide-resolution analysis of genomics and NGS data. As such, Plastid is designed to extract assay-specific information from read alignments while retaining generality and extensibility to novel NGS assays. Plastid represents NGS and other biological data as arrays of values associated with genomic or transcriptomic positions, and contains configurable tools to convert data from a variety of sources to such arrays. Plastid also includes numerous tools to manipulate even discontinuous genomic features, such as spliced transcripts, with nucleotide precision. Plastid automatically handles conversion between genomic and feature-centric coordinates, accounting for splicing and strand, freeing users of burdensome accounting. Finally, Plastid's data models use consistent and familiar biological idioms, enabling even beginners to develop sophisticated analytical workflows with minimal effort. Plastid is a versatile toolkit that has been used to analyze data from multiple NGS assays, including RNA-seq, ribosome profiling, and DMS-seq. It forms the genomic engine of our ORF annotation tool, ORF-RATER, and is readily adapted to novel NGS assays. Examples, tutorials, and extensive documentation can be found at https://plastid.readthedocs.io .
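The core idea of nucleotide-resolution "position arrays" can be illustrated with plain NumPy. The sketch below is deliberately not Plastid's own API (its classes and signatures are not reproduced here); it only shows the concept of per-base count vectors and extraction of values over a spliced feature, with invented coordinates and counts.

```python
# Conceptual illustration only: per-nucleotide counts and a spliced transcript.
import numpy as np

chrom_counts = np.zeros(10_000, dtype=int)        # per-nucleotide read counts over an assumed region
read_5p_ends = [1200, 1201, 1203, 4500, 4501]     # alignment 5' ends (invented)
for pos in read_5p_ends:
    chrom_counts[pos] += 1

# A spliced transcript represented as (start, end) exon intervals (invented)
exons = [(1000, 1500), (4400, 4800)]
transcript_counts = np.concatenate([chrom_counts[s:e] for s, e in exons])
print("Total reads on transcript:", transcript_counts.sum())
```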
2011-01-01
The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper. PMID:21410968
Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin
2011-03-16
The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.
Google Analytics – Index of Resources
Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.
Naidu, Venkata Ramana; Deshpande, Rucha S; Syed, Moinuddin R; Wakte, Pravin S
2018-07-01
A direct imaging system (Eyecon™) was used as a Process Analytical Technology (PAT) tool to monitor a fluid bed coating process. Eyecon™ generated real-time onscreen images and particle size and shape information for two identically manufactured laboratory-scale batches. Eyecon™ has an accuracy of ±1 μm when measuring particle size increase for particles in the size range of 50-3000 μm. Eyecon™ captured data every 2 s during the entire process. The moving averages of the D90 particle size values recorded by Eyecon™ were calculated every 30 min to determine the radial coating thickness of the coated particles. After completion of the coating process, the radial coating thickness was found to be 11.3 and 9.11 μm, with standard deviations of ±0.68 and 1.8 μm for Batch 1 and Batch 2, respectively. The coating thickness was also correlated with percent weight build-up by gel permeation chromatography (GPC) and dissolution. GPC indicated weight build-ups of 10.6% and 9.27% for Batch 1 and Batch 2, respectively. In conclusion, a weight build-up of 10% can be correlated with a 10 ± 2 μm increase in the coating thickness of pellets, indicating the potential applicability of real-time imaging as an endpoint determination tool for fluid bed coating processes.
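The coating-thickness arithmetic implied above is simply that the radial thickness gain equals half the increase in particle diameter, here taken from smoothed D90 values. The example values below are invented, not batch data from the study.

```python
# Hedged sketch: radial coating thickness from D90 diameter growth (invented values).
import numpy as np

d90_um = np.array([850.0, 855.2, 861.0, 866.9, 872.6])   # smoothed D90 over time (assumed)
radial_thickness = (d90_um - d90_um[0]) / 2.0             # diameter gain divided by 2
print(f"Final radial coating thickness: {radial_thickness[-1]:.1f} um")
```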
Wegner, S; Bauer, J I; Dietrich, R; Märtlbauer, E; Usleber, E; Gottschalk, C; Gross, M
2017-02-01
A simplified method to produce specific polyclonal rabbit antibodies against sterigmatocystin (STC) was established, using a STC-glycolic acid-ether derivative (STC-GE) conjugated to keyhole limpet haemocyanin (immunogen). The competitive direct enzyme immunoassay (EIA) established for STC had a detection limit (20% binding inhibition) of 130 pg ml⁻¹. The test was highly specific for STC, with minor cross-reactivity with O-methylsterigmatocystin (OMSTC, 0·87%) and negligible reactivity with aflatoxins (<0·02%). STC-EIA was used in combination with a previously developed specific EIA for aflatoxins (<0·1% cross-reactivity with STC and OMSTC), to study the STC/aflatoxin production profiles of reference strains of Aspergillus species. This immunochemotaxonomic procedure was found to be a convenient tool to identify STC- or aflatoxin-producing strains. The carcinogenic mycotoxin sterigmatocystin (STC) is produced by several Aspergillus species, either alone or together with aflatoxins. Here, we report a very simple and straightforward procedure to obtain highly sensitive and specific anti-STC antibodies, and their use in the first ever real STC-specific competitive direct enzyme immunoassay (EIA). In combination with a previous EIA for aflatoxins, this study for the first time demonstrates the potential of a STC/aflatoxin EIA pair for what is branded as 'immunochemotaxonomic' identification of mycotoxigenic Aspergillus species. This new analytical tool enhances analytical possibilities for differential analysis of STC and aflatoxins. © 2016 The Society for Applied Microbiology.
Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review
NASA Astrophysics Data System (ADS)
Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal
2017-08-01
Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. Numerous models must be scrutinized and implemented to obtain optimum performance in hard turning. Various models of hard turning with cubic boron nitride tools have been reviewed in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, force prediction from oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear are depicted in this review paper. An effort has been made to understand the relationship between tool wear and tool force for different cutting conditions and tool geometries, so that an appropriate model can be selected according to user requirements in hard turning.
A Tool Supporting Collaborative Data Analytics Workflow Design and Management
NASA Astrophysics Data System (ADS)
Zhang, J.; Bao, Q.; Lee, T. J.
2016-12-01
Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models that have emerged in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: among other shortcomings, they do not support real-time co-design, do not track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists, and do not capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists in collaboratively designing and composing data analytics workflows through the Internet. Rather than reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.
Selection and application of microbial source tracking tools for water-quality investigations
Stoeckel, Donald M.
2005-01-01
Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.
Metabolomics for laboratory diagnostics.
Bujak, Renata; Struck-Lewicka, Wiktoria; Markuszewski, Michał J; Kaliszan, Roman
2015-09-10
Metabolomics is an emerging approach in the systems biology field. Due to continuous development in advanced analytical techniques and in bioinformatics, metabolomics has been extensively applied as a novel, holistic diagnostic tool in clinical and biomedical studies. Measurement of the metabolome, as a chemical reflection of the current phenotype of a particular biological system, is nowadays frequently used to understand the pathophysiological processes involved in disease progression and to search for new diagnostic or prognostic biomarkers of various disorders. In this review, we discuss the research strategies and analytical platforms commonly applied in metabolomics studies. The applications of metabolomics in laboratory diagnostics over the last 5 years are also reviewed according to the type of biological sample used in the metabolome analysis. We also discuss some limitations and further improvements that should be considered, bearing in mind potential applications of metabolomic research and practice. Copyright © 2014 Elsevier B.V. All rights reserved.
Chemical and biological threat-agent detection using electrophoresis-based lab-on-a-chip devices.
Borowsky, Joseph; Collins, Greg E
2007-10-01
The ability to separate complex mixtures of analytes has made capillary electrophoresis (CE) a powerful analytical tool since its modern configuration was first introduced over 25 years ago. The technique found new utility with its application to the microfluidics-based lab-on-a-chip platform (i.e., microchip), which resulted in ever smaller footprints, sample volumes, and analysis times. These features, coupled with the technique's potential for portability, have prompted recent interest in the development of novel analyzers for chemical and biological threat agents. This article will comment on three main areas of microchip CE as applied to the separation and detection of threat agents: detection techniques and their corresponding limits of detection, sampling protocol and preparation time, and system portability. These three areas typify the broad utility of lab-on-a-chip for meeting critical, present-day security needs, in addition to illustrating areas wherein advances are necessary.
Buccolieri, Alessandro; Hasan, Mohammed; Bettini, Simona; Bonfrate, Valentina; Salvatore, Luca; Santino, Angelo; Borovkov, Victor; Giancane, Gabriele
2018-06-05
Conformational switching induced in ethane-bridged bisporphyrins was used as a sensitive transduction method for revealing the presence of urea dissolved in water via a nonenzymatic approach. Bisporphyrins were deposited on solid quartz slides by means of the spin-coating method. The molecular conformations of Zn and Ni monometalated bisporphyrins were influenced by water-solvated urea molecules, and their fluorescence emission was modulated by the urea concentration. Absorption, fluorescence and Raman spectroscopies allowed the identification of the supramolecular processes responsible for the host-guest interaction between the active layers and urea molecules. The high selectivity of the sensing mechanism was highlighted by testing the spectroscopic responses of bisporphyrin films to citrulline and glutamine used as interfering agents. Additionally, potential applicability was demonstrated by quantifying the urea concentration in real physiological samples, proposing this new approach as a valuable alternative analytical procedure to the traditionally used enzymatic methods.
Evolution of microbiological analytical methods for dairy industry needs
Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence
2014-01-01
Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus offer new perspectives on integrating microbial physiology monitoring to improve industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry's needs. Recent studies show that polymerase chain reaction (PCR)-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potential. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675
IT vendor selection model by using structural equation model & analytical hierarchy process
NASA Astrophysics Data System (ADS)
Maitra, Sarit; Dominic, P. D. D.
2012-11-01
Selecting and evaluating the right vendors is imperative for an organization's global marketplace competitiveness. Improper selection and evaluation of potential vendors can degrade an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research intends to develop a new hybrid model for the vendor selection process with better decision making. The proposed model provides a suitable tool for assisting decision makers and managers in making the right decisions and selecting the most suitable vendor. This paper proposes a hybrid model based on the Structural Equation Model (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The five-step framework of the model was designed after a thorough literature study. The proposed hybrid model will be applied to a real-life case study to assess its effectiveness. In addition, a what-if analysis technique will be used for model validation purposes.
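As a rough illustration of the AHP half of such a hybrid, the sketch below derives criterion weights from a pairwise-comparison matrix via its principal eigenvector and checks Saaty's consistency ratio. The criteria and judgment values are hypothetical and are not taken from the paper.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for four vendor-selection criteria
# (cost, quality, delivery, service), on Saaty's 1-9 scale. A[i, j] says how
# strongly criterion i is preferred over criterion j; A[j, i] = 1 / A[i, j].
A = np.array([
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
])

# The principal eigenvector of A gives the criterion weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (CR < 0.1 is the usual acceptance threshold).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.90          # 0.90 = random consistency index for n = 4
print("weights:", weights.round(3), "CR:", round(cr, 3))
```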
Experimental and analytical study of frictional anisotropy of nanotubes
NASA Astrophysics Data System (ADS)
Riedo, Elisa; Gao, Yang; Li, Tai-De; Chiu, Hsiang-Chih; Kim, Suenne; Klinke, Christian; Tosatti, Erio
The frictional properties of carbon and boron nitride nanotubes (NTs) are very important in a variety of applications, including composite materials, carbon fibers, and micro/nano-electromechanical systems. Atomic force microscopy (AFM) is a powerful tool for investigating the frictional properties of individual NTs with nanoscale resolution. Here, we report on an experimental AFM study of the frictional properties of different types of supported nanotubes. We also propose a quantitative model to describe and then predict the frictional properties of nanotubes sliding on a substrate along (longitudinal friction) or perpendicular to (transverse friction) their axis. This model provides a simple but general analytical relationship that describes the acquired experimental data well. As an example of potential applications, this experimental method combined with the proposed model can guide the design of better NT-ceramic composites or the self-assembly of nanotubes on a surface in a given direction. M. Lucas et al., Nature Materials 8, 876-881 (2009).
Hyperspectral imaging for non-contact analysis of forensic traces.
Edelman, G J; Gaston, E; van Leeuwen, T G; Cullen, P J; Aalders, M C G
2012-11-30
Hyperspectral imaging (HSI) integrates conventional imaging and spectroscopy, to obtain both spatial and spectral information from a specimen. This technique enables investigators to analyze the chemical composition of traces and simultaneously visualize their spatial distribution. HSI offers significant potential for the detection, visualization, identification and age estimation of forensic traces. The rapid, non-destructive and non-contact features of HSI mark its suitability as an analytical tool for forensic science. This paper provides an overview of the principles, instrumentation and analytical techniques involved in hyperspectral imaging. We describe recent advances in HSI technology motivating forensic science applications, e.g. the development of portable and fast image acquisition systems. Reported forensic science applications are reviewed. Challenges are addressed, such as the analysis of traces on backgrounds encountered in casework, concluded by a summary of possible future applications. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Advances in Molecular Rotational Spectroscopy for Applied Science
NASA Astrophysics Data System (ADS)
Harris, Brent; Fields, Shelby S.; Pulliam, Robin; Muckle, Matt; Neill, Justin L.
2017-06-01
Advances in chemical sensitivity and robust, solid-state designs for microwave/millimeter-wave instrumentation compel the expansion of molecular rotational spectroscopy as a research tool into applied science. Molecular rotational spectroscopy is most familiar as a technique for air analysis. Those techniques are included in our presentation of a broader application space for materials analysis using Fourier Transform Molecular Rotational Resonance (FT-MRR) spectrometers. There are potentially transformative advantages for direct gas analysis of complex mixtures, determination of unknown evolved gases in solid materials with parts-per-trillion detection limits, and unambiguous chiral determination. The introduction of FT-MRR as an alternative detection principle for analytical chemistry has created a ripe research space for the development of new analytical methods and sampling equipment to fully enable FT-MRR. We present the current state of purpose-built FT-MRR instrumentation and the latest application measurements that make use of new sampling methods.
On pseudo-hyperkähler prepotentials
NASA Astrophysics Data System (ADS)
Devchand, Chandrashekar; Spiro, Andrea
2016-10-01
An explicit surjection from a set of (locally defined) unconstrained holomorphic functions on a certain submanifold of Sp_1(ℂ) × ℂ^(4n) onto the set HK^(p,q) of local isometry classes of real analytic pseudo-hyperkähler metrics of signature (4p, 4q) in dimension 4n is constructed. The holomorphic functions, called prepotentials, are analogues of Kähler potentials for Kähler metrics and provide a complete parameterisation of HK^(p,q). In particular, there exists a bijection between HK^(p,q) and the set of equivalence classes of prepotentials. This affords the explicit construction of pseudo-hyperkähler metrics from specified prepotentials. The construction generalises one due to Galperin, Ivanov, Ogievetsky, and Sokatchev. Their work is given a coordinate-free formulation and complete, self-contained proofs are provided. The Appendix provides a vital tool for this construction: a reformulation of real analytic G-structures in terms of holomorphic frame fields on complex manifolds.
Urban Health Indicator Tools of the Physical Environment: a Systematic Review.
Pineo, Helen; Glonti, Ketevan; Rutter, Harry; Zimmermann, Nici; Wilkinson, Paul; Davies, Michael
2018-04-16
Urban health indicator (UHI) tools provide evidence about the health impacts of the physical urban environment which can be used in built environment policy and decision-making. Where UHI tools provide data at the neighborhood (and lower) scale they can provide valuable information about health inequalities and environmental deprivation. This review performs a census of UHI tools and explores their nature and characteristics (including how they represent, simplify or address complex systems) to increase understanding of their potential use by municipal built environment policy and decision-makers. We searched seven bibliographic databases, four key journals and six practitioner websites and conducted Google searches between January 27, 2016 and February 24, 2016 for UHI tools. We extracted data from primary studies and online indicator systems. We included 198 documents which identified 145 UHI tools comprising 8006 indicators, from which we developed a taxonomy. Our taxonomy classifies the significant diversity of UHI tools with respect to topic, spatial scale, format, scope and purpose. The proportions of UHI tools which measure data at the neighborhood and lower scale, and present data via interactive maps, have both increased over time. This is particularly relevant to built environment policy and decision-makers, reflects growing analytical capability and offers the potential for improved understanding of the complexity of influences on urban health (an aspect noted as a particular challenge by some indicator producers). The relation between urban health indicators and health impacts attributable to modifiable environmental characteristics is often indirect. Furthermore, the use of UHI tools in policy and decision-making appears to be limited, thus raising questions about the continued development of such tools by multiple organisations duplicating scarce resources. Further research is needed to understand the requirements of built environment policy and decision-makers, public health professionals and local communities regarding the form and presentation of indicators which support their varied objectives.
Mullinx, Cassandra; Phillips, Scott; Shenk, Kelly; Hearn, Paul; Devereux, Olivia
2009-01-01
The Chesapeake Bay Program (CBP) is attempting to more strategically implement management actions to improve the health of the Nation’s largest estuary. In 2007 the U.S. Geological Survey (USGS) and U.S. Environmental Protection Agency (USEPA) CBP office began a joint effort to develop a suite of Internet-accessible decision-support tools and to help meet the needs of CBP partners to improve water quality and habitat conditions in the Chesapeake Bay and its watersheds. An adaptive management framework is being used to provide a structured decision process for information and individual tools needed to implement and assess practices to improve the condition of the Chesapeake Bay ecosystem. The Chesapeake Online Adaptive Support Toolkit (COAST) is a collection of web-based analytical tools and information, organized in an adaptive management framework, intended to aid decisionmakers in protecting and restoring the integrity of the Bay ecosystem. The initial version of COAST is focused on water quality issues. During early and mid-2008, initial ideas for COAST were shared and discussed with various CBP partners and other potential user groups. At these meetings, test cases were selected to help improve understanding of the types of information and analytical functionality that would be most useful for specific partners’ needs. These discussions added considerable knowledge about the nature of decisionmaking for Federal, State, local and nongovernmental partners. Version 1.0 of COAST, released in early winter of 2008, will be further reviewed to determine improvements needed to address implementation and assessment of water quality practices. Future versions of COAST may address other aspects of ecosystem restoration, including restoration of habitat and living resources and maintaining watershed health.
Hess, Jeremy J.; Ebi, Kristie L.; Markandya, Anil; Balbus, John M.; Wilkinson, Paul; Haines, Andy; Chalabi, Zaid
2014-01-01
Background: Policy decisions regarding climate change mitigation are increasingly incorporating the beneficial and adverse health impacts of greenhouse gas emission reduction strategies. Studies of such co-benefits and co-harms involve modeling approaches requiring a range of analytic decisions that affect the model output. Objective: Our objective was to assess analytic decisions regarding model framework, structure, choice of parameters, and handling of uncertainty when modeling health co-benefits, and to make recommendations for improvements that could increase policy uptake. Methods: We describe the assumptions and analytic decisions underlying models of mitigation co-benefits, examining their effects on modeling outputs, and consider tools for quantifying uncertainty. Discussion: There is considerable variation in approaches to valuation metrics, discounting methods, uncertainty characterization and propagation, and assessment of low-probability/high-impact events. There is also variable inclusion of adverse impacts of mitigation policies, and limited extension of modeling domains to include implementation considerations. Going forward, co-benefits modeling efforts should be carried out in collaboration with policy makers; these efforts should include the full range of positive and negative impacts and critical uncertainties, as well as a range of discount rates, and should explicitly characterize uncertainty. We make recommendations to improve the rigor and consistency of modeling of health co-benefits. Conclusion: Modeling health co-benefits requires systematic consideration of the suitability of model assumptions, of what should be included and excluded from the model framework, and how uncertainty should be treated. Increased attention to these and other analytic decisions has the potential to increase the policy relevance and application of co-benefits modeling studies, potentially helping policy makers to maximize mitigation potential while simultaneously improving health. Citation: Remais JV, Hess JJ, Ebi KL, Markandya A, Balbus JM, Wilkinson P, Haines A, Chalabi Z. 2014. Estimating the health effects of greenhouse gas mitigation strategies: addressing parametric, model, and valuation challenges. Environ Health Perspect 122:447–455; http://dx.doi.org/10.1289/ehp.1306744 PMID:24583270
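To make the discount-rate and uncertainty discussion concrete, the minimal sketch below propagates parametric uncertainty in a hypothetical annual health co-benefit through several discount rates via Monte Carlo simulation. All distributions, parameter values and rates are illustrative assumptions, not figures from the studies reviewed.

```python
import numpy as np

rng = np.random.default_rng(0)
n_draws, years = 10_000, 30

# Hypothetical annual health co-benefit of a mitigation policy (deaths avoided),
# with parametric uncertainty represented by a lognormal distribution.
annual_benefit = rng.lognormal(mean=np.log(120), sigma=0.4, size=(n_draws, 1))
annual_benefit = np.repeat(annual_benefit, years, axis=1)

# Propagate through a range of discount rates rather than committing to one.
for rate in (0.0, 0.015, 0.03, 0.05):
    discount = (1 + rate) ** -np.arange(years)
    npv = (annual_benefit * discount).sum(axis=1)
    lo, mid, hi = np.percentile(npv, [2.5, 50, 97.5])
    print(f"rate={rate:.1%}: median={mid:,.0f} (95% interval {lo:,.0f}-{hi:,.0f})")
```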
Animal alternatives for whole effluent toxicity testing ...
Since the 1940s, effluent toxicity testing has been utilized to varying degrees in many countries to assess potential ecological impacts and assist in determining necessary treatment options for environmental protection. However, it was only in the early 1980s that toxicity-based effluent assessments and subsequent discharge controls became globally important, when it was recognized that physical and chemical measurements alone did not protect the environment from potential impacts. Consequently, various strategies using different toxicity tests, whole effluent assessment techniques (incorporating bioaccumulation potential and persistence) plus supporting analytical tools have been developed over 30 years of practice. Numerous workshops and meetings have focused on effluent risk assessment through ASTM, SETAC, OSPAR, UK competent authorities, and EU country-specific rules. Concurrent with this drive to improve effluent quality using toxicity tests, interest in reducing animal use has risen. The Health and Environmental Sciences Institute (HESI) organized and facilitated an international workshop in March 2016 to evaluate strategies for concepts, tools, and effluent assessments and to update the toolbox of effluent testing methods. The workshop objectives were to identify opportunities to use a suite of strategies for effluents, to identify opportunities to reduce the reliance on animal tests, and to determine barriers to implementation of new methodologies.
White, Alec F.; Head-Gordon, Martin; McCurdy, C. William
2017-01-30
The computation of Siegert energies by analytic continuation of bound state energies has recently been applied to shape resonances in polyatomic molecules by several authors. Here, we critically evaluate a recently proposed analytic continuation method based on low order (type III) Padé approximants as well as an analytic continuation method based on high order (type II) Padé approximants. We compare three classes of stabilizing potentials: Coulomb potentials, Gaussian potentials, and attenuated Coulomb potentials. These methods are applied to a model potential where the correct answer is known exactly and to the ²Πg shape resonance of N₂⁻, which has been studied extensively by other methods. Both the choice of stabilizing potential and method of analytic continuation prove to be important to the accuracy of the results. We then conclude that an attenuated Coulomb potential is the most effective of the three for bound state analytic continuation methods. With the proper potential, such methods show promise for algorithmic determination of the positions and widths of molecular shape resonances.
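The core numerical step in such schemes is a rational (Padé-type) fit that is then evaluated where the original data were not computed. The sketch below illustrates only that generic step on a toy function, not the authors' procedure: it fits a [3/3] rational approximant to real-axis samples of exp(x) by linear least squares and evaluates it at a complex point. In the resonance problem the samples would instead be bound-state energies obtained at several stabilizing-potential strengths.

```python
import numpy as np

# Samples of a known analytic function on the real axis.
x = np.linspace(0.5, 2.0, 9)
f = np.exp(x)

# Rational fit f ~ P(x)/Q(x) with deg P = deg Q = 3 and Q's constant term fixed
# to 1, from the linearised conditions P(x_i) - f(x_i) * (Q(x_i) - 1) = f(x_i).
m = 3
V = np.vander(x, m + 1, increasing=True)
A = np.hstack([V, -f[:, None] * V[:, 1:]])
coef, *_ = np.linalg.lstsq(A, f, rcond=None)
p, q = coef[: m + 1], np.concatenate(([1.0], coef[m + 1:]))

def rational(z):
    # np.polyval expects highest-degree coefficients first, hence the reversal.
    return np.polyval(p[::-1], z) / np.polyval(q[::-1], z)

# Analytic continuation: evaluate the real-axis fit at a complex argument.
z = 1.0 + 0.5j
print(rational(z), np.exp(z))   # the two values should agree closely
```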
Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs
ERIC Educational Resources Information Center
Veregin, Howard
2015-01-01
Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…
The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate
ERIC Educational Resources Information Center
Cárdenas-Navia, Isabel; Fitzgerald, Brian K.
2015-01-01
New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…
Using In Silico Fragmentation to Improve Routine Residue Screening in Complex Matrices.
Kaufmann, Anton; Butcher, Patrick; Maden, Kathryn; Walker, Stephan; Widmer, Mirjam
2017-12-01
Targeted residue screening requires the use of reference substances in order to identify potential residues. This becomes a difficult issue when using multi-residue methods capable of analyzing several hundreds of analytes. Therefore, the capability of in silico fragmentation based on a structure database ("suspect screening") instead of physical reference substances for routine targeted residue screening was investigated. The detection of fragment ions that can be predicted or explained by in silico software was utilized to reduce the number of false positives. These "proof of principle" experiments were done with a tool that is integrated into a commercial MS vendor instrument operating software (UNIFI) as well as with a platform-independent MS tool (Mass Frontier). A total of 97 analytes belonging to different chemical families were separated by reversed phase liquid chromatography and detected in a data-independent acquisition (DIA) mode using ion mobility hyphenated with quadrupole time of flight mass spectrometry. The instrument was operated in the MSE mode with alternating low and high energy traces. The fragments observed from product ion spectra were investigated using a "chopping" bond disconnection algorithm and a rule-based algorithm. The bond disconnection algorithm clearly explained more analyte product ions and a greater percentage of the spectral abundance than the rule-based software (92 out of the 97 compounds produced ≥1 explainable fragment ions). On the other hand, tests with a complex blank matrix (bovine liver extract) indicated that the chopping algorithm reports significantly more false positive fragments than the rule based software.
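As a rough illustration of what a "chopping" bond-disconnection algorithm does, the sketch below uses RDKit (an assumption; neither UNIFI nor Mass Frontier is involved) to enumerate single acyclic bond cleavages of caffeine and list the exact masses of the resulting neutral fragments. Real product-ion prediction additionally handles charges, hydrogen rearrangements and ring cleavages, which are omitted here.

```python
from rdkit import Chem
from rdkit.Chem.Descriptors import ExactMolWt

# Enumerate single acyclic bond cleavages of a small analyte (caffeine) and
# report the exact masses of the resulting neutral fragments.
mol = Chem.MolFromSmiles("Cn1cnc2c1c(=O)n(C)c(=O)n2C")   # caffeine

fragment_masses = set()
for bond in mol.GetBonds():
    if bond.IsInRing():
        continue                       # skip ring bonds in this simple sketch
    broken = Chem.FragmentOnBonds(mol, [bond.GetIdx()], addDummies=False)
    for frag in Chem.GetMolFrags(broken, asMols=True, sanitizeFrags=False):
        frag.UpdatePropertyCache(strict=False)   # refresh valence bookkeeping
        fragment_masses.add(round(ExactMolWt(frag), 4))

print(sorted(fragment_masses))
```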
Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engel, David W.; Dalton, Angela C.; Dale, Crystal
2014-06-01
Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
Chapman, Benjamin P; Weiss, Alexander; Duberstein, Paul R
2016-12-01
Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in "big data" problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different than maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without undue complexity leading to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how 3 common SLT algorithms-supervised principal components, regularization, and boosting-can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods, and as primary analytic tools in discovery phase research. We conclude that despite their differences from the classic null-hypothesis testing approach-or perhaps because of them-SLT methods may hold value as a statistically rigorous approach to exploratory regression. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
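A minimal sketch of the EPE-minimisation idea, using scikit-learn's cross-validated lasso as one of the regularisation approaches the authors describe; the "item pool" and the continuous criterion below are simulated placeholders rather than the personality and mortality data used in the article.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Hypothetical item pool: 500 respondents x 200 candidate items, where only a
# handful of items truly relate to the outcome.
X = rng.normal(size=(500, 200))
beta = np.zeros(200)
beta[:10] = 0.5
y = X @ beta + rng.normal(size=500)

# The regularisation strength is chosen by cross-validation: the penalty trades
# model complexity against expected prediction error rather than within-sample fit.
model = LassoCV(cv=5).fit(X, y)
kept = np.flatnonzero(model.coef_)
cv_r2 = cross_val_score(LassoCV(cv=5), X, y, cv=5).mean()
print(f"{kept.size} items retained; cross-validated R^2 = {cv_r2:.2f}")
```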
Scalable Predictive Analysis in Critically Ill Patients Using a Visual Open Data Analysis Platform
Poucke, Sven Van; Zhang, Zhongheng; Schmitz, Martin; Vukicevic, Milan; Laenen, Margot Vander; Celi, Leo Anthony; Deyne, Cathy De
2016-01-01
With the accumulation of large amounts of health related data, predictive analytics could stimulate the transformation of reactive medicine towards Predictive, Preventive and Personalized (PPPM) Medicine, ultimately affecting both cost and quality of care. However, high-dimensionality and high-complexity of the data involved, prevents data-driven methods from easy translation into clinically relevant models. Additionally, the application of cutting edge predictive methods and data manipulation require substantial programming skills, limiting its direct exploitation by medical domain experts. This leaves a gap between potential and actual data usage. In this study, the authors address this problem by focusing on open, visual environments, suited to be applied by the medical community. Moreover, we review code free applications of big data technologies. As a showcase, a framework was developed for the meaningful use of data from critical care patients by integrating the MIMIC-II database in a data mining environment (RapidMiner) supporting scalable predictive analytics using visual tools (RapidMiner’s Radoop extension). Guided by the CRoss-Industry Standard Process for Data Mining (CRISP-DM), the ETL process (Extract, Transform, Load) was initiated by retrieving data from the MIMIC-II tables of interest. As use case, correlation of platelet count and ICU survival was quantitatively assessed. Using visual tools for ETL on Hadoop and predictive modeling in RapidMiner, we developed robust processes for automatic building, parameter optimization and evaluation of various predictive models, under different feature selection schemes. Because these processes can be easily adopted in other projects, this environment is attractive for scalable predictive analytics in health research. PMID:26731286
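For the platelet-count use case, the sketch below shows the same kind of quantitative assessment outside of RapidMiner, using simulated stand-in data (not MIMIC-II records): a point-biserial correlation between minimum platelet count and survival, plus survival rates by platelet quartile.

```python
import numpy as np
import pandas as pd
from scipy.stats import pointbiserialr

rng = np.random.default_rng(2)

# Simulated stand-in for the extract: minimum platelet count per ICU stay and a
# survival flag (1 = survived). These are not real patient data.
n = 2000
platelets = rng.normal(220, 80, n).clip(5, 600)
p_survive = 1 / (1 + np.exp(-(platelets - 120) / 40))
survived = rng.binomial(1, p_survive)
df = pd.DataFrame({"min_platelet": platelets, "survived": survived})

# Point-biserial correlation between platelet count and the binary outcome,
# plus survival rate by platelet quartile as a quick clinical summary.
r, p = pointbiserialr(df["survived"], df["min_platelet"])
print(f"r = {r:.2f}, p = {p:.1e}")
print(df.groupby(pd.qcut(df["min_platelet"], 4), observed=True)["survived"].mean())
```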
NASA Astrophysics Data System (ADS)
Gradziński, Piotr
2017-10-01
As the world's climate changes (inter alia under the influence of architectural activity), the author attempts to reorient design practice primarily towards the use of, and adaptation to, climatic conditions. Using, among others, Life Cycle Analysis (LCA) and digital analytical BIM (Building Information Modelling) tools in the early stages of the architectural design process of a building defines the overriding requirements which the designer/architect should meet. In the first part, the text characterizes the influences of architectural activity (consumption, pollution, waste, etc.) and of building materials (embodied energy, embodied carbon, Global Warming Potential, etc.) in terms of direct negative environmental impact. In the second part, the paper reviews methods and analytical techniques that prevent these negative influences. First, it presents the study of the building using Life Cycle Analysis of the structure (e.g. materials) and the functioning (e.g. energy consumption) of the architectural object (stages: before use, use, after use). Second, it presents the use of digital analytical tools for running multi-faceted simulations of environmental factors (exposure to light, shade, wind) that directly affect the shaping of the building form. In conclusion, the author's results highlight that designing buildings with the above-mentioned elements (LCA, BIM) allows early design decisions in the process of shaping the architectural form to be corrected, minimizing the impact on nature and the environment. The work refers directly to architectural-environmental dimensions, orienting the building design process with respect to broadly understood climatic change.
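A minimal sketch of the "before use" and "use" stage arithmetic the abstract refers to: embodied carbon summed over a hypothetical bill of materials plus operational emissions over the service life. All quantities and emission factors below are illustrative assumptions, not values from the paper.

```python
# Hypothetical bill of materials: (mass in kg, embodied carbon in kgCO2e per kg).
materials = {
    "concrete":         (250_000, 0.13),
    "structural steel":  (18_000, 1.85),
    "glazing":            (6_000, 1.40),
    "timber":            (12_000, 0.45),
}

embodied = sum(mass * factor for mass, factor in materials.values())

# "Use" stage: assumed annual operational energy and grid carbon intensity.
annual_energy_kwh = 90_000
grid_intensity = 0.25          # kgCO2e per kWh
service_life_years = 50
operational = annual_energy_kwh * grid_intensity * service_life_years

print(f"embodied: {embodied/1000:.0f} tCO2e, operational: {operational/1000:.0f} tCO2e")
```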
Kumar, B Vinodh; Mohan, Thuthi
2018-01-01
Six Sigma is one of the most popular quality management system tools employed for process improvement. Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metrics for individual parameters, and to follow the Westgard guidelines for the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and the data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are the IQC coefficient of variation percentage and the External Quality Assurance Scheme (EQAS) bias% for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level; for the level 2 IQC, the same four analytes as in level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes at <6 sigma level, the quality goal index (QGI) was <0.8, indicating that the area requiring improvement was imprecision, except for cholesterol, whose QGI >1.2 indicated inaccuracy. This study shows that the sigma metric is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.
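A minimal sketch of the underlying calculation, assuming the conventional definitions used in the laboratory-medicine literature (sigma = (TEa - bias)/CV and QGI = bias/(1.5 * CV), all in percent); the allowable-error, bias and CV values below are illustrative, not the study's data.

```python
# Sigma metric and quality goal index (QGI) per analyte, using the conventional
# definitions. Analyte values below are illustrative placeholders.
analytes = {
    # name: (total allowable error %, bias %, CV %)
    "ALP":         (30.0, 3.0, 4.0),
    "Cholesterol": (9.0,  4.0, 2.5),
    "Potassium":   (5.8,  2.0, 2.2),
}

for name, (tea, bias, cv) in analytes.items():
    sigma = (tea - bias) / cv
    qgi = bias / (1.5 * cv)
    if sigma >= 6:
        verdict = "ideal"
    elif sigma < 3:
        verdict = "needs improvement ({})".format(
            "inaccuracy" if qgi > 1.2 else "imprecision" if qgi < 0.8 else "both")
    else:
        verdict = "acceptable"
    print(f"{name}: sigma = {sigma:.1f}, QGI = {qgi:.2f} -> {verdict}")
```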
Network analysis applications in hydrology
NASA Astrophysics Data System (ADS)
Price, Katie
2017-04-01
Applied network theory has seen pronounced expansion in recent years, in fields such as epidemiology, computer science, and sociology. Concurrent development of analytical methods and frameworks has increased the possibilities and tools available to researchers seeking to apply network theory to a variety of problems. While water and nutrient fluxes through stream systems clearly demonstrate a directional network structure, the hydrological applications of network theory remain underexplored. This presentation covers a review of network applications in hydrology, followed by an overview of promising network analytical tools that potentially offer new insights into conceptual modeling of hydrologic systems, identifying behavioral transition zones in stream networks and thresholds of dynamical system response. Network applications were tested in two watersheds along an urbanization gradient in Atlanta, Georgia, USA: Peachtree Creek and Proctor Creek. Peachtree Creek contains a nest of five long-term USGS streamflow and water quality gages, allowing network application of long-term flow statistics. The watershed spans a range of suburban and heavily urbanized conditions. Summary flow statistics and water quality metrics were analyzed using a suite of network analysis techniques, to test the conceptual modeling and predictive potential of the methodologies. Storm events and low flow dynamics during Summer 2016 were analyzed using multiple network approaches, with an emphasis on tomogravity methods. Results indicate that network theory approaches offer novel perspectives for understanding long-term and event-based hydrological data. Key future directions for network applications include 1) optimizing data collection, 2) identifying "hotspots" of contaminant and overland flow influx to stream systems, 3) defining process domains, and 4) analyzing dynamic connectivity of various system components, including groundwater-surface water interactions.
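A minimal sketch of the directed-network framing described above, using networkx; the toy topology and node names are hypothetical, not the Peachtree Creek gage network.

```python
import networkx as nx

# Toy stream network as a directed graph: edges point downstream.
G = nx.DiGraph()
G.add_edges_from([
    ("trib_A", "gage_2"), ("trib_B", "gage_2"),
    ("gage_2", "gage_1"), ("trib_C", "gage_1"),
    ("gage_1", "outlet"),
])

# Simple network measures that can flag structurally important reaches:
# upstream contributing nodes and betweenness centrality of each node.
for node in G.nodes:
    upstream = nx.ancestors(G, node)
    print(f"{node}: {len(upstream)} upstream nodes")

print(nx.betweenness_centrality(G))
```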
Galatzer-Levy, I R; Ma, S; Statnikov, A; Yehuda, R; Shalev, A Y
2017-01-01
To date, studies of biological risk factors have revealed inconsistent relationships with subsequent post-traumatic stress disorder (PTSD). The inconsistent signal may reflect the use of data analytic tools that are ill equipped for modeling the complex interactions between biological and environmental factors that underlay post-traumatic psychopathology. Further, using symptom-based diagnostic status as the group outcome overlooks the inherent heterogeneity of PTSD, potentially contributing to failures to replicate. To examine the potential yield of novel analytic tools, we reanalyzed data from a large longitudinal study of individuals identified following trauma in the general emergency room (ER) that failed to find a linear association between cortisol response to traumatic events and subsequent PTSD. First, latent growth mixture modeling empirically identified trajectories of post-traumatic symptoms, which then were used as the study outcome. Next, support vector machines with feature selection identified sets of features with stable predictive accuracy and built robust classifiers of trajectory membership (area under the receiver operator characteristic curve (AUC)=0.82 (95% confidence interval (CI)=0.80–0.85)) that combined clinical, neuroendocrine, psychophysiological and demographic information. Finally, graph induction algorithms revealed a unique path from childhood trauma via lower cortisol during ER admission, to non-remitting PTSD. Traditional general linear modeling methods then confirmed the newly revealed association, thereby delineating a specific target population for early endocrine interventions. Advanced computational approaches offer innovative ways for uncovering clinically significant, non-shared biological signals in heterogeneous samples. PMID:28323285
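The trajectory-classification step lends itself to a compact illustration. The sketch below is a generic scikit-learn pipeline (an assumption; not the authors' code) that nests univariate feature selection inside cross-validation and reports a cross-validated AUC, on simulated stand-in data rather than the ER cohort.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(3)

# Simulated stand-in for the cohort: clinical, endocrine and demographic
# features, and membership in a "non-remitting" symptom trajectory (binary).
X = rng.normal(size=(400, 60))
y = (X[:, 0] - 0.8 * X[:, 1] + rng.normal(size=400) > 0).astype(int)

# Feature selection nested inside the pipeline so it is re-fit in every fold,
# avoiding the selection bias that would otherwise inflate apparent accuracy.
clf = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif, k=10)),
    ("svm", SVC(kernel="linear")),
])
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```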
Rowlinson, Steve; Jia, Yunyan Andrea
2014-04-01
Existing heat stress risk management guidelines recommended by international standards are not practical for the construction industry, which needs site supervision staff to make instant managerial decisions to mitigate heat risks. The ability of the predicted heat strain (PHS) model [ISO 7933 (2004). Ergonomics of the thermal environment analytical determination and interpretation of heat stress using calculation of the predicted heat strain. Geneva: International Standard Organisation] to predict maximum allowable exposure time (D_lim) has now enabled development of localized, action-triggering and threshold-based guidelines for implementation by lay frontline staff on construction sites. This article presents a protocol for development of two heat stress management tools by applying the PHS model to its full potential. One of the tools is developed to facilitate managerial decisions on an optimized work-rest regimen for paced work. The other tool is developed to enable workers' self-regulation during self-paced work.
NASA Astrophysics Data System (ADS)
Fitkov-Norris, Elena; Yeghiazarian, Ara
2016-11-01
The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of systems dynamics, a qualitatively based modelling approach, as a possible analysis and simulation tool that bridges the gap between the social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as the limiting factors of applying systems dynamics in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the systems dynamics methodology, and measurement theory is proposed as a ready solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.
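For readers unfamiliar with the approach, the sketch below shows the basic stock-flow-feedback structure of a systems dynamics model integrated with Euler steps. The construct ("engaged students"), its flows and all parameter values are hypothetical illustrations, not a calibrated model from the article.

```python
import numpy as np

# One stock ("engaged students") with an inflow driven by word of mouth and an
# outflow from drop-out, integrated with simple Euler steps.
population = 1000.0
engaged = 10.0
contact_rate, adoption_fraction, dropout_rate = 0.5, 0.1, 0.05
dt, horizon = 0.25, 60.0

trajectory = []
for _ in np.arange(0, horizon, dt):
    inflow = contact_rate * adoption_fraction * engaged * (1 - engaged / population)
    outflow = dropout_rate * engaged
    engaged += dt * (inflow - outflow)          # Euler integration of the stock
    trajectory.append(engaged)

print(f"engaged after {horizon:.0f} time units: {trajectory[-1]:.0f}")
```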
Analytical considerations for mass spectrometry profiling in serum biomarker discovery.
Whiteley, Gordon R; Colantonio, Simona; Sacconi, Andrea; Saul, Richard G
2009-03-01
The potential of using mass spectrometry profiling as a diagnostic tool has been demonstrated for a wide variety of diseases. Various cancers and cancer-related diseases have been the focus of much of this work because of both the paucity of good diagnostic markers and the knowledge that early diagnosis is the most powerful weapon in treating cancer. The implementation of mass spectrometry as a routine diagnostic tool has proved to be difficult, however, primarily because of the stringent controls that are required for the method to be reproducible. The method is evolving as a powerful guide to the discovery of biomarkers that could, in turn, be used either individually or in an array or panel of tests for early disease detection. Using proteomic patterns to guide biomarker discovery and the possibility of deployment in the clinical laboratory environment on current instrumentation or in a hybrid technology has the possibility of being the early diagnosis tool that is needed.
Development of the electric vehicle analyzer
NASA Astrophysics Data System (ADS)
Dickey, Michael R.; Klucz, Raymond S.; Ennix, Kimberly A.; Matuszak, Leo M.
1990-06-01
The increasing technological maturity of high power (greater than 20 kW) electric propulsion devices has led to renewed interest in their use as a means of efficiently transferring payloads between earth orbits. Several systems and architecture studies have identified the potential cost benefits of high performance Electric Orbital Transfer Vehicles (EOTVs). These studies led to the initiation of the Electric Insertion Transfer Experiment (ELITE) in 1988. Managed by the Astronautics Laboratory, ELITE is a flight experiment designed to sufficiently demonstrate key technologies and options to pave the way for the full-scale development of an operational EOTV. An important consideration in the development of the ELITE program is the capability of available analytical tools to simulate the orbital mechanics of a low thrust, electric propulsion transfer vehicle. These tools are necessary not only for ELITE mission planning exercises but also for continued, efficient, accurate evaluation of DoD space transportation architectures which include EOTVs. This paper presents such a tool: the Electric Vehicle Analyzer (EVA).
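As a flavour of the orbital-mechanics sizing that a tool like EVA automates, the sketch below makes a first-order, coplanar low-thrust estimate for a LEO-to-GEO spiral: delta-v as the difference of circular speeds, propellant from the rocket equation, and trip time from an average-mass constant-thrust approximation. The vehicle parameters are hypothetical, and the simplifications ignore eclipses, inclination change and oblateness.

```python
import math

MU = 3.986e14          # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6378e3       # m

# Coplanar low-thrust spiral from a 500 km LEO to GEO: to first order the
# required delta-v is the difference of the circular orbital speeds (~4.5 km/s).
r_leo, r_geo = R_EARTH + 500e3, 42164e3
v_leo, v_geo = math.sqrt(MU / r_leo), math.sqrt(MU / r_geo)
dv = v_leo - v_geo

# Hypothetical EOTV: ~20 kW-class thruster, 2500 s Isp, 2500 kg initial mass.
isp, g0, m0, thrust = 2500.0, 9.80665, 2500.0, 0.9   # thrust in newtons
m_final = m0 * math.exp(-dv / (isp * g0))             # rocket equation
m_prop = m0 - m_final
t_days = dv * (m0 + m_final) / 2 / thrust / 86400     # average-mass estimate

print(f"delta-v = {dv/1000:.2f} km/s, propellant = {m_prop:.0f} kg, "
      f"trip time ~ {t_days:.0f} days")
```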
The application of computer-based tools in obtaining the genetic family history.
Giovanni, Monica A; Murray, Michael F
2010-07-01
Family health history is both an adjunct to and a focus of current genetic research, having long been known to be a powerful predictor of individual disease risk. As such, it has been primarily used as a proxy for genetic information. Over the past decade, new roles for family history have emerged, perhaps most importantly as a primary tool for guiding decision-making on the use of expensive genetic testing. The collection of family history information is an important but time-consuming process. Efforts to engage the patient or research subject in preliminary data collection have the potential to improve data accuracy and allow clinicians and researchers more time for analytic tasks. The U.S. Surgeon General, the Centers for Disease Control and Prevention (CDC), and others have developed tools for electronic family history collection. This unit describes the utility of the Web-based My Family Health Portrait (https://familyhistory.hhs.gov) as the prototype for patient-entered family history.
Metabolomics and Integrative Omics for the Development of Thai Traditional Medicine
Khoomrung, Sakda; Wanichthanarak, Kwanjeera; Nookaew, Intawat; Thamsermsang, Onusa; Seubnooch, Patcharamon; Laohapand, Tawee; Akarasereenont, Pravit
2017-01-01
In recent years, interest in studies of traditional medicine in Asian and African countries has gradually increased due to its potential to complement modern medicine. In this review, we provide an overview of the current development of Thai traditional medicine (TTM) and of ongoing TTM research activities related to metabolomics. This review also focuses on three important elements of systems biology analysis of TTM: analytical techniques, statistical approaches and bioinformatics tools for handling and analyzing untargeted metabolomics data. The main objective of this data analysis is to gain a comprehensive understanding of the system-wide effects that TTM has on individuals. Furthermore, potential applications of metabolomics and systems medicine in TTM are also discussed. PMID:28769804
Nano-plasmonic exosome diagnostics
Im, Hyungsoon; Shao, Huilin; Weissleder, Ralph; Castro, Cesar M.; Lee, Hakho
2015-01-01
Exosomes have emerged as a promising biomarker. These vesicles abound in biofluids and harbor molecular constituents from their parent cells, thereby offering a minimally-invasive avenue for molecular analyses. Despite such clinical potential, routine exosomal analysis, particularly the protein assay, remains challenging, due to requirements for large sample volumes and extensive processing. We have been developing miniaturized systems to facilitate clinical exosome studies. These systems can be categorized into two components: microfluidics for sample preparation and analytical tools for protein analyses. In this report, we review a new assay platform, nano-plasmonic exosome (nPLEX), in which sensing is based on surface plasmon resonance to achieve label-free exosome detection. Looking forward, we also discuss some potential challenges and improvements in exosome studies. PMID:25936957
The transfer of analytical procedures.
Ermer, J; Limberger, M; Lis, K; Wätzig, H
2013-11-01
Analytical method transfers are certainly among the most discussed topics in the GMP regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in their draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. In order to facilitate this communication, procedures, flow charts, and checklists covering responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and tables with acceptance limits are provided here, together with a comprehensive glossary. Potential pitfalls are described such that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared in detail, with examples. Significance tests should be avoided. The success criterion is not statistical significance, but rather analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and direct control of the statistical risks. However, for lower-risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient. Copyright © 2013 Elsevier B.V. All rights reserved.
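To make the recommended equivalence-testing idea concrete, the following is a minimal sketch of a two one-sided t-tests (TOST) check against an absolute acceptance limit; the assay values, the ±2% limit, and the function name are illustrative assumptions, not values from the article.

```python
import numpy as np
from scipy import stats

def tost_equivalence(sending, receiving, limit):
    """Two one-sided t-tests: is the mean difference between the sending
    and receiving unit within +/- limit (e.g., in % label claim)?"""
    diff = np.mean(receiving) - np.mean(sending)
    n1, n2 = len(sending), len(receiving)
    # pooled variance and standard error of the difference in means
    sp2 = ((n1 - 1) * np.var(sending, ddof=1) + (n2 - 1) * np.var(receiving, ddof=1)) / (n1 + n2 - 2)
    se = np.sqrt(sp2 * (1 / n1 + 1 / n2))
    df = n1 + n2 - 2
    # one-sided tests against the lower and upper acceptance limits
    t_lower = (diff + limit) / se          # H0: diff <= -limit
    t_upper = (diff - limit) / se          # H0: diff >= +limit
    p_lower = 1 - stats.t.cdf(t_lower, df)
    p_upper = stats.t.cdf(t_upper, df)
    return diff, max(p_lower, p_upper)     # TOST p-value is the larger of the two

# hypothetical assay results (% label claim) from the two laboratories
sending = np.array([99.1, 99.5, 98.8, 99.3, 99.0, 99.4])
receiving = np.array([98.6, 99.0, 98.9, 98.4, 98.8, 99.1])
diff, p = tost_equivalence(sending, receiving, limit=2.0)
print(f"mean difference {diff:.2f}%, TOST p-value {p:.4f} (equivalent if p < 0.05)")
```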
Accelerated bridge construction (ABC) decision making and economic modeling tool.
DOT National Transportation Integrated Search
2011-12-01
In this FHWA-sponsored pooled-fund study, a set of decision-making tools based on the Analytic Hierarchy Process (AHP) was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...
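The abstract names the Analytic Hierarchy Process but gives no computational detail; as a hedged illustration only, the sketch below derives criterion weights and a consistency ratio from a hypothetical pairwise comparison matrix, the standard AHP calculation, with criteria and judgments that are invented rather than taken from the study.

```python
import numpy as np

# Hypothetical pairwise comparisons for three ABC decision criteria
# (e.g., traffic impact, construction cost, schedule), on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # normalized priority vector

# Consistency ratio (CR < 0.1 is conventionally acceptable); RI = 0.58 for n = 3
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58
print("criterion weights:", np.round(weights, 3), "CR:", round(cr, 3))
```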
17 CFR 49.17 - Access to SDR data.
Code of Federal Regulations, 2013 CFR
2013-04-01
... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...
17 CFR 49.17 - Access to SDR data.
Code of Federal Regulations, 2014 CFR
2014-04-01
... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...
Assessment Tools' Indicators for Sustainability in Universities: An Analytical Overview
ERIC Educational Resources Information Center
Alghamdi, Naif; den Heijer, Alexandra; de Jonge, Hans
2017-01-01
Purpose: The purpose of this paper is to analyse 12 assessment tools of sustainability in universities and develop the structure and the contents of these tools to be more intelligible. The configuration of the tools reviewed highlights indicators that clearly communicate only the essential information. This paper explores how the theoretical…
MRMPlus: an open source quality control and assessment tool for SRM/MRM assay development.
Aiyetan, Paul; Thomas, Stefani N; Zhang, Zhen; Zhang, Hui
2015-12-12
Selected and multiple reaction monitoring involves monitoring a multiplexed assay of proteotypic peptides and associated transitions in mass spectrometry runs. To describe peptides and associated transitions as stable, quantifiable, and reproducible representatives of proteins of interest, experimental and analytical validation is required. However, inadequate and disparate analytical tools and validation methods predispose assay performance measures to errors and inconsistencies. Implemented as a freely available, open-source tool in the platform-independent Java programming language, MRMPlus computes analytical measures as recommended recently by the Clinical Proteomics Tumor Analysis Consortium Assay Development Working Group for "Tier 2" assays - that is, non-clinical assays sufficient to measure changes due to both biological and experimental perturbations. Computed measures include: limit of detection, lower limit of quantification, linearity, carry-over, partial validation of specificity, and upper limit of quantification. MRMPlus streamlines the assay development analytical workflow and therefore minimizes error predisposition. MRMPlus may also be used for performance estimation for targeted assays not described by the Assay Development Working Group. MRMPlus' source code and compiled binaries can be freely downloaded from https://bitbucket.org/paiyetan/mrmplusgui and https://bitbucket.org/paiyetan/mrmplusgui/downloads respectively.
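MRMPlus itself is a Java tool and its internals are not given here; purely as an illustration of the figures of merit listed above, the following hypothetical Python sketch estimates linearity, limit of detection, and lower limit of quantification from an invented calibration series using common signal-to-blank conventions.

```python
import numpy as np

# Hypothetical calibration series: spiked peptide concentration (fmol) vs. peak area
conc = np.array([0.0, 0.5, 1.0, 2.5, 5.0, 10.0, 25.0])
area = np.array([120., 410., 690., 1650., 3100., 6200., 15400.])
blank_areas = np.array([118., 125., 110., 130., 122.])   # repeated blank injections

# Linearity: ordinary least-squares fit and coefficient of determination
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

# Common signal-based estimates: LOD ~ 3.3*sigma_blank/slope, LLOQ ~ 10*sigma_blank/slope
sigma_blank = blank_areas.std(ddof=1)
lod = 3.3 * sigma_blank / slope
lloq = 10 * sigma_blank / slope
print(f"slope={slope:.1f}, R^2={r2:.4f}, LOD~{lod:.3f} fmol, LLOQ~{lloq:.3f} fmol")
```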
Analytical challenges for conducting rapid metabolism characterization for QIVIVE.
Tolonen, Ari; Pelkonen, Olavi
2015-06-05
For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS), coupled with an LC separation tool and simultaneous UV detection, is the tool of choice for the majority of relatively lipophilic organic molecules. However, additional techniques such as gas chromatography, radiometric measurements, and NMR are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are partially opposing needs: detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and the highest possible throughput capacity of the in vitro set-up and the analytical tool. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, while also attempting to pinpoint factors associated with sample preparation, testing conditions, and the strengths and weaknesses of a particular technique for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Big data analytics in immunology: a knowledge-based approach.
Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir
2014-01-01
With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflows. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we streamlined the normally time-consuming process of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflows.
Update on SLD Engineering Tools Development
NASA Technical Reports Server (NTRS)
Miller, Dean R.; Potapczuk, Mark G.; Bond, Thomas H.
2004-01-01
The airworthiness authorities (FAA, JAA, Transport Canada) will be releasing a draft rule in the 2006 timeframe concerning the operation of aircraft in a Supercooled Large Droplet (SLD) environment aloft. The draft rule will require aircraft manufacturers to demonstrate that their aircraft can operate safely in an SLD environment for a period of time to facilitate a safe exit from the condition. It is anticipated that aircraft manufacturers will require a capability to demonstrate compliance with this rule via experimental means (icing tunnels or tankers) and by analytical means (ice prediction codes). Since existing icing research facilities and analytical codes were not developed to account for SLD conditions, current engineering tools are not adequate to support compliance activities in SLD conditions. Therefore, existing capabilities need to be augmented to include SLD conditions. In response to this need, NASA and its partners conceived a strategy or Roadmap for developing experimental and analytical SLD simulation tools. Following review and refinement by the airworthiness authorities and other international research partners, this technical strategy has been crystallized into a project plan to guide the SLD Engineering Tool Development effort. This paper will provide a brief overview of the latest version of the project plan and technical rationale, and provide a status of selected SLD Engineering Tool Development research tasks which are currently underway.
Furlong, Michael; Bessire, Andrew; Song, Wei; Huntington, Christopher; Groeber, Elizabeth
2010-07-15
During routine liquid chromatography/tandem mass spectrometric (LC/MS/MS) bioanalysis of a small molecule analyte in rat serum samples from a toxicokinetic study, an unexpected interfering peak was observed in the extracted ion chromatogram of the internal standard. No interfering peaks were observed in the extracted ion chromatogram of the analyte. The dose-dependent peak area response and peak area response versus time profiles of the interfering peak suggested that it might have been related to a metabolite of the dosed compound. Further investigation using high-resolution mass spectrometry led to unequivocal identification of the interfering peak as an N-desmethyl metabolite of the parent analyte. High-resolution mass spectrometry (HRMS) was also used to demonstrate that the interfering response of the metabolite in the multiple reaction monitoring (MRM) channel of the internal standard was due to an isobaric relationship between the (13)C-isotope of the metabolite and the internal standard (i.e., common precursor ion mass), coupled with a metabolite product ion with identical mass to the product ion used in the MRM transition of the internal standard. These results emphasize (1) the need to carefully evaluate internal standard candidates with regard to potential interferences from metabolites during LC/MS/MS method development, validation and bioanalysis of small molecule analytes in biological matrices; (2) the value of HRMS as a tool to investigate unexpected interferences encountered during LC/MS/MS analysis of small molecules in biological matrices; and (3) the potential for interference regardless of the choice of IS and therefore the importance of conducting assay robustness testing on incurred in vitro or in vivo study samples. Copyright 2010 John Wiley & Sons, Ltd.
Safenkova, Irina V; Slutskaya, Elvira S; Panferov, Vasily G; Zherdev, Anatoly V; Dzantiev, Boris B
2016-12-16
Conjugates of gold nanoparticles (GNPs) with antibodies are powerful analytical tools. It is crucial to know the conjugates' state in both the concentrated and mixed solutions used in analytical systems. Herein, we have applied asymmetrical flow field-flow fractionation (AF4) to identify the conjugates' state. The influence of a conjugate's composition and concentration on aggregation was studied in a true analytical solution (a concentrated mixture with stabilizing components). GNPs with an average diameter of 15.3 ± 1.2 nm were conjugated by adsorption with eight antibodies of different specificities. We found that, while the GNPs have a zeta potential of -31.6 mV, the conjugates have zeta potentials ranging from -5.8 to -11.2 mV. Increased concentrations (up to 184 nM, OD520 = 80) of the mixed conjugate (mixture of eight conjugates) did not change the form of fractograms, and the peak areas' dependence on concentration was strongly linear (R2 values of 0.99919 and 0.99845 for absorption signal and light scattering, respectively). Based on the gyration (Rg) and hydrodynamic (Rh) radii measured during fractionation, we found that the nanoparticles were divided into two populations: (1) those with constant radii (Rg = 9.9 ± 0.9 nm; Rh = 14.3 ± 0.5 nm); and (2) those with radii increased from 9.9 to 24.4 nm for Rg and from 14.3 to 28.1 nm for Rh. These results confirm that the aggregate state of the concentrated and mixed conjugates' preparations is the same as that of diluted preparations and that AF4 efficiently characterizes the conjugates' state in a true analytical solution. Copyright © 2016 Elsevier B.V. All rights reserved.
76 FR 70517 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-14
... requested. These systems generally also provide analytics, spreadsheets, and other tools designed to enable funds to analyze the data presented, as well as communication tools to process fund instructions...
Combined sensing platform for advanced diagnostics in exhaled mouse breath
NASA Astrophysics Data System (ADS)
Fortes, Paula R.; Wilk, Andreas; Seichter, Felicia; Cajlakovic, Merima; Koestler, Stefan; Ribitsch, Volker; Wachter, Ulrich; Vogt, Josef; Radermacher, Peter; Carter, Chance; Raimundo, Ivo M.; Mizaikoff, Boris
2013-03-01
Breath analysis is an attractive non-invasive strategy for early disease recognition or diagnosis, and for therapeutic progression monitoring, as quantitative compositional analysis of breath can be related to biomarker panels associated with specific physiological conditions invoked by, e.g., pulmonary diseases, lung cancer, breast cancer, and others. As exhaled breath contains comprehensive information on, e.g., the metabolic state, and since in particular volatile organic constituents (VOCs) in exhaled breath may be indicative of certain disease states, analytical techniques for advanced breath diagnostics should be capable of sufficient molecular discrimination and quantification of constituents at ppm-ppb - or even lower - concentration levels. While individual analytical techniques such as mid-infrared spectroscopy may provide access to a range of relevant molecules, some IR-inactive constituents require the combination of IR sensing schemes with orthogonal analytical tools for extended molecular coverage. Combining mid-infrared hollow waveguides (HWGs) with luminescence sensors (LS) appears particularly attractive, as these complementary analytical techniques allow simultaneous analysis of total CO2 (via luminescence), the 12CO2/13CO2 tracer-to-tracee (TTR) ratio (via IR), selected VOCs (via IR), and O2 (via luminescence) in exhaled breath, thus establishing a single diagnostic platform in which both sensors simultaneously interact with the same breath sample volume. In the present study, we take advantage of a particularly compact (shoebox-size) FTIR spectrometer combined with a novel substrate-integrated hollow waveguide (iHWG) recently developed by our research team, and miniaturized fiberoptic luminescence sensors, to establish a multi-constituent breath analysis tool that is ideally compatible with mouse intensive care stations (MICU). Given the low tidal volume and flow of exhaled mouse breath, the TTR is usually determined after sample collection via gas chromatography coupled to mass spectrometric detection. Here, we aim at potentially continuous analysis of the TTR via iHWGs and LS flow-through sensors requiring only minute (< 1 mL) sample volumes. Furthermore, this study explores non-linearities observed for the calibration functions of 12CO2 and 13CO2, potentially resulting from effects related to optical collision diameters, e.g., in the presence of molecular oxygen. It is anticipated that the simultaneous continuous analysis of oxygen via LS will facilitate the correction of these effects after inclusion within appropriate multivariate calibration models, thus providing more reliable and robust calibration schemes for continuously monitoring relevant breath constituents.
Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.
Scott, Bradley; Wilcock, Anne
2006-01-01
Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex, and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kroniger, K; Herzog, M; Landry, G
2015-06-15
Purpose: We describe and demonstrate a fast analytical tool for prompt-gamma emission prediction based on filter functions applied to the depth dose profile. We present the implementation in a treatment planning system (TPS) of the same algorithm for positron emitter distributions. Methods: The prediction of the desired observable is based on the convolution of filter functions with the depth dose profile. For both prompt-gammas and positron emitters, the results of Monte Carlo simulations (MC) are compared with those of the analytical tool. For prompt-gamma emission from inelastic proton-induced reactions, homogeneous and inhomogeneous phantoms alongside patient data are used as irradiation targets of mono-energetic proton pencil beams. The accuracy of the tool is assessed in terms of the shape of the analytically calculated depth profiles and their absolute yields, compared to MC. For the positron emitters, the method is implemented in a research RayStation TPS and compared to MC predictions. Digital phantoms and patient data are used and positron emitter spatial density distributions are analyzed. Results: Calculated prompt-gamma profiles agree with MC within 3% in terms of absolute yield and reproduce the correct shape. Based on an arbitrary reference material and by means of 6 filter functions (one per chemical element), profiles in any other material composed of those elements can be predicted. The TPS-implemented algorithm is accurate enough to enable, via the analytically calculated positron emitter profiles, detection of range differences between the TPS and MC with errors of the order of 1–2 mm. Conclusion: The proposed analytical method predicts prompt-gamma and positron emitter profiles which generally agree with the distributions obtained by a full MC. The implementation of the tool in a TPS shows that reliable profiles can be obtained directly from the dose calculated by the TPS, without the need for a full MC simulation.
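As a hedged schematic of the convolution idea described above (not the authors' calibrated filter functions), the sketch below convolves invented Gaussian per-element filters with a toy depth-dose curve to produce a predicted emission profile.

```python
import numpy as np

z = np.arange(0.0, 200.0, 1.0)                 # depth grid, mm
# Toy depth-dose curve of a mono-energetic proton pencil beam (Bragg-peak-like)
dose = 0.3 * (z < 150) + np.exp(-((z - 150.0) ** 2) / (2 * 8.0 ** 2))

k = np.arange(-25.0, 26.0, 1.0)                # kernel support, mm
def filter_function(shift, width, amplitude):
    """Hypothetical per-element filter: a shifted, scaled Gaussian kernel."""
    return amplitude * np.exp(-((k - shift) ** 2) / (2 * width ** 2))

# One filter per chemical element (shapes and yields are assumptions, not fitted values)
filters = {"O": filter_function(-3.0, 5.0, 0.8), "C": filter_function(-2.0, 4.0, 0.5)}
fractions = {"O": 0.65, "C": 0.35}             # elemental mass fractions of the target

# Predicted prompt-gamma depth profile: weighted sum of dose convolved with element filters
prompt_gamma = sum(f * np.convolve(dose, filters[el], mode="same")
                   for el, f in fractions.items())
print("predicted emission maximum at depth (mm):", z[np.argmax(prompt_gamma)])
```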
Time-resolved fluorescence microscopy (FLIM) as an analytical tool in skin nanomedicine.
Alexiev, Ulrike; Volz, Pierre; Boreham, Alexander; Brodwolf, Robert
2017-07-01
The emerging field of nanomedicine provides new approaches for the diagnosis and treatment of diseases, for symptom relief, and for monitoring of disease progression. Topical application of drug-loaded nanoparticles for the treatment of skin disorders is a promising strategy to overcome the stratum corneum, the upper layer of the skin, which represents an effective physical and biochemical barrier. The understanding of drug penetration into skin, and of the enhanced penetration facilitated by nanocarriers, requires analytical tools that ideally allow visualization of the skin, its morphology, the drug carriers and drugs, their transport across the skin, their possible interactions, and the effects of the nanocarriers within the different skin layers. Here, we review some recent developments in the field of fluorescence microscopy, namely fluorescence lifetime imaging microscopy (FLIM), for improved characterization of nanocarriers, their interactions, and their penetration into skin. In particular, FLIM allows for the discrimination of target molecules, e.g. fluorescently tagged nanocarriers, against the autofluorescent tissue background and, due to the environmental sensitivity of the fluorescence lifetime, also offers insights into the local environment of the nanoparticle and its interactions with other biomolecules. Thus, FLIM shows the potential to overcome several limits of intensity-based microscopy. Copyright © 2017 Elsevier B.V. All rights reserved.
Advances on a Decision Analytic Approach to Exposure-Based Chemical Prioritization.
Wood, Matthew D; Plourde, Kenton; Larkin, Sabrina; Egeghy, Peter P; Williams, Antony J; Zemba, Valerie; Linkov, Igor; Vallero, Daniel A
2018-05-11
The volume and variety of manufactured chemicals are increasing, although little is known about the risks associated with the frequency and extent of human exposure to most chemicals. The EPA and the recent signing of the Lautenberg Act have both signaled the need for high-throughput methods to characterize and screen chemicals based on exposure potential, such that more comprehensive toxicity research can be informed. Prior work of Mitchell et al. using multicriteria decision analysis tools to prioritize chemicals for further research is enhanced here, resulting in a high-level chemical prioritization tool for risk-based screening. Reliable exposure information is a key gap in currently available engineering analytics to support predictive environmental and health risk assessments. An elicitation with 32 experts informed relative prioritization of risks from chemical properties and human use factors, and the values for each chemical associated with each metric were approximated with data from EPA's CP_CAT database. Three different versions of the model were evaluated using distinct weight profiles, resulting in three different ranked chemical prioritizations with only a small degree of variation across weight profiles. Future work will aim to include greater input from human factors experts and better define qualitative metrics. © 2018 Society for Risk Analysis.
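As a hedged illustration of ranking under distinct weight profiles, the sketch below applies simple additive weighting to invented chemicals, metrics, and weights; none of these values come from the elicitation described in the abstract.

```python
import numpy as np

# Hypothetical normalized exposure metrics (rows: chemicals, columns: criteria)
criteria = ["production volume", "persistence", "consumer use"]
chemicals = ["chem A", "chem B", "chem C", "chem D"]
scores = np.array([
    [0.9, 0.4, 0.7],
    [0.2, 0.8, 0.3],
    [0.6, 0.6, 0.9],
    [0.3, 0.2, 0.1],
])

# Three distinct weight profiles, e.g. from different expert groups (each sums to 1)
weight_profiles = {
    "volume-driven":      np.array([0.6, 0.2, 0.2]),
    "persistence-driven": np.array([0.2, 0.6, 0.2]),
    "use-driven":         np.array([0.2, 0.2, 0.6]),
}

for name, w in weight_profiles.items():
    total = scores @ w                       # simple additive weighting
    order = np.argsort(total)[::-1]          # highest priority first
    print(name, "->", [chemicals[i] for i in order])
```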
Nano-biosensors to detect beta-amyloid for Alzheimer's disease management.
Kaushik, Ajeet; Jayant, Rahul Dev; Tiwari, Sneham; Vashist, Arti; Nair, Madhavan
2016-06-15
Beta-amyloid (β-A) peptides are potential biomarkers for monitoring Alzheimer's disease (AD) for diagnostic purposes. Increased β-A levels are neurotoxic, induce oxidative stress in the brain resulting in neurodegeneration, and cause dementia. As of now, no sensitive and inexpensive method is available for β-A detection under physiological and pathological conditions. Although available methods such as neuroimaging, enzyme-linked immunosorbent assay (ELISA), and polymerase chain reaction (PCR) can detect β-A, they have not yet been extended to point-of-care (POC) use because of the sophisticated equipment required, the need for high expertise, complicated operations, and the challenge of reaching low detection limits. Recently, a β-A antibody-based electrochemical immunosensing approach has been explored to detect β-A at pM levels within 30-40 min, compared with the 6-8 h of an ELISA test. The introduction of nano-enabling electrochemical sensing technology could enable rapid detection of β-A at POC and may facilitate fast personalized health care delivery. This review explores recent advancements in nano-enabling electrochemical β-A sensing technologies towards POC application to AD management. Such tools could support AD management programs by providing the bioinformatic data needed to optimize therapeutics for the diagnosis and management of neurodegenerative diseases. Copyright © 2016 Elsevier B.V. All rights reserved.
Detection of munitions grade g-series nerve agents using Raman excitation at 1064 nm
NASA Astrophysics Data System (ADS)
Roy, Eric; Wilcox, Phillip G.; Hoffland, Soren; Pardoe, Ian
2015-05-01
Raman spectroscopy is a powerful tool for obtaining molecular structure information of a sample. While Raman spectroscopy is a common laboratory-based analytical tool, miniaturization of opto-electronic components has allowed handheld Raman analyzers to become commercially available. These handheld systems are utilized by military and first-responder operators tasked with rapidly identifying potentially hazardous chemicals in the field. However, one limitation of many handheld Raman detection systems is strong interference caused by fluorescence of the sample or underlying surface, which obscures the characteristic Raman signature of the target analyte. Munitions-grade chemical warfare agents (CWAs) are produced and stored in large batches and typically have more impurities from the storage container, degradation, or unreacted precursors. In this work, Raman spectra of munitions-grade CWAs were collected using a handheld Raman spectrometer with a 1064 nm excitation laser. While Raman scattering generated by a 1064 nm laser is inherently less efficient than excitation at shorter wavelengths, high-quality spectra were easily obtained due to significantly reduced fluorescence of the munitions-grade CWAs. The spectra of these less pure, but more operationally relevant, munitions-grade CWAs were then compared to spectra of CASARM-grade CWAs, as well as Raman spectra collected using the more common 785 nm excitation laser.
Raman spectroscopy as a tool to understand Kerogen production potential
NASA Astrophysics Data System (ADS)
Khatibi, S.; Ostadhassan, M.; Mohammed, R. A.; Alexeyev, A.
2017-12-01
A lot of attention has been given to unconventional reservoirs, specifically oil shale in North America, during the last decades. Understanding kerogen properties in terms of maturity and production potential is crucial for unconventional reservoirs, since the amount of hydrocarbon generated is a function of kerogen type and content in the formation and of the magnitude and duration of the heat and pressure applied. This study presents a non-destructive and fast method to determine kerogen properties in terms of Rock-Eval parameters by means of Raman spectroscopy. Samples were gathered from the upper and lower Bakken Formation, with different maturities at different depths. Raman spectroscopy, a powerful nondestructive analytical tool for molecular reconstruction, was employed to obtain Raman spectra of the different samples. In the next step, Rock-Eval analysis was performed for each sample and different measurements were made. Then, in an original approach, correlations between Rock-Eval parameters and Raman spectroscopy results were established to fully understand how kerogen productivity potential is reflected in the Raman response. Results showed that maturity-related parameters (Ro, Tmax), S1 (oil already generated in the rock), S2 (potential hydrocarbon), and OSI (oil saturation index, an indication of potential oil flow zones) can be correlated to band separation, D band intensity, G band intensity, and G/D intensity, respectively. The proposed method provides a fast, nondestructive way to evaluate kerogen quality, even in the field, without any special sample preparation.
Genetic Modifiers and Oligogenic Inheritance
Kousi, Maria; Katsanis, Nicholas
2015-01-01
Despite remarkable progress in the identification of mutations that drive genetic disorders, progress in understanding the effect of genetic background on the penetrance and expressivity of causal alleles has been modest, in part because of the methodological challenges in identifying genetic modifiers. Nonetheless, the progressive discovery of modifier alleles has improved both our interpretative ability and our analytical tools to dissect such phenomena. In this review, we analyze the genetic properties and behaviors of modifiers as derived from studies in patient populations and model organisms and we highlight conceptual and technological tools used to overcome some of the challenges inherent in modifier mapping and cloning. Finally, we discuss how the identification of these modifiers has facilitated the elucidation of biological pathways and holds the potential to improve the clinical predictive value of primary causal mutations and to develop novel drug targets. PMID:26033081
How can knowledge discovery methods uncover spatio-temporal patterns in environmental data?
NASA Astrophysics Data System (ADS)
Wachowicz, Monica
2000-04-01
This paper proposes the integration of KDD, GVis and STDB as a long-term strategy, which will allow users to apply knowledge discovery methods for uncovering spatio-temporal patterns in environmental data. The main goal is to combine innovative techniques and associated tools for exploring very large environmental data sets in order to arrive at valid, novel, potentially useful, and ultimately understandable spatio-temporal patterns. The GeoInsight approach is described using the principles and key developments in the research domains of KDD, GVis, and STDB. The GeoInsight approach aims at the integration of these research domains in order to provide tools for performing information retrieval, exploration, analysis, and visualization. The result is a knowledge-based design, which involves visual thinking (perceptual-cognitive process) and automated information processing (computer-analytical process).
Logic flowgraph methodology - A tool for modeling embedded systems
NASA Technical Reports Server (NTRS)
Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.
1991-01-01
The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.
Synthesis of Feedback Controller for Chaotic Systems by Means of Evolutionary Techniques
NASA Astrophysics Data System (ADS)
Senkerik, Roman; Oplatkova, Zuzana; Zelinka, Ivan; Davendra, Donald; Jasek, Roman
2011-06-01
This research deals with a synthesis of control law for three selected discrete chaotic systems by means of analytic programming. The novelty of the approach is that a tool for symbolic regression, analytic programming, is used for this kind of difficult problem. The paper consists of descriptions of analytic programming as well as the chaotic systems and the cost function used. For experimentation, the Self-Organizing Migrating Algorithm (SOMA) with analytic programming was used.
SMARTe is being developed to give stakeholders information resources, analytical tools, communication strategies, and a decision analysis approach to be able to make better decisions regarding future uses of property. The development of the communication tools and decision analys...
NASA Astrophysics Data System (ADS)
Luo, Xichun; Tong, Zhen; Liang, Yingchun
2014-12-01
In this article, the shape transferability of nanoscale multi-tip diamond tools in diamond turning for scale-up manufacturing of nanostructures is demonstrated. Atomistic multi-tip diamond tool models were built with different tool geometries in terms of the difference in the tip cross-sectional shape, tip angle, and the feature of tool tip configuration, to determine their effect on the applied forces and the machined nano-groove geometries. The quality of machined nanostructures was characterized by the thickness of the deformed layers and the dimensional accuracy achieved. Simulation results show that diamond turning using nanoscale multi-tip tools offers tremendous shape transferability in machining nanostructures. Both periodic and non-periodic nano-grooves with different cross-sectional shapes can be successfully fabricated using the multi-tip tools. A hypothesis of the minimum designed ratio of tool tip distance to tip base width (L/Wf) of the nanoscale multi-tip diamond tool for high-precision machining of nanostructures was proposed based on an analytical study of the quality of the nanostructures fabricated using different types of the multi-tip tools. Nanometric cutting trials using nanoscale multi-tip diamond tools (different in L/Wf) fabricated by focused ion beam (FIB) were then conducted to verify the hypothesis. The investigations done in this work imply the potential of using the nanoscale multi-tip diamond tool for the deterministic fabrication of periodic and non-periodic nanostructures, which opens up the feasibility of using the process as a versatile manufacturing technique in nanotechnology.
Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays
Hsieh, Helen V.; Dantzler, Jeffrey L.; Weigl, Bernhard H.
2017-01-01
Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy to use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low resource settings. The simplicity in the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, the manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness. PMID:28555034
DOE Office of Scientific and Technical Information (OSTI.GOV)
Femec, D.A.
This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents some of the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices.
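Purely as a hypothetical sketch of the sample-tracking concept (the table name and fields are invented and are not the INEL schema), a minimal tracking table might look like this in SQLite via Python:

```python
import sqlite3

conn = sqlite3.connect(":memory:")                 # throwaway in-memory database
conn.execute("""
    CREATE TABLE samples (
        sample_id TEXT PRIMARY KEY,
        customer  TEXT,
        analysis  TEXT,   -- e.g. gamma spectrometry, alpha spectrometry
        received  TEXT,   -- ISO date the sample entered the laboratory
        status    TEXT    -- received / in-analysis / reported
    )
""")
conn.execute("INSERT INTO samples VALUES ('S-0001', 'RML', 'gamma spec', '1995-03-01', 'received')")
conn.execute("UPDATE samples SET status = 'in-analysis' WHERE sample_id = 'S-0001'")
for row in conn.execute("SELECT sample_id, analysis, status FROM samples"):
    print(row)
```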
Exhaled breath condensate – from an analytical point of view
Dodig, Slavica; Čepelak, Ivana
2013-01-01
Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process for EBC analysis was investigated across the pre-analytical (formation, collection, and storage of EBC), analytical (sensitivity of applied methods, standardization), and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. Limitations in the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: concentrations of EBC constituents are low, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical, and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected, and EBC analysis can hopefully be used in clinical practice, both in diagnosis and in the longitudinal follow-up of patients, resulting in better disease outcomes. PMID:24266297
Shelf Life of Food Products: From Open Labeling to Real-Time Measurements.
Corradini, Maria G
2018-03-25
The labels currently used on food and beverage products only provide consumers with a rough guide to their expected shelf lives because they assume that a product only experiences a limited range of predefined handling and storage conditions. These static labels do not take into consideration conditions that might shorten a product's shelf life (such as temperature abuse), which can lead to problems associated with food safety and waste. Advances in shelf-life estimation have the potential to improve the safety, reliability, and sustainability of the food supply. Selection of appropriate kinetic models and data-analysis techniques is essential to predict shelf life, to account for variability in environmental conditions, and to allow real-time monitoring. Novel analytical tools to determine safety and quality attributes in situ coupled with modern tracking technologies and appropriate predictive tools have the potential to provide accurate estimations of the remaining shelf life of a food product in real time. This review summarizes the necessary steps to attain a transition from open labeling to real-time shelf-life measurements.
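To make the "kinetic model plus condition monitoring" idea concrete, here is a hedged sketch that combines first-order quality loss with Arrhenius temperature dependence over a logged temperature history; the rate constant, activation energy, and temperature log are illustrative assumptions, not product data.

```python
import numpy as np

# Illustrative first-order degradation with Arrhenius temperature dependence
k_ref = 0.05          # 1/day, rate constant at the reference temperature (assumed)
T_ref = 277.15        # K (4 degC reference storage)
Ea = 80e3             # J/mol, activation energy (assumed)
R = 8.314             # J/(mol K)

def rate(T_kelvin):
    """Degradation rate constant at temperature T (Arrhenius shift from T_ref)."""
    return k_ref * np.exp(-Ea / R * (1 / T_kelvin - 1 / T_ref))

# Hypothetical logged temperature history: one reading per day, with a 2-day abuse at 25 degC
temps_c = np.array([4, 4, 4, 25, 25, 4, 4, 4, 4, 4], dtype=float)
k_daily = rate(temps_c + 273.15)
quality = np.exp(-np.cumsum(k_daily * 1.0))        # remaining quality fraction, day by day

limit = 0.7                                        # end of shelf life when quality < 70%
day_reached = int(np.argmax(quality < limit)) if (quality < limit).any() else len(quality)
print("quality trajectory:", np.round(quality, 2), "| limit reached on day", day_reached)
```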
Novel Substrates as Sources of Ancient DNA: Prospects and Hurdles
Green, Eleanor Joan
2017-01-01
Following the discovery in the late 1980s that hard tissues such as bones and teeth preserve genetic information, the field of ancient DNA analysis has typically concentrated upon these substrates. The onset of high-throughput sequencing, combined with optimized DNA recovery methods, has enabled the analysis of a myriad of ancient species and specimens worldwide, dating back to the Middle Pleistocene. Despite the growing sophistication of analytical techniques, the genetic analysis of substrates other than bone and dentine remain comparatively “novel”. Here, we review analyses of other biological substrates which offer great potential for elucidating phylogenetic relationships, paleoenvironments, and microbial ecosystems including (1) archaeological artifacts and ecofacts; (2) calcified and/or mineralized biological deposits; and (3) biological and cultural archives. We conclude that there is a pressing need for more refined models of DNA preservation and bespoke tools for DNA extraction and analysis to authenticate and maximize the utility of the data obtained. With such tools in place the potential for neglected or underexploited substrates to provide a unique insight into phylogenetics, microbial evolution and evolutionary processes will be realized. PMID:28703741
Monitoring the Productivity of Coastal Systems Using PH ...
The impact of nutrient inputs on the eutrophication of coastal ecosystems has been one of the great themes of coastal ecology. Countless studies have been devoted to quantifying how human sources of nutrients, in particular nitrogen (N), affect coastal water bodies. These studies, which often measure in situ concentrations of nutrients, chlorophyll, and dissolved oxygen, are often spatially and/or temporally intensive and expensive. We provide evidence from experimental mesocosms, coupled with data from the water column of a well-mixed estuary, that pH can be a quick, inexpensive, and integrative measure of net ecosystem metabolism. In some cases, this approach is a more sensitive tracer of production than direct measurements of chlorophyll and carbon-14. Taken together, our data suggest that pH is a sensitive, but often overlooked, tool for monitoring estuarine production. This presentation will explore the potential utility of pH as an indicator of ecosystem productivity. Our data suggest that pH is a sensitive and potentially integrative indicator of net ecosystem production. It should not be overlooked that measuring pH is quick, easy, and inexpensive, further increasing its value as an analytical tool.
Chen, Xiao-Fei; Wu, Hai-Tang; Tan, Guang-Guo; Zhu, Zhen-Yu; Chai, Yi-Feng
2011-11-01
With the expansion of the herbal medicine (HM) market, the question of how to apply up-to-date analytical tools to the qualitative analysis of HMs to assure their quality, safety, and efficacy has attracted great attention. Because of its inherent capabilities of accurate mass measurement and multiple-stage analysis, the integrated strategy of liquid chromatography (LC) coupled with time-of-flight mass spectrometry (TOF-MS) and ion trap mass spectrometry (IT-MS) is well suited to serve as a qualitative analysis tool in this field. The purpose of this review is to provide an overview of the potential of this integrated strategy, including a review of the general features of LC-IT-MS and LC-TOF-MS, the advantages of their combination, the common procedures for structure elucidation, the potential of LC-hybrid-IT-TOF/MS, and a summary and discussion of the applications of the integrated strategy to HM qualitative analysis (2006-2011). The advantages and future developments of LC coupled with IT and TOF-MS are highlighted.
A Decision Analytic Approach to Exposure-Based Chemical Prioritization
Mitchell, Jade; Pabon, Nicolas; Collier, Zachary A.; Egeghy, Peter P.; Cohen-Hubal, Elaine; Linkov, Igor; Vallero, Daniel A.
2013-01-01
The manufacture of novel synthetic chemicals has increased in volume and variety, but often the environmental and health risks are not fully understood in terms of toxicity and, in particular, exposure. While efforts to assess risks have generally been effective when sufficient data are available, the hazard and exposure data necessary to assess risks adequately are unavailable for the vast majority of chemicals in commerce. The US Environmental Protection Agency has initiated the ExpoCast Program to develop tools for rapid chemical evaluation based on potential for exposure. In this context, a model is presented in which chemicals are evaluated based on inherent chemical properties and behaviorally-based usage characteristics over the chemical’s life cycle. These criteria are assessed and integrated within a decision analytic framework, facilitating rapid assessment and prioritization for future targeted testing and systems modeling. A case study outlines the prioritization process using 51 chemicals. The results show a preliminary relative ranking of chemicals based on exposure potential. The strength of this approach is the ability to integrate relevant statistical and mechanistic data with expert judgment, allowing for an initial tier assessment that can further inform targeted testing and risk management strategies. PMID:23940664
Roche, Bryan; O’Reilly, Anthony; Gavin, Amanda; Ruiz, Maria R.; Arancibia, Gabriela
2012-01-01
Background The development of implicit tests for measuring biases and behavioral predispositions is a recent development within psychology. While such tests are usually researched within a social-cognitive paradigm, behavioral researchers have also begun to view these tests as potential tests of conditioning histories, including in the sexual domain. Objective The objective of this paper is to illustrate the utility of a behavioral approach to implicit testing and means by which implicit tests can be built to the standards of behavioral psychologists. Design Research findings illustrating the short history of implicit testing within the experimental analysis of behavior are reviewed. Relevant parallel and overlapping research findings from the field of social cognition and on the Implicit Association Test are also outlined. Results New preliminary data obtained with both normal and sex offender populations are described in order to illustrate how behavior-analytically conceived implicit tests may have potential as investigative tools for assessing histories of sexual arousal conditioning and derived stimulus associations. Conclusion It is concluded that popular implicit tests are likely sensitive to conditioned and derived stimulus associations in the history of the test-taker rather than ‘unconscious cognitions’, per se. PMID:24693346
Zhao, Chunxin; Cui, Hongbo; Duan, Jing; Zhang, Shenghai; Lv, Jiagen
2018-02-06
We report the unique self-catalyzing chemiluminescence (CL) of the luminol-diazonium ion (N2+-luminol) and its analytical potential. Visual CL emission was initially observed when N2+-luminol was subjected to alkaline aqueous H2O2 without the aid of any catalysts. Further experimental investigations found peroxidase-like activity of N2+-luminol on the cleavage of H2O2 into the OH• radical. Together with other experimental evidence, the CL mechanism is suggested as the activation of N2+-luminol and its dediazotization product 3-hydroxyl luminol by the OH• radical into the corresponding intermediate radicals, and then further oxidation to excited-state 3-N2+-phthalic acid and 3-hydroxyphthalic acid, which finally produce 415 nm CL. The self-catalyzing CL of N2+-luminol provides an opportunity to achieve attractive catalyst-free CL detection of H2O2. Experiments demonstrated detection sensitivity at the 10(-8) M level for H2O2 as well as for glucose or uric acid if presubjected to glucose oxidase or uricase. With the exampled determination of serum glucose and uric acid, N2+-luminol shows its analytical potential for other analytes linked to the production or consumption of H2O2. Under physiological conditions, N2+-luminol exhibits highly selective and sensitive CL toward singlet oxygen (1O2) among the common reactive oxygen species. This capacity supports the significant application of N2+-luminol for detecting 1O2 in live animals. By imaging arthritis in LEW rats, N2+-luminol CL is demonstrated as a potential tool for mapping inflammation-relevant biological events in a live body.
Blom, H; Gösch, M
2004-04-01
Over the past few years, we have witnessed a tremendous surge of interest in so-called array-based miniaturised analytical systems due to their value as extremely powerful tools for high-throughput sequence analysis, drug discovery and development, and diagnostic tests in medicine (see articles in Issue 1). Terminologies that have been used to describe these array-based bioscience systems include (but are not limited to): DNA-chip, microarrays, microchip, biochip, DNA-microarrays and genome chip. Potential technological benefits of introducing these miniaturised analytical systems include improved accuracy, multiplexing, lower sample and reagent consumption, disposability, and decreased analysis times, just to mention a few examples. Among the many alternative principles of detection-analysis (e.g. chemiluminescence, electroluminescence and conductivity), fluorescence-based techniques are widely used, examples being fluorescence resonance energy transfer, fluorescence quenching, fluorescence polarisation, time-resolved fluorescence, and fluorescence fluctuation spectroscopy (see articles in Issue 11). Time-dependent fluctuations of fluorescent biomolecules with different molecular properties, like molecular weight, translational and rotational diffusion time, colour and lifetime, potentially provide all the kinetic and thermodynamic information required in analysing complex interactions. In this mini-review article, we present recent extensions aimed at implementing parallel laser excitation and parallel fluorescence detection that can lead to an even further increase in throughput in miniaturised array-based analytical systems. We also report on developments and characterisations of multiplexing extensions that allow multifocal laser excitation together with matched parallel fluorescence detection for parallel confocal dynamical fluorescence fluctuation studies at the single biomolecule level.
Foraminiferal Stable Isotope Geochemistry At The Micrometer Scale: Is It A Dream Or Reality?
NASA Astrophysics Data System (ADS)
Misra, S.; Shuttleworth, S.; Lloyd, N. S.; Sadekov, A.; Elderfield, H.
2012-12-01
Over the last few decades, trace metal and stable isotope compositions of foraminiferal shells have become one of the major tools for studying past oceans and associated climate change. Empirical calibrations of δ11B, δ18O, Mg/Ca, Cd/Ca, and Ba/Ca shell compositions have linked them to various environmental parameters such as seawater pH, temperature, salinity, and productivity. Despite their common use as proxies, little is known about the mechanisms of trace metal incorporation into foraminiferal calcite. Trace metal partition coefficients for foraminiferal calcite differ significantly from those of inorganic calcite precipitates, underlining strong biological control on metal transport to the calcification sites and incorporation into the calcite. The microscale distribution of light element isotopes (e.g., Li, B, Mg) could potentially provide unique insight into these biomineralization processes, improving our understanding of foraminiferal geochemistry. In this work we explore the potential of recent advances in analytical geochemistry by employing laser ablation and multi-collector ICP-MS to study the microscale distribution of Mg isotopes across individual foraminiferal shells and to perform δ11B and δ7Li analyses of individual shell chambers. The analytical setup includes an Analyte.G2 193 nm excimer laser ablation system with a two-volume ablation cell connected to a Thermo Scientific NEPTUNE Plus MC-ICP-MS with the Jet Interface option. We will discuss method limitations and advantages for foraminiferal geochemistry as well as our data on Mg isotope distributions within shells of planktonic foraminifera.
Software Tools on the Peregrine System
NREL overview page describing the variety of software tools available on the Peregrine high-performance computing system, including debugger and performance-analysis tools for understanding the behavior of MPI applications (e.g., Intel VTune), an environment for statistical computing and graphics, and VirtualGL/TurboVNC for remote visualization and analytics.
Mass spectrometry as a quantitative tool in plant metabolomics
Jorge, Tiago F.; Mata, Ana T.
2016-01-01
Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967
Advancing Risk Assessment through the Application of Systems Toxicology
Sauer, John Michael; Kleensang, André; Peitsch, Manuel C.; Hayes, A. Wallace
2016-01-01
Risk assessment is the process of quantifying the probability of a harmful effect to individuals or populations from human activities. Mechanistic approaches to risk assessment have been generally referred to as systems toxicology. Systems toxicology makes use of advanced analytical and computational tools to integrate classical toxicology and quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Three presentations including two case studies involving both in vitro and in vivo approaches described the current state of systems toxicology and the potential for its future application in chemical risk assessment. PMID:26977253
Applications of Infrared and Raman Spectroscopies to Probiotic Investigation
Santos, Mauricio I.; Gerbino, Esteban; Tymczyszyn, Elizabeth; Gomez-Zavaglia, Andrea
2015-01-01
In this review, we overview the most important contributions of vibrational spectroscopy-based techniques to the study of probiotics and lactic acid bacteria. First, we briefly introduce the fundamentals of these techniques, together with the main multivariate analytical tools used for spectral interpretation. Then, four main groups of applications are reported: (a) bacterial taxonomy (Subsection 4.1); (b) bacterial preservation (Subsection 4.2); (c) monitoring processes involving lactic acid bacteria and probiotics (Subsection 4.3); (d) imaging-based applications (Subsection 4.4). A final conclusion, underlining the potential of these techniques, is presented. PMID:28231205
Ellena, Silvano; Viale, Alessandra; Gobetto, Roberto; Aime, Silvio
2012-08-01
Para-hydrogen-induced polarization effects have been observed in the (29)Si NMR spectra of trimethylsilyl para-hydrogenated molecules. The high signal enhancements and the long T(1) values observed for the (29)Si hyperpolarized resonances point toward the possibility of using (29)Si for hyperpolarization applications. A method for the discrimination of multiple compounds and/or complex mixtures of hydroxylic compounds (such as steroids), consisting of the silylation of alcoholic functionalities with an unsaturated silylalkyl moiety and subsequent reaction with para-H(2), is proposed. Copyright © 2012 John Wiley & Sons, Ltd.
NASTRAN as an analytical research tool for composite mechanics and composite structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.
1976-01-01
Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.
Durham, Erin-Elizabeth A; Yu, Xiaxia; Harrison, Robert W
2014-12-01
Effective machine learning handles large datasets efficiently. One key feature of handling large data is the use of databases such as MySQL. The freeware fuzzy decision tree induction tool, FDT, is a scalable supervised-classification software tool implementing fuzzy decision trees. It is based on an optimized fuzzy ID3 (FID3) algorithm. FDT 2.0 improves upon FDT 1.0 by bridging the gap between data science and data engineering: it combines a robust decisioning tool with data retention for future decisions, so that the tool does not need to be recalibrated from scratch every time a new decision is required. In this paper we briefly review the analytical capabilities of the freeware FDT tool and its major features and functionalities; examples of large biological datasets from HIV, microRNAs and sRNAs are included. This work shows how to integrate fuzzy decision algorithms with modern database technology. In addition, we show that integrating the fuzzy decision tree induction tool with database storage allows for optimal user satisfaction in today's Data Analytics world.
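The FDT internals are not reproduced here; only as a hedged illustration of the fuzzy ID3 idea it builds on, the sketch below assigns triangular fuzzy memberships to a numeric attribute and computes the membership-weighted entropy used to score a split (membership shapes and data are invented).

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular fuzzy membership with peak at b and support [a, c]."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Hypothetical expression values and binary class labels
x = np.array([0.2, 0.4, 0.5, 0.7, 0.9, 1.1, 1.3, 1.5])
y = np.array([0,   0,   0,   1,   0,   1,   1,   1  ])

# Fuzzy partition of the attribute into "low" and "high"
mu_low = triangular(x, -0.5, 0.2, 1.0)
mu_high = triangular(x, 0.6, 1.5, 2.5)

def fuzzy_entropy(mu, labels):
    """Class entropy weighted by fuzzy membership (a FID3-style node impurity)."""
    total = mu.sum()
    probs = np.array([mu[labels == c].sum() / total for c in np.unique(labels)])
    probs = probs[probs > 0]
    return -(probs * np.log2(probs)).sum()

node_entropy = fuzzy_entropy(np.ones_like(x, dtype=float), y)
split_entropy = sum(m.sum() / (mu_low.sum() + mu_high.sum()) * fuzzy_entropy(m, y)
                    for m in (mu_low, mu_high))
print(f"information gain of the fuzzy split: {node_entropy - split_entropy:.3f}")
```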
Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Y; Glascoe, L
The computational modeling of the biodegradation of contaminated groundwater systems accounting for biochemical reactions coupled to contaminant transport is a valuable tool for both the field engineer/planner with limited computational resources and the expert computational researcher less constrained by time and computer power. There exist several analytical and numerical computer models that have been and are being developed to cover the practical needs put forth by users across this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.
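As a hedged sketch of the "analytical screening model" end of this tradeoff, the snippet below evaluates the leading term of a classical one-dimensional advection-dispersion solution with first-order biodegradation for a continuous source; all parameter values are illustrative.

```python
import numpy as np
from scipy.special import erfc

def screening_plume(x, t, v=0.1, D=0.05, lam=0.002, C0=1.0):
    """Leading-term approximation of the 1-D advection-dispersion-decay solution
    for a continuous source at x = 0 (units: m, days, 1/day; values illustrative)."""
    gamma = v * np.sqrt(1.0 + 4.0 * lam * D / v**2)   # decay-modified velocity term
    return 0.5 * C0 * np.exp(x * (v - gamma) / (2.0 * D)) \
           * erfc((x - gamma * t) / (2.0 * np.sqrt(D * t)))

x = np.linspace(0.0, 30.0, 7)                       # distance downgradient, m
print(np.round(screening_plume(x, t=365.0), 4))     # concentration profile after one year
```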
Fan Du; Shneiderman, Ben; Plaisant, Catherine; Malik, Sana; Perer, Adam
2017-06-01
The growing volume and variety of data present both opportunities and challenges for visual analytics. Addressing these challenges is needed for big data to provide valuable insights and novel solutions for business, security, social media, and healthcare. In the case of temporal event sequence analytics it is the number of events in the data and the variety of temporal sequence patterns that challenge users of visual analytic tools. This paper describes 15 strategies for sharpening analytic focus that analysts can use to reduce the data volume and pattern variety. Four groups of strategies are proposed: (1) extraction strategies, (2) temporal folding, (3) pattern simplification strategies, and (4) iterative strategies. For each strategy, we provide examples of the use and impact of this strategy on volume and/or variety. Examples are selected from 20 case studies gathered from our own work, from the literature, or from email interviews with individuals who conducted the analyses and developers who observed analysts using the tools. Finally, we discuss how these strategies might be combined and report on the feedback from 10 senior event sequence analysts.
Diving deeper into Zebrafish development of social behavior: analyzing high resolution data.
Buske, Christine; Gerlai, Robert
2014-08-30
Vertebrate model organisms have been utilized in high throughput screening but only with substantial cost and human capital investment. The zebrafish is a vertebrate model species that is a promising and cost-effective candidate for efficient high throughput screening. Larval zebrafish have already been successfully employed in this regard (Lessman, 2011), but adult zebrafish also show great promise. High throughput screening requires the use of a large number of subjects and collection of a substantial amount of data. Collection of data is only one of the demanding aspects of screening. However, in most screening approaches that involve behavioral data, the main bottleneck that slows throughput is the time-consuming aspect of analysis of the collected data. Some automated analytical tools do exist, but often they only work for one subject at a time, eliminating the possibility of fully utilizing zebrafish as a screening tool. This is a particularly important limitation for such complex phenotypes as social behavior. Testing multiple fish at a time can reveal complex social interactions but it may also allow the identification of outliers from a group of mutagenized or pharmacologically treated fish. Here, we describe a novel method using a custom software tool developed within our laboratory, which enables tracking multiple fish, in combination with a sophisticated analytical approach for summarizing and analyzing high resolution behavioral data. This paper focuses on the latter, the analytic tool, which we have developed using the R programming language and environment for statistical computing. We argue that combining sophisticated data collection methods with appropriate analytical tools will propel zebrafish into the future of neurobehavioral genetic research. Copyright © 2014. Published by Elsevier B.V.
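The published analysis tool is written in R; purely as an illustration of the kind of group-level summary it computes (array layout, units, and values below are assumptions), here is a Python sketch that derives the mean inter-individual distance and a per-fish distance-to-group score from multi-fish tracking coordinates:

    # Hedged sketch (the published tool is in R; this is an illustrative analogue
    # with an assumed data layout): summarize group cohesion from tracking data.
    import numpy as np

    # positions[frame, fish, coord]: assumed shape (n_frames, n_fish, 2), in cm.
    rng = np.random.default_rng(0)
    positions = rng.uniform(0, 40, size=(1000, 5, 2))

    # Mean pairwise (inter-individual) distance per frame.
    diffs = positions[:, :, None, :] - positions[:, None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)                # (frames, fish, fish)
    iu = np.triu_indices(positions.shape[1], k=1)
    mean_iid = dists[:, iu[0], iu[1]].mean(axis=1)        # one value per frame

    # Per-fish outlier score: average distance to the rest of the group.
    per_fish = dists.sum(axis=2) / (positions.shape[1] - 1)
    print("mean inter-individual distance:", mean_iid.mean())
    print("per-fish mean distance to group:", per_fish.mean(axis=0))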
Structuring modeling and simulation analysis for evacuation planning and operations.
DOT National Transportation Integrated Search
2009-06-01
This document is intended to provide guidance to decision-makers at agencies and jurisdictions considering the role of analytical tools in evacuation planning and operations. It is often unclear what kind of analytical approach may be of most value, ...
Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.
Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà
2017-10-01
Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.
Simultaneous Multiparameter Cellular Energy Metabolism Profiling of Small Populations of Cells.
Kelbauskas, Laimonas; Ashili, Shashaanka P; Lee, Kristen B; Zhu, Haixin; Tian, Yanqing; Meldrum, Deirdre R
2018-03-12
Functional and genomic heterogeneity of individual cells are central players in a broad spectrum of normal and disease states. Our knowledge about the role of cellular heterogeneity in tissue and organism function remains limited due to analytical challenges one encounters when performing single cell studies in the context of cell-cell interactions. Information based on bulk samples represents ensemble averages over populations of cells, while data generated from isolated single cells do not account for intercellular interactions. We describe a new technology and demonstrate two important advantages over existing technologies: first, it enables multiparameter energy metabolism profiling of small cell populations (<100 cells)-a sample size that is at least an order of magnitude smaller than other, commercially available technologies; second, it can perform simultaneous real-time measurements of oxygen consumption rate (OCR), extracellular acidification rate (ECAR), and mitochondrial membrane potential (MMP)-a capability not offered by any other commercially available technology. Our results revealed substantial diversity in response kinetics of the three analytes in dysplastic human epithelial esophageal cells and suggest the existence of varying cellular energy metabolism profiles and their kinetics among small populations of cells. The technology represents a powerful analytical tool for multiparameter studies of cellular function.
Dinov, Ivo D
2016-01-01
Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'.
Development of a new semi-analytical model for cross-borehole flow experiments in fractured media
Roubinet, Delphine; Irving, James; Day-Lewis, Frederick D.
2015-01-01
Analysis of borehole flow logs is a valuable technique for identifying the presence of fractures in the subsurface and estimating properties such as fracture connectivity, transmissivity and storativity. However, such estimation requires the development of analytical and/or numerical modeling tools that are well adapted to the complexity of the problem. In this paper, we present a new semi-analytical formulation for cross-borehole flow in fractured media that links transient vertical-flow velocities measured in one or a series of observation wells during hydraulic forcing to the transmissivity and storativity of the fractures intersected by these wells. In comparison with existing models, our approach presents major improvements in terms of computational expense and potential adaptation to a variety of fracture and experimental configurations. After derivation of the formulation, we demonstrate its application in the context of sensitivity analysis for a relatively simple two-fracture synthetic problem, as well as for field-data analysis to investigate fracture connectivity and estimate fracture hydraulic properties. These applications provide important insights regarding (i) the strong sensitivity of fracture property estimates to the overall connectivity of the system; and (ii) the non-uniqueness of the corresponding inverse problem for realistic fracture configurations.
Ialongo, Cristiano; Bernardini, Sergio
2018-06-18
There is a compelling need for quality tools that enable effective control of the extra-analytical phase. In this regard, Six Sigma seems to offer a valid methodological and conceptual opportunity, and in recent times, the International Federation of Clinical Chemistry and Laboratory Medicine has adopted it for indicating the performance requirements for non-analytical laboratory processes. However, Six Sigma implies a distinction between short-term and long-term quality that is based on the dynamics of the processes. These concepts are still not widely applied in the field of laboratory medicine, although they are of fundamental importance to exploit the full potential of this methodology. This paper reviews the Six Sigma quality concepts and shows how they originated from Shewhart's control charts, to which they are not an alternative but a complement. It also discusses the dynamic nature of processes and how it arises, particularly concerning long-term dynamic mean variation, and explains why this leads to the fundamental distinction of quality we previously mentioned.
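For orientation (example values are assumptions, not data from the paper), the sigma-metric calculation commonly used in laboratory medicine, together with the conventional 1.5-sigma allowance that separates short-term from long-term process performance, reduces to a few lines:

    # Illustrative only (assumed example values): sigma metric as commonly used
    # in laboratory medicine, with the conventional 1.5-sigma shift separating
    # short-term from long-term process performance.
    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """Sigma = (allowable total error - |bias|) / imprecision, all in percent."""
        return (tea_pct - abs(bias_pct)) / cv_pct

    short_term = sigma_metric(tea_pct=10.0, bias_pct=1.5, cv_pct=2.0)
    long_term = short_term - 1.5   # conventional allowance for long-term mean drift
    print(f"short-term sigma: {short_term:.2f}, long-term sigma: {long_term:.2f}")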
Nanoscaled aptasensors for multi-analyte sensing
Saberian-Borujeni, Mehdi; Johari-Ahar, Mohammad; Hamzeiy, Hossein; Barar, Jaleh; Omidi, Yadollah
2014-01-01
Introduction: Nanoscaled aptamers (Aps), as short single-stranded DNA or RNA oligonucleotides, are able to bind to their specific targets with high affinity, and are therefore considered powerful diagnostic and analytical sensing tools (the so-called "aptasensors"). Aptamers are selected from a random pool of oligonucleotides through a procedure known as "systematic evolution of ligands by exponential enrichment". Methods: In this work, the most recent studies in the field of aptasensors are reviewed and discussed with a main focus on the potential of aptasensors for multianalyte detection. Results: Due to the specific folding capability of aptamers in the presence of analyte, aptasensors have been successfully exploited for the detection of a wide range of small and large molecules (e.g., drugs and their metabolites, toxins, and associated biomarkers in various diseases) at very low concentrations in biological fluids/samples, even in the presence of interfering species. Conclusion: Biological samples are generally complex matrices. Hence, the development of aptasensors with the capability to determine various targets simultaneously within a biological matrix remains the main challenge. To this end, integration of key scientific domains such as bioengineering and systems biology with biomedical research is inevitable. PMID:25671177
NASA Astrophysics Data System (ADS)
Huang, Junqi; Goltz, Mark N.
2005-11-01
The potential for using pairs of so-called horizontal flow treatment wells (HFTWs) to effect in situ capture and treatment of contaminated groundwater has recently been demonstrated. To apply this new technology, design engineers need to be able to simulate the relatively complex groundwater flow patterns that result from HFTW operation. In this work, a three-dimensional analytical solution for steady flow in a homogeneous, anisotropic, contaminated aquifer is developed to efficiently calculate the interflow of water circulating between a pair of HFTWs and map the spatial extent of contaminated groundwater flowing from upgradient that is captured. The solution is constructed by superposing the solutions for the flow fields resulting from operation of partially penetrating wells. The solution is used to investigate the flow resulting from operation of an HFTW well pair and to quantify how aquifer anisotropy, well placement, and pumping rate impact capture zone width and interflow. The analytical modeling method presented here provides a fast and accurate technique for representing the flow field resulting from operation of HFTW systems, and represents a tool that can be useful in designing in situ groundwater contamination treatment systems.
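As a greatly simplified two-dimensional illustration of the superposition idea (the paper's solution is three-dimensional, anisotropic, and for partially penetrating wells; every parameter below is an assumption), one can superpose a uniform regional flow with an extraction/injection pair and track upgradient particles to see which are captured:

    # Simplified 2-D illustration (assumed parameters; not the paper's 3-D solution):
    # superpose a uniform regional flow with an extraction/injection well pair and
    # track upgradient particles to see which reach the extraction well.
    import numpy as np

    b, q0, Q, d = 10.0, 0.05, 50.0, 20.0   # thickness (m), regional flux (m/d), pumping (m3/d), well spacing (m)
    wells = [((0.0, +d / 2), -Q), ((0.0, -d / 2), +Q)]   # (location, signed rate): extraction, injection

    def velocity(p):
        vx, vy = q0, 0.0                   # uniform regional flow in +x
        for (xw, yw), rate in wells:
            dx, dy = p[0] - xw, p[1] - yw
            r2 = dx * dx + dy * dy
            vx += rate / (2.0 * np.pi * b) * dx / r2
            vy += rate / (2.0 * np.pi * b) * dy / r2
        return np.array([vx, vy])

    def captured(y_start, x_start=-200.0, dt=0.5, steps=30000):
        p = np.array([x_start, y_start])
        for _ in range(steps):
            p = p + dt * velocity(p)
            if np.hypot(p[0], p[1] - d / 2) < 1.0:   # reached the extraction well
                return True
        return False

    for y0 in (0.0, 10.0, 30.0, 60.0):
        print(f"particle released at y = {y0:5.1f} m: captured = {captured(y0)}")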
Analytical Tools Interface for Landscape Assessments
Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to implemen...
SolarPILOT | Concentrating Solar Power | NREL
Unlike exclusively ray-tracing tools, SolarPILOT runs an analytical simulation engine alongside the SolTrace ray-tracing core, which is used for more detailed simulations.
Miller, Brian W.; Morisette, Jeffrey T.
2014-01-01
Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.
Lectindb: a plant lectin database.
Chandra, Nagasuma R; Kumar, Nirmal; Jeyakani, Justin; Singh, Desh Deepak; Gowda, Sharan B; Prathima, M N
2006-10-01
Lectins, a class of carbohydrate-binding proteins, are now widely recognized to play a range of crucial roles in many cell-cell recognition events triggering several important cellular processes. They encompass different members that are diverse in their sequences, structures, binding site architectures, quaternary structures, carbohydrate affinities, and specificities as well as their larger biological roles and potential applications. It is not surprising, therefore, that the vast amount of experimental data on lectins available in the literature is so diverse, that it becomes difficult and time consuming, if not impossible to comprehend the advances in various areas and obtain the maximum benefit. To achieve an effective use of all the data toward understanding the function and their possible applications, an organization of these seemingly independent data into a common framework is essential. An integrated knowledge base ( Lectindb, http://nscdb.bic.physics.iisc.ernet.in ) together with appropriate analytical tools has therefore been developed initially for plant lectins by collating and integrating diverse data. The database has been implemented using MySQL on a Linux platform and web-enabled using PERL-CGI and Java tools. Data for each lectin pertain to taxonomic, biochemical, domain architecture, molecular sequence, and structural details as well as carbohydrate and hence blood group specificities. Extensive links have also been provided for relevant bioinformatics resources and analytical tools. Availability of diverse data integrated into a common framework is expected to be of high value not only for basic studies in lectin biology but also for basic studies in pursuing several applications in biotechnology, immunology, and clinical practice, using these molecules.
Visual programming for next-generation sequencing data analytics.
Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia
2016-01-01
High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal developing environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the run of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.
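As a conceptual sketch of the modular dataflow architecture such a framework implies (the class, node names, and pipeline steps below are invented for illustration and do not correspond to an existing NGS library), a visual-programming front end could wire together nodes of this kind:

    # Conceptual sketch only (assumed names; not an existing NGS library): the kind
    # of dataflow node a visual-programming front end could wire together, keeping
    # each processing step modular and interoperable.
    from typing import Callable, Dict, List

    class Node:
        def __init__(self, name: str, func: Callable[..., object], inputs: List[str]):
            self.name, self.func, self.inputs = name, func, inputs

        def run(self, store: Dict[str, object]) -> None:
            store[self.name] = self.func(*(store[i] for i in self.inputs))

    # Toy pipeline: load reads -> filter by quality -> count.
    nodes = [
        Node("reads", lambda: [("ACGT", 30), ("NNNN", 5), ("GGCA", 28)], []),
        Node("filtered", lambda reads: [r for r in reads if r[1] >= 20], ["reads"]),
        Node("n_reads", lambda filtered: len(filtered), ["filtered"]),
    ]
    store: Dict[str, object] = {}
    for node in nodes:          # a visual editor would topologically sort the graph
        node.run(store)
    print(store["n_reads"])     # -> 2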
An improved multiple flame photometric detector for gas chromatography.
Clark, Adrian G; Thurbide, Kevin B
2015-11-20
An improved multiple flame photometric detector (mFPD) is introduced, based upon interconnecting fluidic channels within a planar stainless steel (SS) plate. Relative to the previous quartz tube mFPD prototype, the SS mFPD provides a 50% reduction in background emission levels, an orthogonal analytical flame, and easier more sensitive operation. As a result, sulfur response in the SS mFPD spans 4 orders of magnitude, yields a minimum detectable limit near 9×10(-12)gS/s, and has a selectivity approaching 10(4) over carbon. The device also exhibits exceptionally large resistance to hydrocarbon response quenching. Additionally, the SS mFPD uniquely allows analyte emission monitoring in the multiple worker flames for the first time. The findings suggest that this mode can potentially further improve upon the analytical flame response of sulfur (both linear HSO, and quadratic S2) and also phosphorus. Of note, the latter is nearly 20-fold stronger in S/N in the collective worker flames response and provides 6 orders of linearity with a detection limit of about 2.0×10(-13)gP/s. Overall, the results indicate that this new SS design notably improves the analytical performance of the mFPD and can provide a versatile and beneficial monitoring tool for gas chromatography. Copyright © 2015 Elsevier B.V. All rights reserved.
Wi, Seung Gon; Kim, Hyun Joo; Mahadevan, Shobana Arumugam; Yang, Duck-Joo; Bae, Hyeun-Jong
2009-12-01
Seaweed (Ceylon moss) possesses comparable bioenergy production potential to that of land plants. Ceylon moss has a high content of carbohydrates, typically galactose (23%) and glucose (20%). We have explored the use of sodium chlorite in Ceylon moss pretreatment to ultimately increase the efficiency of enzymatic saccharification. In an acidic medium, chlorite generates ClO(2) molecules that transform lignin into soluble compounds without any significant loss of carbohydrate content, and this procedure is widely used as an analytical method for holocellulose determination. Sodium chlorite-pretreated samples resulted in glucose yields of up to 70%, in contrast to only 5% obtained from non-pretreated samples. The efficiency of enzymatic hydrolysis is significantly improved by sodium chlorite pretreatment, and thus sodium chlorite pretreatment is potentially a very useful tool in the utilisation of Ceylon moss biomass for ethanol production or bioenergy purposes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hayden, D. W.
This project will develop an analytical tool to calculate performance of HMX-based PBXs in the skid test. The skid test is used as a means to measure sensitivity for large charges in handling situations. Each series of skid tests requires dozens of drops of large billets. It is proposed that the reaction (or lack of one) of PBXs in the skid test is governed by the mechanical properties of the binder. If true, one might be able to develop an analytical tool to estimate skid test behavior for new PBX formulations. Others over the past 50 years have tried to develop similar models. This project will research and summarize the works of others and couple the work of three of them into an analytical tool that can be run on a PC to calculate drop height of HMX-based PBXs. Detonation due to dropping a billet is argued to be a dynamic thermal event. To avoid detonation, the heat created due to friction at impact must be conducted into the charge or the target faster than the chemical kinetics can create additional energy. The methodology will involve numerically solving the Frank-Kamenetskii equation in one dimension. The analytical problem needs to be bounded in terms of how much heat is introduced to the billet and for how long. Assuming an inelastic collision with no rebound, the billet will be in contact with the target for a short duration determined by the equations of motion. For the purposes of the calculations, it will be assumed that if a detonation is to occur, it will transpire within that time. The surface temperature will be raised according to the friction created using the equations of motion of dropping the billet on a rigid surface. The study will connect the works of Charles Anderson, Alan Randolph, Larry Hatler, Alfonse Popolato, and Charles Mader into a single PC-based analytic tool. Anderson's equations of motion will be used to calculate the temperature rise upon impact, the time this temperature is maintained (contact time) will be obtained from the work of Hatler et al., and the reactive temperature rise will be obtained from Mader's work. Finally, the assessment of when a detonation occurs will be derived from Bowden and Yoffe's thermal explosion theory (hot spot).
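A minimal one-dimensional sketch of that approach might look like the following (all material constants, kinetics, and the contact time are placeholders rather than validated HMX/PBX data): the frictional surface temperature is imposed for the contact time, heat is conducted into the charge, an Arrhenius source term supplies reactive self-heating, and thermal runaway is flagged if the interior temperature diverges:

    # Minimal 1-D sketch of the described approach (all constants are placeholders,
    # not validated HMX/PBX data): impose the frictional surface temperature during
    # the contact time, conduct heat inward, add an Arrhenius self-heating source
    # (Frank-Kamenetskii-style), and flag thermal runaway.
    import numpy as np

    k, rho, cp = 0.4, 1800.0, 1100.0              # W/m-K, kg/m3, J/kg-K (assumed)
    Q, A, Ea, R = 5.0e6, 5.0e19, 2.2e5, 8.314     # J/kg, 1/s, J/mol (assumed kinetics)
    alpha = k / (rho * cp)

    nx, L = 200, 2.0e-3                           # 2 mm slab adjacent to the impact surface
    dx = L / nx
    dt = 0.2 * dx * dx / alpha                    # explicit stability margin
    T_surface, t_contact = 700.0, 2.0e-3          # assumed frictional temperature (K) and contact time (s)
    T = np.full(nx, 300.0)
    T[0] = T_surface

    t, runaway = 0.0, False
    while t < t_contact and not runaway:
        lap = np.zeros_like(T)
        lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
        source = (Q * A / cp) * np.exp(-Ea / (R * T))     # reactive self-heating, K/s
        T = T + dt * (alpha * lap + source)
        T[0] = T_surface                                  # heated surface during contact
        runaway = T[1:].max() > 1500.0                    # crude runaway criterion
        t += dt
    print("reaction (runaway) predicted within contact time:", runaway)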
Hean, Sarah; Willumsen, Elisabeth; Ødegård, Atle
2018-06-11
Purpose: Effective collaboration between mental health services (MHS) and criminal justice services (CJS) impacts on mental illness and reduces reoffending rates. This paper proposes the change laboratory model (CLM) of workplace transformation as a tool to support interagency collaborative practice, with the potential to complement current integration tools used in this context. The purpose of this paper is to focus specifically on the theoretical dimension of the model: the cultural historical activity systems theory (CHAT) as a theoretical perspective that offers a framework with which interactions between the MHS and CJS can be better understood. Design/methodology/approach: The structure and rationale behind future piloting of the change laboratory in this context is presented. CHAT is then briefly introduced and its utility illustrated in the presentation of the findings of a qualitative study of leaders from MHS and CJS that explored their perspectives on the characteristics of collaborative working between MHS and prison/probation services in a Norwegian context, using CHAT as an analytical framework. Findings: Leaders suggested that interactions between the two services, within the Norwegian system at least, are most salient when professionals engage in the reintegration and rehabilitation of the offender. Achieving effective communication within the boundary space between the two systems is a focus for professionals engaging in interagency working and this is mediated by a range of integration tools such as coordination plans and interagency meetings. Formalised interagency agreements and informal, unspoken norms of interaction governed this activity. Key challenges limiting the collaboration between the two systems included resource limitations, logistical issues and differences in professional judgments on referral and confidentiality. Originality/value: Current tools with which MHS/CJS interactions are understood and managed fail to make explicit the dimensions and nature of these complex interactions. The CLM, and CHAT as its theoretical underpinning, has been highly successful internationally and in other clinical contexts as a means of exploring and developing interagency working. It is a new idea in prison development, none as yet being applied to the challenges facing the MHS and CJS. This paper addresses this by illustrating the use of CHAT as an analytical framework with which to articulate MHS/CJS collaborations and the potential of the CLM more widely to address current challenges in a context-specific, bottom-up and fluid approach to interagency working in this environment.
Bio-TDS: bioscience query tool discovery system.
Gnimpieba, Etienne Z; VanDiermen, Menno S; Gustafson, Shayla M; Conn, Bill; Lushbough, Carol M
2017-01-04
Bioinformatics and computational biology play a critical role in bioscience and biomedical research. As researchers design their experimental projects, one major challenge is to find the most relevant bioinformatics toolkits that will lead to new knowledge discovery from their data. The Bio-TDS (Bioscience Query Tool Discovery Systems, http://biotds.org/) has been developed to assist researchers in retrieving the most applicable analytic tools by allowing them to formulate their questions as free text. The Bio-TDS is a flexible retrieval system that affords users from multiple bioscience domains (e.g. genomic, proteomic, bio-imaging) the ability to query over 12 000 analytic tool descriptions integrated from well-established, community repositories. One of the primary components of the Bio-TDS is the ontology and natural language processing workflow for annotation, curation, query processing, and evaluation. The Bio-TDS's scientific impact was evaluated using sample questions posed by researchers retrieved from Biostars, a site focusing on biological data analysis. The Bio-TDS was compared to five similar bioscience analytic tool retrieval systems, with the Bio-TDS outperforming the others in terms of relevance and completeness. The Bio-TDS offers researchers the capacity to associate their bioscience question with the most relevant computational toolsets required for the data analysis in their knowledge discovery process. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
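Bio-TDS itself relies on ontology-based annotation and a fuller natural language processing workflow; as a bare-bones illustration of ranking tool descriptions against a free-text question (the tool descriptions and query below are made up), a TF-IDF/cosine-similarity sketch might look like this:

    # Bare-bones illustration of free-text tool retrieval (made-up descriptions;
    # Bio-TDS uses ontology-based annotation and a richer NLP workflow).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    tools = {
        "ToolA": "align short reads to a reference genome",
        "ToolB": "call single nucleotide variants from aligned reads",
        "ToolC": "segment and quantify cells in fluorescence microscopy images",
    }
    query = "tool to call variants from aligned sequencing reads"

    vec = TfidfVectorizer(stop_words="english")
    matrix = vec.fit_transform(list(tools.values()) + [query])
    scores = cosine_similarity(matrix[len(tools)], matrix[: len(tools)]).ravel()
    for name, score in sorted(zip(tools, scores), key=lambda t: -t[1]):
        print(f"{name}: {score:.3f}")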
New tools for investigating student learning in upper-division electrostatics
NASA Astrophysics Data System (ADS)
Wilcox, Bethany R.
Student learning in upper-division physics courses is a growing area of research in the field of Physics Education. Developing effective new curricular materials and pedagogical techniques to improve student learning in upper-division courses requires knowledge of both what material students struggle with and what curricular approaches help to overcome these struggles. To facilitate the course transformation process for one specific content area --- upper-division electrostatics --- this thesis presents two new methodological tools: (1) an analytical framework designed to investigate students' struggles with the advanced physics content and mathematically sophisticated tools/techniques required at the junior and senior level, and (2) a new multiple-response conceptual assessment designed to measure student learning and assess the effectiveness of different curricular approaches. We first describe the development and theoretical grounding of a new analytical framework designed to characterize how students use mathematical tools and techniques during physics problem solving. We apply this framework to investigate student difficulties with three specific mathematical tools used in upper-division electrostatics: multivariable integration in the context of Coulomb's law, the Dirac delta function in the context of expressing volume charge densities, and separation of variables as a technique to solve Laplace's equation. We find a number of common themes in students' difficulties around these mathematical tools including: recognizing when a particular mathematical tool is appropriate for a given physics problem, mapping between the specific physical context and the formal mathematical structures, and reflecting spontaneously on the solution to a physics problem to gain physical insight or ensure consistency with expected results. We then describe the development of a novel, multiple-response version of an existing conceptual assessment in upper-division electrostatics courses. The goal of this new version is to provide an easily-graded electrostatics assessment that can potentially be implemented to investigate student learning on a large scale. We show that student performance on the new multiple-response version exhibits a significant degree of consistency with performance on the free-response version, and that it continues to provide significant insight into student reasoning and student difficulties. Moreover, we demonstrate that the new assessment is both valid and reliable using data from upper-division physics students at multiple institutions. Overall, the work described in this thesis represents a significant contribution to the methodological tools available to researchers and instructors interested in improving student learning at the upper-division level.
Big data analytics in healthcare: promise and potential.
Raghupathi, Wullianallur; Raghupathi, Viju
2014-01-01
To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however there remain challenges to overcome.
Measurements and Predictions for a Distributed Exhaust Nozzle
NASA Technical Reports Server (NTRS)
Kinzie, Kevin W.; Brown, Martha C.; Schein, David B.; Solomon, W. David, Jr.
2001-01-01
The acoustic and aerodynamic performance characteristics of a distributed exhaust nozzle (DEN) design concept were evaluated experimentally and analytically with the purpose of developing a design methodology for future DEN technology. Aerodynamic and acoustic measurements were made to evaluate the DEN performance and the CFD design tool. While the CFD approach did provide an excellent prediction of the flowfield and aerodynamic performance characteristics of the DEN and 2D reference nozzle, the measured acoustic suppression potential of this particular DEN was low. The measurements and predictions indicated that the mini-exhaust jets comprising the distributed exhaust coalesced back into a single stream jet very shortly after leaving the nozzles. Even so, the database provided here will be useful for future distributed exhaust designs with greater noise reduction and aerodynamic performance potential.
Trifiletti, Daniel M.; Showalter, Timothy N.
2015-01-01
Several advances in large data set collection and processing have the potential to provide a wave of new insights and improvements in the use of radiation therapy for cancer treatment. The era of electronic health records, genomics, and improving information technology resources creates the opportunity to leverage these developments to create a learning healthcare system that can rapidly deliver informative clinical evidence. By merging concepts from comparative effectiveness research with the tools and analytic approaches of “big data,” it is hoped that this union will accelerate discovery, improve evidence for decision making, and increase the availability of highly relevant, personalized information. This combination offers the potential to provide data and analysis that can be leveraged for ultra-personalized medicine and high-quality, cutting-edge radiation therapy. PMID:26697409
Manipulability, force, and compliance analysis for planar continuum manipulators
NASA Technical Reports Server (NTRS)
Gravagne, Ian A.; Walker, Ian D.
2002-01-01
Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.
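For a rigid-link analogue of these ellipsoids (the paper derives the corresponding quantities for continuum robots; the planar two-link Jacobian below is an assumed example, not the paper's formulation), the manipulability and force ellipsoids follow from the singular values of the Jacobian:

    # Standard rigid-link illustration (assumed planar 2-link arm): manipulability
    # and force ellipsoids from the singular values of the Jacobian J.
    import numpy as np

    l1 = l2 = 1.0
    q1, q2 = np.deg2rad(30.0), np.deg2rad(45.0)
    J = np.array([
        [-l1 * np.sin(q1) - l2 * np.sin(q1 + q2), -l2 * np.sin(q1 + q2)],
        [ l1 * np.cos(q1) + l2 * np.cos(q1 + q2),  l2 * np.cos(q1 + q2)],
    ])

    U, s, _ = np.linalg.svd(J)
    print("velocity (manipulability) ellipsoid semi-axes:", s)   # axis directions are the columns of U
    print("force ellipsoid semi-axes:", 1.0 / s)                 # reciprocal lengths along the same axes
    print("Yoshikawa manipulability measure w =", np.prod(s))    # equals sqrt(det(J @ J.T))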
Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.
2016-01-01
A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
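A toy example of the underlying benefit (not Pycycle or OpenMDAO code; the objective function and its gradient are invented stand-ins) contrasts supplying an analytic gradient to a gradient-based optimizer with letting it fall back on finite-difference approximations:

    # Toy illustration (not Pycycle/OpenMDAO): analytic gradient versus
    # finite-difference approximation in a gradient-based optimization.
    import numpy as np
    from scipy.optimize import minimize

    def f(x):                      # smooth objective standing in for a cycle model
        return (x[0] - 2.0) ** 2 + 10.0 * (x[1] + 1.0) ** 2 + np.exp(0.1 * x[0] * x[1])

    def grad_f(x):                 # exact (analytic) derivatives of f
        e = np.exp(0.1 * x[0] * x[1])
        return np.array([2.0 * (x[0] - 2.0) + 0.1 * x[1] * e,
                         20.0 * (x[1] + 1.0) + 0.1 * x[0] * e])

    x0 = np.zeros(2)
    with_analytic = minimize(f, x0, jac=grad_f, method="BFGS")
    with_fd = minimize(f, x0, method="BFGS")    # gradient approximated by finite differences
    print("analytic:   ", with_analytic.nfev, "function evaluations ->", with_analytic.x)
    print("finite-diff:", with_fd.nfev, "function evaluations ->", with_fd.x)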
Data and Tools | Concentrating Solar Power | NREL
Solar Power tower Integrated Layout and Optimization Tool (SolarPILOT(tm)): SolarPILOT combines the rapid layout and optimization capability of the analytical DELSOL3 program with the accuracy of a ray-tracing engine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etmektzoglou, A; Mishra, P; Svatos, M
Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly translate research ideas into machine readable scripts without programming knowledge. As an open source initiative, it also enables researcher collaboration on future developments. I am a full time employee at Varian Medical Systems, Palo Alto, California.
Training the next generation analyst using red cell analytics
NASA Astrophysics Data System (ADS)
Graham, Meghan N.; Graham, Jacob L.
2016-05-01
We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized by the solution but by the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the area of security and risk and intelligence training.
Thermostatistical description of gas mixtures from space partitions
NASA Astrophysics Data System (ADS)
Rohrmann, R. D.; Zorec, J.
2006-10-01
The new mathematical framework based on the free energy of pure classical fluids presented by Rohrmann [Physica A 347, 221 (2005)] is extended to multicomponent systems to determine thermodynamic and structural properties of chemically complex fluids. Presently, the theory focuses on D-dimensional mixtures in the low-density limit (packing factor η<0.01). The formalism combines the free-energy minimization technique with space partitions that assign an available volume v to each particle. v is related to the closeness of the nearest neighbor and provides a useful tool to evaluate the perturbations experienced by particles in a fluid. The theory shows a close relationship between statistical geometry and statistical mechanics. New, unconventional thermodynamic variables and mathematical identities are derived as a result of the space division. Thermodynamic potentials μ(il), conjugate variables of the populations N(il) of particles of class i whose nearest neighbors are of class l, are defined and their relationships with the usual chemical potentials μ(i) are established. Systems of hard spheres are treated as illustrative examples and their thermodynamic functions are derived analytically. The low-density expressions obtained agree nicely with those of scaled-particle theory and the Percus-Yevick approximation. Several pair distribution functions are introduced and evaluated. Analytical expressions are also presented for hard spheres with attractive forces due to Kac-tails and square-well potentials. Finally, we derive general chemical equilibrium conditions.
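For orientation on the link between the available volume v and the nearest-neighbor distance, the standard low-density (Hertz) result for an ideal D-dimensional gas of number density n is, in LaTeX notation (a textbook result quoted here for context, not the paper's derivation):

    P_D(r)\,dr = n\,D\,C_D\,r^{D-1}\exp\!\left(-n\,C_D\,r^{D}\right)dr, \qquad C_D r^{D} = \text{volume of a $D$-ball of radius } r \ \ (C_3 = 4\pi/3).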
Buonfiglio, Marzia; Toscano, M; Puledda, F; Avanzini, G; Di Clemente, L; Di Sabato, F; Di Piero, V
2015-03-01
Habituation is considered one of the most basic mechanisms of learning. A habituation deficit to several sensory stimulations has been defined as a trait of the migraine brain and has also been observed in other disorders. On the other hand, an analytic information processing style is characterized by the habit of continually evaluating stimuli, and it has been associated with migraine. We investigated a possible correlation between lack of habituation of visual evoked potentials and analytic cognitive style in healthy subjects. According to the Sternberg-Wagner self-assessment inventory, 15 healthy volunteers (HV) with a high analytic score and 15 HV with a high global score were recruited. Both groups underwent visual evoked potential recordings after psychological evaluation. We observed a significant lack of habituation in analytical individuals compared to the global group. In conclusion, a reduced habituation of visual evoked potentials has been observed in analytic subjects. Our results suggest that further research should be undertaken regarding the relationship between analytic cognitive style and lack of habituation in both physiological and pathophysiological conditions.
Mixed Initiative Visual Analytics Using Task-Driven Recommendations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Kristin A.; Cramer, Nicholas O.; Israel, David
2015-12-07
Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.
Patel, Bhumit A; Pinto, Nuno D S; Gospodarek, Adrian; Kilgore, Bruce; Goswami, Kudrat; Napoli, William N; Desai, Jayesh; Heo, Jun H; Panzera, Dominick; Pollard, David; Richardson, Daisy; Brower, Mark; Richardson, Douglas D
2017-11-07
Combining process analytical technology (PAT) with continuous production provides a powerful tool to observe and control monoclonal antibody (mAb) fermentation and purification processes. This work demonstrates on-line liquid chromatography (on-line LC) as a PAT tool for monitoring a continuous biologics process and forced degradation studies. Specifically, this work focused on ion exchange chromatography (IEX), which is a critical separation technique to detect charge variants. Product-related impurities, including charge variants, that impact function are classified as critical quality attributes (CQAs). First, we confirmed no significant differences were observed in the charge heterogeneity profile of a mAb through both at-line and on-line sampling and that the on-line method has the ability to rapidly detect changes in protein quality over time. The robustness and versatility of the PAT methods were tested by sampling from two purification locations in a continuous mAb process. The PAT IEX methods used with on-line LC were a weak cation exchange (WCX) separation and a newly developed shorter strong cation exchange (SCX) assay. Both methods provided similar results with the distribution of percent acidic, main, and basic species remaining unchanged over a 2 week period. Second, a forced degradation study showed an increase in acidic species and a decrease in basic species when sampled on-line over 7 days. These applications further strengthen the use of on-line LC to monitor CQAs of a mAb continuously with various PAT IEX analytical methods. Implementation of on-line IEX will enable faster decision making during process development and could potentially be applied to control in biomanufacturing.
NASA Technical Reports Server (NTRS)
Hiltner, Dale W.
2000-01-01
This report presents the assessment of an analytical tool developed as part of the NASA/FAA Tailplane Icing Program. The analytical tool is a specialized simulation program called TAILSIM, which was developed to model the effects of tailplane icing on the flight dynamics of the Twin Otter Icing Research Aircraft. This report compares the responses of the TAILSIM program directly to flight test data. The comparisons should be useful to potential users of TAILSIM. The comparisons show that the TAILSIM program qualitatively duplicates the flight test aircraft response during maneuvers with ice on the tailplane. TAILSIM is shown to be quantitatively "in the ballpark" in predicting when Ice Contaminated Tailplane Stall will occur during pushover and thrust transition maneuvers. As such, TAILSIM proved its usefulness to the flight test program by providing a general indication of the aircraft configuration and flight conditions of concern. The aircraft dynamics are shown to be modeled correctly by the equations of motion used in TAILSIM. However, the general accuracy of the TAILSIM responses is shown to be less than desired primarily due to inaccuracies in the aircraft database. The high sensitivity of the TAILSIM program responses to small changes in load factor command input is also shown to be a factor in the accuracy of the responses. A pilot model is shown to allow TAILSIM to produce more accurate responses and contribute significantly to the usefulness of the program. Suggestions to improve the accuracy of the TAILSIM responses are to further refine the database representation of the aircraft aerodynamics and tailplane flowfield and to explore a more realistic definition of the pilot model.
Mahmoud, Bahaa G; Khairy, Mohamed; Rashwan, Farouk A; Banks, Craig E
2017-02-07
To address the recent outbreaks of drug-related hepatotoxicity, a new analytical tool for the continuous determination of these drugs in human fluids is required. Electrochemical-based analytical methods offer an effective, rapid, and simple tool for on-site determination of various organic and inorganic species. However, the design of a sensitive, selective, stable, and reproducible sensor is still a major challenge. In the present manuscript, a facile, one-pot hydrothermal synthesis of bismuth oxide (Bi2O2.33) nanostructures (nanorods) was developed. These BiO nanorods were cast onto mass disposable graphite screen-printed electrodes (BiO-SPEs), allowing the ultrasensitive determination of acetaminophen (APAP) in the presence of its common interference isoniazid (INH), which are both found in drug samples. The simultaneous electroanalytical sensing using BiO-SPEs exhibited strong electrocatalytic activity toward the sensing of APAP and INH with an enhanced analytical signal (voltammetric peak) over that achievable at unmodified (bare) SPEs. The electroanalytical sensing of APAP and INH is possible with accessible linear ranges from 0.5 to 1250 μM and 5 to 1760 μM with limits of detection (3σ) of 30 nM and 1.85 μM, respectively. The stability, reproducibility, and repeatability of BiO-SPEs were also investigated. The BiO-SPEs were evaluated toward the sensing of APAP and INH in human serum, urine, saliva, and tablet samples. The results presented in this paper demonstrate that BiO-SPE sensing platforms provide a potential candidate for the accurate determination of APAP and INH within human fluids and pharmaceutical formulations.
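The 3σ detection-limit convention cited above reduces to a one-line calculation once a calibration slope and blank noise are available (the numbers below are made up, not the paper's data):

    # Illustration of the 3-sigma detection-limit convention, with made-up
    # calibration data (not the paper's measurements).
    import numpy as np

    conc = np.array([0.5, 5, 50, 250, 500, 1000])          # micromolar standards (assumed)
    peak = np.array([0.031, 0.30, 3.1, 15.2, 30.5, 60.8])  # peak currents, microamps (assumed)
    blank_sd = 0.006                                        # std. dev. of repeated blank signals (assumed)

    slope, intercept = np.polyfit(conc, peak, 1)
    lod = 3.0 * blank_sd / slope                            # limit of detection = 3*sigma/slope
    print(f"sensitivity = {slope:.4f} uA/uM, LOD = {lod:.3f} uM")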
ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package
NASA Astrophysics Data System (ADS)
Jaggi, S.
1993-02-01
The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs, analysis, and establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package called 'Analytical Tools for Thermal InfraRed Engineering' - ATTIRE, simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD) etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.
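One of the system-level relations such a package evaluates can be sketched as follows (the band limits and noise-equivalent radiance below are assumptions): NETD is the noise-equivalent radiance divided by the derivative of the band-integrated Planck radiance with respect to scene temperature:

    # Hedged sketch of a single system-level relation (assumed band and NER):
    # NETD = NER / (dL/dT), where L is the band-integrated Planck radiance.
    import numpy as np

    h, c, k = 6.626e-34, 2.998e8, 1.381e-23

    def band_radiance(T, lam_lo=8e-6, lam_hi=12e-6, n=2000):
        lam = np.linspace(lam_lo, lam_hi, n)
        spectral = 2 * h * c**2 / lam**5 / (np.exp(h * c / (lam * k * T)) - 1.0)  # W m^-2 sr^-1 m^-1
        return spectral.sum() * (lam[1] - lam[0])                                 # W m^-2 sr^-1

    T, NER = 300.0, 3.0e-2     # scene temperature (K) and noise-equivalent radiance (assumed)
    dL_dT = band_radiance(T + 0.5) - band_radiance(T - 0.5)                       # per 1 K
    print(f"dL/dT = {dL_dT:.3e} W m^-2 sr^-1 K^-1, NETD = {NER / dL_dT * 1e3:.1f} mK")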
NASA Technical Reports Server (NTRS)
Kempler, Steve; Mathews, Tiffany
2016-01-01
The continuum of ever-evolving data management systems affords great opportunities to the enhancement of knowledge and facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. As a result of the void of Earth science data analytics publication material, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of tools and techniques that are available and still needed to support ESDA.
Advancements in nano-enabled therapeutics for neuroHIV management.
Kaushik, Ajeet; Jayant, Rahul Dev; Nair, Madhavan
This viewpoint is a global call to promote fundamental and applied research aiming toward designing smart nanocarriers of desired properties, novel noninvasive strategies to open the blood-brain barrier (BBB), delivery/release of single/multiple therapeutic agents across the BBB to eradicate neuroHIV (human immunodeficiency virus, HIV, infection of the nervous system), strategies for on-demand site-specific release of antiretroviral therapy, developing novel nanoformulations capable of recognizing and eradicating latently infected HIV reservoirs, and developing novel smart analytical diagnostic tools to detect and monitor HIV infection. Thus, investigation of novel nanoformulations, methodologies for site-specific delivery/release, analytical methods, and diagnostic tools would be of high significance to eradicate and monitor neuro-acquired immunodeficiency syndrome. Overall, these developments will certainly help to develop personalized nanomedicines to cure HIV and to develop smart HIV-monitoring analytical systems for disease management.
Physics of cosmological cascades and observable properties
NASA Astrophysics Data System (ADS)
Fitoussi, T.; Belmont, R.; Malzac, J.; Marcowith, A.; Cohen-Tanugi, J.; Jean, P.
2017-04-01
TeV photons from extragalactic sources are absorbed in the intergalactic medium and initiate electromagnetic cascades. These cascades offer a unique tool to probe the properties of the universe at cosmological scales. We present a new Monte Carlo code dedicated to the physics of such cascades. This code has been tested against both published results and analytical approximations, and is made publicly available. Using this numerical tool, we investigate the main cascade properties (spectrum, halo extension and time delays), and study in detail their dependence on the physical parameters (extragalactic magnetic field, extragalactic background light, source redshift, source spectrum and beaming emission). The limitations of analytical solutions are emphasized. In particular, analytical approximations account only for the first generation of photons and higher branches of the cascade tree are neglected.
Schmitt, J Eric; Scanlon, Mary H; Servaes, Sabah; Levin, Dayna; Cook, Tessa S
2015-10-01
The advent of the ACGME's Next Accreditation System represents a significant new challenge for residencies and fellowships, owing to its requirements for more complex and detailed information. We developed a system of online assessment tools to provide comprehensive coverage of the twelve ACGME Milestones and digitized them using freely available cloud-based productivity tools. These tools include a combination of point-of-care procedural assessments, electronic quizzes, online modules, and other data entry forms. Using free statistical analytic tools, we also developed an automated system for management, processing, and data reporting. After one year of use, our Milestones project has resulted in the submission of over 20,000 individual data points. The use of automated statistical methods to generate resident-specific profiles has allowed for dynamic reports of individual residents' progress. These profiles both summarize data and also allow program directors access to more granular information as needed. Informatics-driven strategies for data assessment and processing represent feasible solutions to Milestones assessment and analysis, reducing the potential administrative burden for program directors, residents, and staff. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
Visualizing statistical significance of disease clusters using cartograms.
Kronenfeld, Barry J; Wong, David W S
2017-05-15
Health officials and epidemiological researchers often use maps of disease rates to identify potential disease clusters. Because these maps exaggerate the prominence of low-density districts and hide potential clusters in urban (high-density) areas, many researchers have used density-equalizing maps (cartograms) as a basis for epidemiological mapping. However, there are no existing guidelines for the visual assessment of statistical uncertainty on such maps. To address this shortcoming, we develop techniques for visual determination of the statistical significance of clusters spanning one or more districts on a cartogram. We developed the techniques within a geovisual analytics framework that does not rely on automated significance testing, and can therefore facilitate visual analysis to detect clusters that automated techniques might miss. On a cartogram of the at-risk population, the statistical significance of a disease cluster can be determined from the rate, area and shape of the cluster under standard hypothesis testing scenarios. We develop formulae to determine, for a given rate, the area required for statistical significance of a priori and a posteriori designated regions under certain test assumptions. Uniquely, our approach enables dynamic inference of aggregate regions formed by combining individual districts. The method is implemented in interactive tools that provide choropleth mapping, automated legend construction and dynamic search tools to facilitate cluster detection and assessment of the validity of tested assumptions. A case study of leukemia incidence analysis in California demonstrates the ability to visually distinguish between statistically significant and insignificant regions. The proposed geovisual analytics approach enables intuitive visual assessment of the statistical significance of arbitrarily defined regions on a cartogram. Our research prompts a broader discussion of the role of geovisual exploratory analyses in disease mapping and the appropriate framework for visually assessing the statistical significance of spatial clusters.
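Under a standard one-sided Poisson testing assumption, the flavor of such area-for-significance reasoning can be sketched as follows; this is an illustration of the general idea, not the authors' published formulae, and the rates used are invented. On a population cartogram, area is proportional to the at-risk population, so for a candidate cluster running at a given elevated rate one can solve for the smallest at-risk population whose case count would still be significant against the baseline rate.

```python
from scipy.stats import poisson

def p_value(observed_cases, population, baseline_rate):
    """One-sided Poisson p-value for seeing at least `observed_cases`
    when the expected count is population * baseline_rate."""
    expected = population * baseline_rate
    return poisson.sf(observed_cases - 1, expected)

def min_population_for_significance(cluster_rate, baseline_rate,
                                    alpha=0.05, max_pop=2_000_000):
    """Smallest at-risk population (proportional to cartogram area) at which
    a cluster running at `cluster_rate` is significant at level `alpha`."""
    for pop in range(1_000, max_pop, 1_000):
        observed = round(pop * cluster_rate)
        if p_value(observed, pop, baseline_rate) < alpha:
            return pop
    return None

if __name__ == "__main__":
    baseline = 5e-5          # assumed baseline incidence (cases per person-year)
    elevated = 1e-4          # assumed cluster rate, twice the baseline
    pop = min_population_for_significance(elevated, baseline)
    print(f"Cluster at 2x baseline becomes significant at ~{pop:,} at-risk persons")
```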
We describe the development and implementation of a Physiological and Anatomical Visual Analytics tool (PAVA), a web browser-based application, used to visualize experimental/simulated chemical time-course data (dosimetry), epidemiological data and Physiologically-Annotated Data ...
Evaluating the compatibility of multi-functional and intensive urban land uses
NASA Astrophysics Data System (ADS)
Taleai, M.; Sharifi, A.; Sliuzas, R.; Mesgari, M.
2007-12-01
This research is aimed at developing a model for assessing land use compatibility in densely built-up urban areas. In this process, a new model was developed by combining a suite of existing methods and tools: geographic information systems, Delphi methods, and spatial decision support tools, namely multi-criteria evaluation analysis, the analytic hierarchy process, and the ordered weighted average method. The developed model can calculate land use compatibility in both horizontal and vertical directions. Furthermore, the compatibility between the use of each floor in a building and its neighboring land uses can be evaluated. The method was tested in a built-up urban area in Tehran, the capital city of Iran. The results show that the model is robust in clarifying different levels of physical compatibility between neighboring land uses. This paper describes the various steps and processes of developing the proposed land use compatibility evaluation model (CEM).
Fabrication of shape-memory thin films and effects of ion irradiation
NASA Astrophysics Data System (ADS)
Goldberg, Florent
1998-09-01
Nickel and titanium, when combined in the right stoichiometric proportion (1:1), can form alloys showing the shape memory effect. Within the scope of this thesis, thin films of such alloys were successfully produced by sputtering. Precise control of composition is crucial for obtaining the shape memory effect, and a combination of analytical tools that can accurately determine the behavior of such materials is also required (calorimetric analysis, crystallography, composition analysis, etc.). Rutherford backscattering spectrometry was used for quantitative composition analysis. Irradiation of the films with light ions (He+) of a few MeV was then shown to lower the characteristic premartensitic transformation temperatures while preserving the shape memory effect. These results open the door to a new field of research, particularly for ion irradiation and its potential use as a tool to modify the thermomechanical behavior of shape-memory thin-film actuators.
Interpreting short tandem repeat variations in humans using mutational constraint
Gymrek, Melissa; Willems, Thomas; Reich, David; Erlich, Yaniv
2017-01-01
Identifying regions of the genome that are depleted of mutations can reveal potentially deleterious variants. Short tandem repeats (STRs), also known as microsatellites, are among the largest contributors of de novo mutations in humans. However, per-locus studies of STR mutations have been limited to highly ascertained panels of several dozen loci. Here, we harnessed bioinformatics tools and a novel analytical framework to estimate mutation parameters for each STR in the human genome by correlating STR genotypes with local sequence heterozygosity. We applied our method to obtain robust estimates of the impact of local sequence features on mutation parameters and used this to create a framework for measuring constraint at STRs by comparing observed vs. expected mutation rates. Constraint scores identified known pathogenic variants with early onset effects. Our metric will provide a valuable tool for prioritizing pathogenic STRs in medical genetics studies. PMID:28892063
A spectral Poisson solver for kinetic plasma simulation
NASA Astrophysics Data System (ADS)
Szeremley, Daniel; Obberath, Jens; Brinkmann, Ralf
2011-10-01
Plasma resonance spectroscopy is a well-established plasma diagnostic method, realized in several designs. One of these designs is the multipole resonance probe (MRP). In its idealized, geometrically simplified version, it consists of two dielectrically shielded, hemispherical electrodes to which an RF signal is applied. A numerical tool is under development that is capable of simulating the dynamics of the plasma surrounding the MRP in the electrostatic approximation. In this contribution we concentrate on the specialized Poisson solver for that tool. The plasma is represented by an ensemble of point charges. By expanding both the charge density and the potential into spherical harmonics, a largely analytical solution of the Poisson problem can be employed. For a practical implementation, the expansion must be appropriately truncated. With this spectral solver we are able to efficiently solve the Poisson equation in a kinetic plasma simulation without the need to introduce a spatial discretization.
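A minimal sketch of the spectral idea, assuming bare point charges in vacuum and ignoring the dielectric shield and electrode boundary conditions of the actual MRP solver: project the sources onto spherical harmonics and sum the truncated interior/exterior multipole contributions to evaluate the potential at a field point.

```python
import numpy as np
from scipy.special import sph_harm

def potential(field_point_sph, charges, l_max=8):
    """Electrostatic potential (Gaussian units) at (r, theta, phi) from point
    charges given as (q, r_i, theta_i, phi_i), via a truncated spherical-
    harmonic (multipole) expansion. theta = polar angle, phi = azimuth."""
    r, theta, phi = field_point_sph
    total = 0.0
    for l in range(l_max + 1):
        for m in range(-l, l + 1):
            # scipy's sph_harm argument order is (m, l, azimuth, polar)
            y_field = sph_harm(m, l, phi, theta)
            coeff = 0.0
            for q, ri, ti, pi_ in charges:
                y_src = np.conj(sph_harm(m, l, pi_, ti))
                if ri <= r:                      # interior sources
                    radial = ri**l / r**(l + 1)
                else:                            # exterior sources
                    radial = r**l / ri**(l + 1)
                coeff += q * y_src * radial
            total += (4 * np.pi / (2 * l + 1)) * coeff * y_field
    return total.real

if __name__ == "__main__":
    # Single unit charge at radius 0.5 on the z-axis; evaluate on the z-axis at r = 2
    charges = [(1.0, 0.5, 0.0, 0.0)]
    v = potential((2.0, 0.0, 0.0), charges, l_max=12)
    print(f"expansion: {v:.5f}   exact 1/|r - r'|: {1.0 / 1.5:.5f}")
```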
Growing trend of CE at the omics level: the frontier of systems biology--an update.
Ban, Eunmi; Park, Soo Hyun; Kang, Min-Jung; Lee, Hyun-Jung; Song, Eun Joo; Yoo, Young Sook
2012-01-01
Omics is the study of proteins, peptides, genes, and metabolites in living organisms. Systems biology aims to understand a system through the study of the relationships between its elements, such as genes and proteins, in a biological system. Systems biology emerged as a result of the advanced development of high-throughput analysis technologies for omics studies, such as DNA sequencers, DNA arrays, and mass spectrometry. Among a number of analytical tools and technologies, CE and CE coupled to MS are promising and rapidly developing tools with the potential to provide qualitative and quantitative analyses of biological molecules. With an emphasis on CE for systems biology, this review summarizes method developments and applications of CE for genomic, transcriptomic, proteomic, and metabolomic studies since 2009, focusing on drug discovery, disease diagnosis, and therapies. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Bioinformatics tools in predictive ecology: applications to fisheries.
Tucker, Allan; Duplisea, Daniel
2012-01-19
There has been a huge effort in the advancement of analytical techniques for molecular biological data over the past decade. This has led to many novel algorithms that are specialized to deal with data associated with biological phenomena, such as gene expression and protein interactions. In contrast, ecological data analysis has remained focused to some degree on off-the-shelf statistical techniques though this is starting to change with the adoption of state-of-the-art methods, where few assumptions can be made about the data and a more explorative approach is required, for example, through the use of Bayesian networks. In this paper, some novel bioinformatics tools for microarray data are discussed along with their 'crossover potential' with an application to fisheries data. In particular, a focus is made on the development of models that identify functionally equivalent species in different fish communities with the aim of predicting functional collapse.
Analytical Model-Based Design Optimization of a Transverse Flux Machine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz
This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time-efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range, subject to geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool saves at least 50% of the computation time compared to commercial FEA-based optimization programs, with results found to be in agreement to within 5% error.
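The coupling described here, an analytical objective evaluated inside a particle swarm, can be sketched generically as below; the torque-density function is a smooth stand-in placeholder rather than the paper's magnetic equivalent circuit, and the design-variable bounds are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Design variables and assumed geometric bounds (illustrative, not the paper's):
# [stator pole length, magnet length, rotor thickness] in mm
BOUNDS = np.array([[5.0, 30.0], [2.0, 15.0], [3.0, 20.0]])

def torque_density(x):
    """Placeholder for the MEC evaluation: a smooth function with an interior
    optimum so the PSO has something to climb."""
    pole, magnet, rotor = x
    return -((pole - 18.0)**2 + 2.0 * (magnet - 8.0)**2 + (rotor - 10.0)**2)

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
    dim = len(bounds)
    pos = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
    gbest = pbest[pbest_val.argmax()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, bounds[:, 0], bounds[:, 1])   # geometric constraints
        vals = np.array([objective(p) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmax()].copy()
    return gbest, pbest_val.max()

if __name__ == "__main__":
    best_x, best_val = pso(torque_density, BOUNDS)
    print("best design [pole, magnet, rotor] =", np.round(best_x, 2))
```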
Werber, D; Bernard, H
2014-02-27
Thousands of infectious food-borne disease outbreaks (FBDO) are reported annually to the European Food Safety Authority within the framework of the zoonoses Directive (2003/99/EC). Most recognised FBDO occur locally following point-source exposure, but only a few are investigated using analytical epidemiological studies. In Germany, and probably also in other countries of the European Union, this seems to be particularly true for those investigated by local health authorities. Analytical studies, usually cohort or case–control studies, are a powerful tool to identify suspect food vehicles. Therefore, from a public health and food safety perspective, their more frequent use is highly desirable. We have developed a small toolbox consisting of a strategic concept and a simple software tool for data entry and analysis, with the objective of increasing the use of analytical studies in the investigation of local point-source FBDO in Germany.
Electrochemical Enzyme Biosensors Revisited: Old Solutions for New Problems.
Monteiro, Tiago; Almeida, Maria Gabriela
2018-05-14
Worldwide legislation is driving the development of novel and highly efficient analytical tools for assessing the composition of every material that interacts with consumers or nature. Biosensor technology is one of the most active R&D domains of the analytical sciences focused on the challenge of taking analytical chemistry to the field. Electrochemical biosensors based on redox enzymes, in particular, are highly appealing due to their usually quick response, high selectivity and sensitivity, low cost and portable dimensions. This review paper aims to provide an overview of the most important advances made in the field since the proposal of the first biosensor, the well-known hand-held glucose meter. The first section addresses the current needs and challenges for novel analytical tools, followed by a brief description of the different components and configurations of biosensing devices and the fundamentals of enzyme kinetics and amperometry. The following sections focus on enzyme-based amperometric biosensors and the different stages of their development.
Multivariable Hermite polynomials and phase-space dynamics
NASA Technical Reports Server (NTRS)
Dattoli, G.; Torre, Amalia; Lorenzutta, S.; Maino, G.; Chiccoli, C.
1994-01-01
The phase-space approach to classical and quantum systems demands advanced analytical tools. Such an approach characterizes the evolution of a physical system through a set of variables that reduce to the canonically conjugate variables in the classical limit. It often happens that phase-space distributions can be written in terms of quadratic forms involving these variables. A significant analytical tool for treating such problems comes from the generalized many-variable Hermite polynomials, defined on quadratic forms in R^n. They form an orthonormal system in many dimensions and seem to be the natural tool for treating harmonic oscillator dynamics in phase space. In this contribution we discuss the properties of these polynomials and present some applications to physical problems.
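For orientation, one commonly quoted low-order member of this family is the two-variable Hermite (Kampé de Fériet) polynomial; the definitions below are standard ones given for illustration and are not necessarily the exact many-variable form used in the paper.

```latex
% Two-variable Hermite (Kampe de Feriet) polynomials -- illustrative definition
\[
  e^{xt + yt^{2}} \;=\; \sum_{n=0}^{\infty} H_{n}(x,y)\,\frac{t^{n}}{n!},
  \qquad
  H_{n}(x,y) \;=\; n!\sum_{k=0}^{\lfloor n/2\rfloor}\frac{x^{\,n-2k}\,y^{k}}{(n-2k)!\,k!}.
\]
```

The ordinary Hermite polynomials are recovered as H_n(x) = H_n(2x, -1), and the many-variable generalization replaces the scalar term yt^2 by a quadratic form in R^n.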
Kumar, B. Vinodh; Mohan, Thuthi
2018-01-01
OBJECTIVE: Six Sigma is one of the most popular quality management system tools employed for process improvement. Six Sigma methods are usually applied when the outcome of a process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metrics for individual parameters, and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a secondary care government hospital, Chennai. The data obtained for the study are the IQC coefficient of variation percentage and the External Quality Assurance Scheme (EQAS) bias percentage for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma; for the level 2 IQC, the same four analytes of level 1 showed a performance of ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For all analytes below 6 sigma, the quality goal index (QGI) was <0.8, indicating that the area requiring improvement was imprecision, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. CONCLUSION: This study shows that the sigma metric is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587
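The sigma metric and quality goal index used in studies of this kind are conventionally computed as sigma = (TEa - |bias|) / CV and QGI = |bias| / (1.5 x CV); the sketch below applies those textbook formulas to hypothetical numbers, not to the study's data.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (allowable total error - |bias|) / CV, all expressed in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    """QGI < 0.8 suggests imprecision, > 1.2 suggests inaccuracy, otherwise both."""
    return abs(bias_pct) / (1.5 * cv_pct)

if __name__ == "__main__":
    # Hypothetical analytes: (name, TEa %, bias %, CV %); not the study's data
    analytes = [("ALP", 30.0, 2.0, 4.0),
                ("Urea", 9.0, 3.0, 5.0),
                ("Cholesterol", 9.0, 6.0, 3.0)]
    for name, tea, bias, cv in analytes:
        s = sigma_metric(tea, bias, cv)
        q = quality_goal_index(bias, cv)
        print(f"{name:12s} sigma = {s:4.1f}   QGI = {q:4.2f}")
```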
Tool to Prioritize Energy Efficiency Investments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farese, P.; Gelman, R.; Hendron, R.
2012-08-01
To provide analytic support to the U.S. Department of Energy's Building Technologies Program (BTP), NREL developed a Microsoft Excel-based tool that provides an open and objective comparison of the hundreds of investment opportunities available to BTP. This tool uses established methodologies to evaluate the energy savings and the cost of those savings.
Tools for Educational Data Mining: A Review
ERIC Educational Resources Information Center
Slater, Stefan; Joksimovic, Srecko; Kovanovic, Vitomir; Baker, Ryan S.; Gasevic, Dragan
2017-01-01
In recent years, a wide array of tools have emerged for the purposes of conducting educational data mining (EDM) and/or learning analytics (LA) research. In this article, we hope to highlight some of the most widely used, most accessible, and most powerful tools available for the researcher interested in conducting EDM/LA research. We will…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Tengfang; Flapper, Joris; Ke, Jing
The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.
Predictive Data Tools Find Uses in Schools
ERIC Educational Resources Information Center
Sparks, Sarah D.
2011-01-01
The use of analytic tools to predict student performance is exploding in higher education, and experts say the tools show even more promise for K-12 schools, in everything from teacher placement to dropout prevention. Use of such statistical techniques is hindered in precollegiate schools, however, by a lack of researchers trained to help…
A consumer guide: tools to manage vegetation and fuels.
David L. Peterson; Louisa Evers; Rebecca A. Gravenmier; Ellen Eberhardt
2007-01-01
Current efforts to improve the scientific basis for fire management on public lands will benefit from more efficient transfer of technical information and tools that support planning, implementation, and effectiveness of vegetation and hazardous fuel treatments. The technical scope, complexity, and relevant spatial scale of analytical and decision support tools differ...
Insights from Smart Meters: The Potential for Peak-Hour Savings from Behavior-Based Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Todd, Annika; Perry, Michael; Smith, Brian
The rollout of smart meters in the last several years has opened up new forms of previously unavailable energy data. Many utilities can now capture, in real time, granular household-level interval usage data at very high frequency for a large proportion of their residential and small commercial customers. These data can be linked to other time- and location-specific information, providing vast, constantly growing streams of rich data (sometimes referred to by the recently popular buzzword, "big data"). Within the energy industry there is increasing interest in tapping into the opportunities that these data can provide. What can we do with all of these data? The richness and granularity of these data enable many types of creative and cutting-edge analytics. Technically sophisticated and rigorous statistical techniques can be used to pull interesting insights out of this high-frequency, human-focused data. We at LBNL are calling this "behavior analytics". This kind of analytics has the potential to provide tremendous value to a wide range of energy programs. For example, highly disaggregated and heterogeneous information about actual energy use would allow energy efficiency (EE) and/or demand response (DR) program implementers to target specific programs to specific households; would enable evaluation, measurement, and verification (EM&V) of energy efficiency programs to be performed on a much shorter time horizon than was previously possible; and would provide better insights into the energy and peak-hour savings associated with specific types of EE and DR programs (e.g., behavior-based (BB) programs). In this series, "Insights from Smart Meters", we present concrete, illustrative examples of the type of value that insights from behavior analytics of these data can provide, as well as pointing out its limitations. We supply several types of key findings, including: novel results, which answer questions the industry previously was unable to answer; proof-of-concept analytics tools that can be adapted and used by others; and guidelines and protocols that summarize analytical best practices. This report focuses on one example of the kind of value that analysis of these data can provide: insights into whether behavior-based (BB) efficiency programs have the potential to provide peak-hour energy savings.
Review of quality assessment tools for the evaluation of pharmacoepidemiological safety studies
Neyarapally, George A; Hammad, Tarek A; Pinheiro, Simone P; Iyasu, Solomon
2012-01-01
Objectives Pharmacoepidemiological studies are an important hypothesis-testing tool in the evaluation of postmarketing drug safety. Despite the potential to produce robust value-added data, interpretation of findings can be hindered due to well-recognised methodological limitations of these studies. Therefore, assessment of their quality is essential to evaluating their credibility. The objective of this review was to evaluate the suitability and relevance of available tools for the assessment of pharmacoepidemiological safety studies. Design We created an a priori assessment framework consisting of reporting elements (REs) and quality assessment attributes (QAAs). A comprehensive literature search identified distinct assessment tools and the prespecified elements and attributes were evaluated. Primary and secondary outcome measures The primary outcome measure was the percentage representation of each domain, RE and QAA for the quality assessment tools. Results A total of 61 tools were reviewed. Most tools were not designed to evaluate pharmacoepidemiological safety studies. More than 50% of the reviewed tools considered REs under the research aims, analytical approach, outcome definition and ascertainment, study population and exposure definition and ascertainment domains. REs under the discussion and interpretation, results and study team domains were considered in less than 40% of the tools. Except for the data source domain, quality attributes were considered in less than 50% of the tools. Conclusions Many tools failed to include critical assessment elements relevant to observational pharmacoepidemiological safety studies and did not distinguish between REs and QAAs. Further, there is a lack of considerations on the relative weights of different domains and elements. The development of a quality assessment tool would facilitate consistent, objective and evidence-based assessments of pharmacoepidemiological safety studies. PMID:23015600
Tague, Lauren; Wiggs, Justin; Li, Qianxi; McCarter, Robert; Sherwin, Elizabeth; Weinberg, Jacqueline; Sable, Craig
2018-05-17
Left ventricular hypertrophy (LVH) is a common finding on pediatric electrocardiography (ECG), leading to many referrals for echocardiography (echo). This study utilizes a novel analytics tool that combines ECG and echo databases to evaluate ECG as a screening tool for LVH. A SQL Server 2012 data warehouse incorporated ECG and echo databases for all patients from a single institution from 2006 to 2016. Customized queries identified patients 0-18 years old with LVH on ECG and an echo performed within 24 h. Using data visualization (Tableau) and analytic (Stata 14) software, ECG and echo findings were compared. Of 437,699 encounters, 4637 met inclusion criteria. ECG had high sensitivity (≥ 90%) but poor specificity (43%) and low positive predictive value (< 20%) for echo abnormalities. ECG performed only 11-22% better than chance (AROC = 0.50). 83% of subjects with LVH on ECG had normal left ventricle (LV) structure and size on echo. African-Americans with LVH were least likely to have an abnormal echo. There was a low correlation between the V6 R-wave amplitude on ECG and the echo-derived Z score of LV diastolic diameter (r = 0.14) and LV mass index (r = 0.24). The data analytics client was able to mine a database of ECG and echo reports, comparing LVH by ECG against LV measurements and qualitative findings by echo, and identified an abnormal LV by echo in only 17% of cases with LVH on ECG. This novel tool is useful for rapid data mining for both clinical and research endeavors.
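The screening-performance figures quoted here follow from a standard 2x2 confusion-matrix calculation; the sketch below uses invented counts chosen only to reproduce the same qualitative pattern, not the study's 4637 encounters.

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value from a 2x2 table
    of ECG-LVH (test) versus abnormal echo (reference)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return sensitivity, specificity, ppv

if __name__ == "__main__":
    # Hypothetical counts for illustration only
    tp, fp, fn, tn = 90, 470, 10, 350
    sens, spec, ppv = screening_metrics(tp, fp, fn, tn)
    print(f"sensitivity={sens:.0%}  specificity={spec:.0%}  PPV={ppv:.0%}")
```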
Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen
2014-07-01
Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware is introduced that enables the applicability of this methodology for selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for a peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and for just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
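Stripped of the hardware integration, the core of such an approach is a partial least squares regression from absorbance spectra to individual protein concentrations; the sketch below uses scikit-learn on simulated spectra, so the pure-component bands, noise level, and concentrations are assumptions, not the authors' calibration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
wavelengths = np.linspace(240, 320, 200)      # nm, assumed diode-array range

def gaussian_band(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Synthetic pure-component spectra standing in for three co-eluting proteins
pure = np.stack([gaussian_band(258, 12), gaussian_band(278, 10), gaussian_band(290, 14)])

# Calibration set: random concentration mixtures plus measurement noise
conc_cal = rng.uniform(0, 2.0, size=(60, 3))                 # g/L
spectra_cal = conc_cal @ pure + rng.normal(0, 0.01, (60, len(wavelengths)))

pls = PLSRegression(n_components=3)
pls.fit(spectra_cal, conc_cal)

# "Inline" prediction on a new co-elution point
conc_true = np.array([[0.8, 0.3, 1.1]])
spectrum = conc_true @ pure + rng.normal(0, 0.01, (1, len(wavelengths)))
print("predicted g/L:", np.round(pls.predict(spectrum), 2), " true:", conc_true)
```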
Recent Methodology in Ginseng Analysis
Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill
2012-01-01
Matching its popularity in herbal prescriptions and remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to hereafter as ginseng analysis, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade, including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112
The Distributed Geothermal Market Demand Model (dGeo): Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCabe, Kevin; Mooney, Meghan E; Sigrin, Benjamin O
The National Renewable Energy Laboratory (NREL) developed the Distributed Geothermal Market Demand Model (dGeo) as a tool to explore the potential role of geothermal distributed energy resources (DERs) in meeting thermal energy demands in the United States. The dGeo model simulates the potential for deployment of geothermal DERs in the residential and commercial sectors of the continental United States for two specific technologies: ground-source heat pumps (GHP) and geothermal direct use (DU) for district heating. To quantify the opportunity space for these technologies, dGeo leverages a highly resolved geospatial database and robust bottom-up, agent-based modeling framework. This design is consistent with others in the family of Distributed Generation Market Demand models (dGen; Sigrin et al. 2016), including the Distributed Solar Market Demand (dSolar) and Distributed Wind Market Demand (dWind) models. dGeo is intended to serve as a long-term scenario-modeling tool. It has the capability to simulate the technical potential, economic potential, market potential, and technology deployment of GHP and DU through the year 2050 under a variety of user-defined input scenarios. Through these capabilities, dGeo can provide substantial analytical value to various stakeholders interested in exploring the effects of various techno-economic, macroeconomic, financial, and policy factors related to the opportunity for GHP and DU in the United States. This report documents the dGeo modeling design, methodology, assumptions, and capabilities.
Comprehensive characterizations of nanoparticle biodistribution following systemic injection in mice
NASA Astrophysics Data System (ADS)
Liao, Wei-Yin; Li, Hui-Jing; Chang, Ming-Yao; Tang, Alan C. L.; Hoffman, Allan S.; Hsieh, Patrick C. H.
2013-10-01
Various nanoparticle (NP) properties such as shape and surface charge have been studied in an attempt to enhance the efficacy of NPs in biomedical applications. When trying to determine the precise biodistribution of NPs within the target organs, the analytical method becomes the determining factor in measuring the precise quantity of distributed NPs. High performance liquid chromatography (HPLC) represents a more powerful tool for quantifying NP biodistribution than conventional analytical methods such as an in vivo imaging system (IVIS). This, in part, is due to the better curve linearity offered by HPLC than by IVIS. Furthermore, HPLC enables us to fully analyze the NPs present in each gram of organ tissue without compromising the signals and the depth-related sensitivity, as is the case in IVIS measurements. In addition, we found that changing physiological conditions improved large NP (200-500 nm) distribution in brain tissue. These results reveal the importance of selecting analytic tools and the physiological environment when characterizing NP biodistribution for future nanoscale toxicology, therapeutics, and diagnostics.
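The curve-linearity argument reduces to fitting a calibration line of detector response against known NP amounts, checking R^2, and then inverting the line for unknowns; the toy calibration below uses invented values for illustration.

```python
import numpy as np

# Hypothetical calibration standards: NP amount injected (ug) vs. HPLC peak area
amount_ug = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
peak_area = np.array([1.1e4, 2.2e4, 4.3e4, 1.08e5, 2.15e5, 4.40e5])

# Least-squares calibration line and coefficient of determination
slope, intercept = np.polyfit(amount_ug, peak_area, 1)
pred = slope * amount_ug + intercept
r_squared = 1 - np.sum((peak_area - pred) ** 2) / np.sum((peak_area - peak_area.mean()) ** 2)

# Quantify an unknown sample (e.g. digest of one gram of tissue) from its peak area
unknown_area = 1.6e5
print(f"R^2 = {r_squared:.4f}; estimated NP amount = "
      f"{(unknown_area - intercept) / slope:.2f} ug")
```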
ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.
Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa
2016-05-01
The automated analysis of indirect immunofluorescence images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of the intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. Automatization of the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the work-flow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, which categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, which splits the input slide into individual HEp-2 cells; and (iii) a Pattern Classifier module, which determines the fluorescent pattern of the slide based on the patterns of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracies of fluorescent intensity and pattern classification of around 85% and above 90%, respectively. We assessed all results by comparison with some of the most representative state-of-the-art works. Unlike most other works in the recent literature, ANAlyte aims at the automatization of all the major steps of ANA image analysis. Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of intensity and fluorescent pattern with accuracy better than or comparable to state-of-the-art techniques, even when such techniques are run on manually segmented cells. Hence, ANAlyte can be proposed as a valid solution to the problem of ANA testing automatization. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
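The modular structure described, intensity classification, cell segmentation, and per-cell pattern classification aggregated to a slide label, can be expressed as a simple pipeline skeleton; the class and method names below are illustrative placeholders, not the authors' code, and the module internals are deliberately left unimplemented.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class SlideResult:
    intensity: str           # e.g. "positive" / "weak positive" / "negative"
    pattern: str             # e.g. "homogeneous", "speckled", ...

class IntensityClassifier:
    def classify(self, slide_image) -> str:
        raise NotImplementedError   # multi-scale contrast assessment in the paper

class CellSegmenter:
    def segment(self, slide_image) -> list:
        raise NotImplementedError   # returns individual HEp-2 cell crops

class PatternClassifier:
    def classify(self, cell_image) -> str:
        raise NotImplementedError   # per-cell fluorescent pattern label

class ANALytePipeline:
    """Chains the three modules; the slide-level pattern is taken here as the
    majority vote over per-cell predictions."""
    def __init__(self, intensity, segmenter, pattern):
        self.intensity, self.segmenter, self.pattern = intensity, segmenter, pattern

    def run(self, slide_image) -> SlideResult:
        level = self.intensity.classify(slide_image)
        cells = self.segmenter.segment(slide_image)
        votes = Counter(self.pattern.classify(c) for c in cells)
        return SlideResult(level, votes.most_common(1)[0][0])
```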
NASA Astrophysics Data System (ADS)
Hu, Xian-Quan; Luo, Guang; Cui, Li-Peng; Li, Fang-Yu; Niu, Lian-Bin
2009-03-01
The analytic solution of the radial Schrödinger equation is studied in this article by using the tight coupling condition of several positive-power and inverse-power potential functions. Furthermore, the precise analytic solutions, and the conditions that decide the existence of an analytic solution, have been found when the potential of the radial Schrödinger equation is V(r) = α1r^8 + α2r^3 + α3r^2 + β3r^-1 + β2r^-3 + β1r^-4. Generally speaking, there is only an approximate solution, not an analytic solution, for the Schrödinger equation with a superposition of several potentials. However, the conditions that decide the existence of an analytic solution have been found, and the analytic solution and its energy level structure are obtained for the Schrödinger equation with the potential mentioned above. According to the single-valued, finite, and continuous standard for the wave function in a quantum system, the authors first solve the asymptotic solutions for r → ∞ and r → 0; secondly, they match the asymptotic solutions with the series solutions in the neighborhood of the irregular singularities; they then compare the power series coefficients and deduce a series of analytic solutions of the stationary-state wave function and the corresponding energy level structure through tight coupling among the coefficients of the potential functions for the radial Schrödinger equation; and lastly, they discuss the solutions and draw conclusions.
Constraint-Referenced Analytics of Algebra Learning
ERIC Educational Resources Information Center
Sutherland, Scot M.; White, Tobin F.
2016-01-01
The development of the constraint-referenced analytics tool for monitoring algebra learning activities presented here came from the desire, first, to take a more quantitative look at student responses in collaborative algebra activities, and second, to situate those activities in a more traditional introductory algebra setting focusing on…
Towards an Analytic Foundation for Network Architecture
2010-12-31
In this project, we develop the analytic tools of stochastic optimization for wireless network design and apply them... and Mung Chiang, "DaVinci: Dynamically Adaptive Virtual Networks for a Customized Internet," in Proc. ACM SIGCOMM CoNext Conference, December 2008.
University Macro Analytic Simulation Model.
ERIC Educational Resources Information Center
Baron, Robert; Gulko, Warren
The University Macro Analytic Simulation System (UMASS) has been designed as a forecasting tool to help university administrators make budgeting decisions. Alternative budgeting strategies can be tested on a computer model, and an operational alternative can then be selected on the basis of the most desirable projected outcome. UMASS uses readily…
NASA Astrophysics Data System (ADS)
Kerst, Stijn; Shyrokau, Barys; Holweg, Edward
2018-05-01
This paper proposes a novel semi-analytical bearing model addressing the flexibility of the bearing outer race structure, and it presents the application of this model in a bearing load condition monitoring approach. The bearing model was developed because current computationally low-cost bearing models, with their assumptions of rigidity, fail to provide an accurate description of the increasingly common flexible, size- and weight-optimized bearing designs. In the proposed bearing model, raceway flexibility is described by the use of static deformation shapes. The excitation of the deformation shapes is calculated based on the modelled rolling element loads and a Fourier-series-based compliance approximation. The resulting model is computationally low cost and provides an accurate description of the rolling element loads for flexible outer raceway structures. The latter is validated by a simulation-based comparison with a well-established bearing simulation software tool. An experimental study finally shows the potential of the proposed model in a bearing load monitoring approach.
Measures of societal risk and their potential use in civil aviation.
Horn, Mark E T; Fulton, Neale; Westcott, Mark
2008-12-01
This article seeks to clarify the conceptual foundations of measures of societal risk, to investigate how such measures may be used validly in commonly encountered policy contexts, and to explore the application of these measures in the field of civil aviation. The article begins by examining standard measures of societal and individual risk (SR and IR), with attention given to ethical as well as analytical considerations. A comprehensive technical analysis of SR is provided, encompassing scalar risk measures, barrier functions, and a utility-based formulation, and clarifications are offered with respect to the treatment of SR in recent publications. The policy context for SR measures is shown to be critically important, and an extension to a hierarchical setting is developed. The prospects for applying SR to civil aviation are then considered, and some technical and conceptual issues are identified. SR appears to be a useful analytical tool in this context, provided that careful attention is given to these issues.
Shared decision-making – transferring research into practice: the Analytic Hierarchy Process (AHP)
Dolan, James G.
2008-01-01
Objective To illustrate how the Analytic Hierarchy Process (AHP) can be used to promote shared decision-making and enhance clinician-patient communication. Methods Tutorial review. Results The AHP promotes shared decision making by creating a framework that is used to define the decision, summarize the information available, prioritize information needs, elicit preferences and values, and foster meaningful communication among decision stakeholders. Conclusions The AHP and related multi-criteria methods have the potential for improving the quality of clinical decisions and overcoming current barriers to implementing shared decision making in busy clinical settings. Further research is needed to determine the best way to implement these tools and to determine their effectiveness. Practice Implications Many clinical decisions involve preference-based trade-offs between competing risks and benefits. The AHP is a well-developed method that provides a practical approach for improving patient-provider communication, clinical decision-making, and the quality of patient care in these situations. PMID:18760559
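A minimal sketch of the AHP arithmetic behind such a shared decision is given below, assuming the standard principal-eigenvector prioritization and Saaty's consistency index; the clinical criteria and pairwise judgments are invented for illustration.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Priority weights from the principal eigenvector of a pairwise comparison
    matrix, plus Saaty's consistency ratio."""
    a = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(a)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = a.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)               # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)       # Saaty's random index
    return w, ci / ri

if __name__ == "__main__":
    # Hypothetical patient comparison of three criteria:
    # symptom relief vs. side-effect risk vs. cost
    pairwise = [[1,     3,   5],
                [1/3,   1,   2],
                [1/5, 1/2,   1]]
    weights, cr = ahp_priorities(pairwise)
    print("weights:", np.round(weights, 3), " consistency ratio:", round(cr, 3))
```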
Bulbul, Gonca; Chaves, Gepoliano; Olivier, Joseph; Ozel, Rifat Emrah; Pourmand, Nader
2018-06-06
Examining the behavior of a single cell within its natural environment is valuable for understanding both the biological processes that control the function of cells and how injury or disease lead to pathological change of their function. Single-cell analysis can reveal information regarding the causes of genetic changes, and it can contribute to studies on the molecular basis of cell transformation and proliferation. By contrast, whole tissue biopsies can only yield information on a statistical average of several processes occurring in a population of different cells. Electrowetting within a nanopipette provides a nanobiopsy platform for the extraction of cellular material from single living cells. Additionally, functionalized nanopipette sensing probes can differentiate analytes based on their size, shape or charge density, making the technology uniquely suited to sensing changes in single-cell dynamics. In this review, we highlight the potential of nanopipette technology as a non-destructive analytical tool to monitor single living cells, with particular attention to integration into applications in molecular biology.
MALDI mass spectrometry imaging, from its origins up to today: the state of the art.
Francese, Simona; Dani, Francesca R; Traldi, Pietro; Mastrobuoni, Guido; Pieraccini, Giuseppe; Moneti, Gloriano
2009-02-01
Mass spectrometry (MS) has a number of features, namely sensitivity, high dynamic range, high resolution, and versatility, which make it a very powerful analytical tool for a wide spectrum of applications spanning all the life science fields. Among all MS techniques, MALDI imaging mass spectrometry (MALDI MSI) is currently one of the most exciting, both for its rapid technological improvements and for its great potential in high-impact bioscience fields. Here, the general principles of MALDI MSI are described, along with technical and instrumental details and application examples. Imaging MS instruments and imaging mass spectrometric techniques other than MALDI are presented, along with examples of their use. As well as reporting MSI successes in several bioscience fields, an attempt is made to take stock of what has been achieved so far with this technology and to discuss the analytical and technological advances required for MSI to be applied as a routine technique in clinical diagnostics, clinical monitoring, and drug discovery.
Yavorskyy, Alexander; Hernandez-Santana, Aaron; McCarthy, Geraldine
2008-01-01
Clinically, osteoarthritis (OA) is characterised by joint pain, stiffness after immobility, limitation of movement and, in many cases, the presence of basic calcium phosphate (BCP) crystals in the joint fluid. The detection of BCP crystals in the synovial fluid of patients with OA is fraught with challenges due to the submicroscopic size of BCP, the complex nature of the matrix in which they are found and the fact that other crystals can co-exist with them in cases of mixed pathology. Routine analysis of joint crystals still relies almost exclusively on the use of optical microscopy, which has limited applicability for BCP crystal identification due to limited resolution and the inherent subjectivity of the technique. The purpose of this Critical Review is to present an overview of some of the main analytical tools employed in the detection of BCP to date and the potential of emerging technologies such as atomic force microscopy (AFM) and Raman microspectroscopy for this purpose. PMID:18299743
Applications of Raman Spectroscopy in Biopharmaceutical Manufacturing: A Short Review.
Buckley, Kevin; Ryder, Alan G
2017-06-01
The production of active pharmaceutical ingredients (APIs) is currently undergoing its biggest transformation in a century. The changes are based on the rapid and dramatic introduction of protein- and macromolecule-based drugs (collectively known as biopharmaceuticals) and can be traced back to the huge investment in biomedical science (in particular in genomics and proteomics) that has been ongoing since the 1970s. Biopharmaceuticals (or biologics) are manufactured using biological-expression systems (such as mammalian, bacterial, insect cells, etc.) and have spawned a large (>€35 billion sales annually in Europe) and growing biopharmaceutical industry (BioPharma). The structural and chemical complexity of biologics, combined with the intricacy of cell-based manufacturing, imposes a huge analytical burden to correctly characterize and quantify both processes (upstream) and products (downstream). In small molecule manufacturing, advances in analytical and computational methods have been extensively exploited to generate process analytical technologies (PAT) that are now used for routine process control, leading to more efficient processes and safer medicines. In the analytical domain, biologic manufacturing is considerably behind and there is both a huge scope and need to produce relevant PAT tools with which to better control processes, and better characterize product macromolecules. Raman spectroscopy, a vibrational spectroscopy with a number of useful properties (nondestructive, non-contact, robustness) has significant potential advantages in BioPharma. Key among them are intrinsically high molecular specificity, the ability to measure in water, the requirement for minimal (or no) sample pre-treatment, the flexibility of sampling configurations, and suitability for automation. Here, we review and discuss a representative selection of the more important Raman applications in BioPharma (with particular emphasis on mammalian cell culture). The review shows that the properties of Raman have been successfully exploited to deliver unique and useful analytical solutions, particularly for online process monitoring. However, it also shows that its inherent susceptibility to fluorescence interference and the weakness of the Raman effect mean that it can never be a panacea. In particular, Raman-based methods are intrinsically limited by the chemical complexity and wide analyte-concentration-profiles of cell culture media/bioprocessing broths which limit their use for quantitative analysis. Nevertheless, with appropriate foreknowledge of these limitations and good experimental design, robust analytical methods can be produced. In addition, new technological developments such as time-resolved detectors, advanced lasers, and plasmonics offer potential of new Raman-based methods to resolve existing limitations and/or provide new analytical insights.
Initiating an Online Reputation Monitoring System with Open Source Analytics Tools
NASA Astrophysics Data System (ADS)
Shuhud, Mohd Ilias M.; Alwi, Najwa Hayaati Md; Halim, Azni Haslizan Abd
2018-05-01
Online reputation is an invaluable asset for modern organizations, as it can help business performance, especially sales and profit. However, if an organization is not aware of its reputation, it is difficult to maintain it. Social media analytics is a new tool that can provide online reputation monitoring in various ways, such as sentiment analysis. As a result, numerous large-scale organizations have implemented Online Reputation Monitoring (ORM) systems. However, this solution need not be exclusive to high-income organizations, as many organizations, regardless of size and type, are now online. This research attempts to propose an affordable and reliable ORM system using a combination of open source analytics tools, for both novice practitioners and academicians. We also evaluate its prediction accuracy and find that the system provides acceptable predictions (sixty percent accuracy), with its majority-polarity predictions tallying with human annotation. The proposed system can help support business decisions with flexible monitoring strategies, especially for organizations that want to initiate and administer ORM themselves at low cost.
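As one concrete example of the open source building blocks such a system can combine, the snippet below scores post polarity with NLTK's VADER analyzer and tallies the majority polarity over a batch of mentions; the brand and posts are invented, and this is not necessarily the tool chain used in the study.

```python
from collections import Counter

import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

def polarity(text, pos_threshold=0.05, neg_threshold=-0.05):
    """Map VADER's compound score to a coarse polarity label."""
    score = sia.polarity_scores(text)["compound"]
    if score >= pos_threshold:
        return "positive"
    if score <= neg_threshold:
        return "negative"
    return "neutral"

if __name__ == "__main__":
    # Invented social-media mentions of a hypothetical brand
    mentions = [
        "Great customer service from AcmeBank today, really impressed!",
        "AcmeBank's app keeps crashing, so frustrating.",
        "Visited an AcmeBank branch this morning.",
    ]
    labels = [polarity(m) for m in mentions]
    print(Counter(labels).most_common())
```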