Development of computer-based analytical tool for assessing physical protection system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mardhi, Alim, E-mail: alim-m@batan.go.id; Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok 10330, Thailand; Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com
Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way to examine likely threat scenarios. Several currently available tools, such as EASI and SAPE, can be used immediately; however, for our research purposes it is more suitable to have a tool that can be customized and further enhanced. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can identify the most critical path and quantify the probability of system effectiveness as a performance measure.
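The abstract does not reproduce the underlying model, but the detection/delay/response calculation that EASI-style tools perform can be sketched in a few lines. The Python sketch below uses hypothetical path elements, detection probabilities, delays, and response time; it illustrates the general approach, not the authors' implementation.

```python
# Minimal sketch of an EASI-style adversary-path calculation (hypothetical
# numbers). The system "wins" if detection occurs at an element where the
# delay remaining downstream still exceeds the guard-force response time.
RESPONSE_TIME = 300  # seconds, assumed guard response time

# (name, p_detect, delay_seconds) for one candidate adversary path
path = [
    ("perimeter fence", 0.5, 10),
    ("outer door",      0.9, 60),
    ("vault door",      0.8, 240),
]

def interruption_probability(path, response_time):
    """P(interrupt) = chance of first detection at an element where the
    remaining delay still exceeds the response time."""
    p_not_detected_yet = 1.0
    p_interrupt = 0.0
    for i, (name, p_det, _) in enumerate(path):
        remaining_delay = sum(d for _, _, d in path[i:])
        if remaining_delay > response_time:  # timely detection point
            p_interrupt += p_not_detected_yet * p_det
        p_not_detected_yet *= 1.0 - p_det
    return p_interrupt

print(f"P(interruption) = {interruption_probability(path, RESPONSE_TIME):.3f}")
```

Enumerating every path through the protection network the same way and taking the minimum interruption probability identifies the most critical path.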
Visual analytics for aviation safety: A collaborative approach to sensemaking
NASA Astrophysics Data System (ADS)
Wade, Andrew
Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.
FPI: FM Success through Analytics
ERIC Educational Resources Information Center
Hickling, Duane
2013-01-01
The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…
Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring
Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia
2010-01-01
The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted great interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have proven to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization, and the applications carried out by the Spanish scientific community is presented. A brief explanation of theoretical aspects that highlight their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551
Understanding Education Involving Geovisual Analytics
ERIC Educational Resources Information Center
Stenliden, Linnea
2013-01-01
Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…
Northwest Trajectory Analysis Capability: A Platform for Enhancing Computational Biophysics Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, Elena S.; Stephan, Eric G.; Corrigan, Abigail L.
2008-07-30
As computational resources continue to increase, the ability of computational simulations to effectively complement, and in some cases replace, experimentation in scientific exploration also increases. Today, large-scale simulations are recognized as an effective tool for scientific exploration in many disciplines, including chemistry and biology. A natural side effect of this trend has been the need for an increasingly complex analytical environment. In this paper, we describe the Northwest Trajectory Analysis Capability (NTRAC), an analytical software suite developed to enhance the efficiency of computational biophysics analyses. Our strategy is to layer higher-level services and introduce improved tools within the user's familiar environment without preventing researchers from using traditional tools and methods. Our desire is to share these experiences to serve as an example for effectively analyzing data from large-scale, data-intensive simulations.
Accelerated bridge construction (ABC) decision making and economic modeling tool.
DOT National Transportation Integrated Search
2011-12-01
In this FHWA-sponsored pooled-fund study, a set of decision-making tools based on the Analytic Hierarchy Process (AHP) was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...
Visualizing Qualitative Information
ERIC Educational Resources Information Center
Slone, Debra J.
2009-01-01
The abundance of qualitative data in today's society and the need to easily scrutinize, digest, and share this information calls for effective visualization and analysis tools. Yet, no existing qualitative tools have the analytic power, visual effectiveness, and universality of familiar quantitative instruments like bar charts, scatter-plots, and…
Analytical framework and tool kit for SEA follow-up
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran
2009-04-15
Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature, although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up (scoping, analysis and learning), identifies the key functions, and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.
ERIC Educational Resources Information Center
Kim, Jeonghyun; Jo, Il-Hyun; Park, Yeonjeong
2016-01-01
The learning analytics dashboard (LAD) is a newly developed learning support tool for virtual classrooms that is believed to allow students to review their online learning behavior patterns intuitively through the provision of visual information. The purpose of this study was to empirically validate the effects of LAD. An experimental study was…
A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.
Płotka-Wasylka, J
2018-05-01
A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific pictogram composed of five pentagrams is used to evaluate and quantify the environmental impact of each step of an analytical methodology, with colors from green through yellow to red depicting low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples and in polycyclic aromatic hydrocarbon determinations by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
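As a rough illustration of the kind of per-step rating GAPI encodes, the sketch below maps impact scores for each stage of a procedure to the green/yellow/red scale described above; the stages and ratings are invented, not taken from the paper.

```python
# Hypothetical sketch of a GAPI-like per-step impact rating: each stage of
# an analytical procedure gets a low/medium/high environmental-impact score,
# rendered as green/yellow/red in the published pictogram.
IMPACT = {1: "green (low)", 2: "yellow (medium)", 3: "red (high)"}

procedure = {  # illustrative ratings, not from the paper
    "sample collection":   1,
    "preservation":        2,
    "transport":           1,
    "extraction solvents": 3,
    "final determination": 2,
}

for step, score in procedure.items():
    print(f"{step:<20} -> {IMPACT[score]}")
```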
The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics
Ola, Oluwakemi; Sedig, Kamran
2014-01-01
Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376
Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO
NASA Technical Reports Server (NTRS)
Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.
2016-01-01
A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
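The validation of analytic derivatives against finite-difference approximations described above can be illustrated with a toy example; the function below is a stand-in, not the CEA thermodynamic model.

```python
# Sketch of the validation step: compare an analytic derivative with a
# central finite-difference approximation on a toy function.
import math

f = lambda x: x * math.exp(-x)           # toy model output
dfdx = lambda x: (1 - x) * math.exp(-x)  # its analytic derivative

x, h = 1.7, 1e-6
fd = (f(x + h) - f(x - h)) / (2 * h)     # central difference
print(f"analytic={dfdx(x):.8f}  finite-diff={fd:.8f}  "
      f"|err|={abs(dfdx(x) - fd):.2e}")
```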
A consumer guide: tools to manage vegetation and fuels.
David L. Peterson; Louisa Evers; Rebecca A. Gravenmier; Ellen Eberhardt
2007-01-01
Current efforts to improve the scientific basis for fire management on public lands will benefit from more efficient transfer of technical information and tools that support planning, implementation, and effectiveness of vegetation and hazardous fuel treatments. The technical scope, complexity, and relevant spatial scale of analytical and decision support tools differ...
Applying Pragmatics Principles for Interaction with Visual Analytics.
Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac
2018-01-01
Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.
Investigating Analytic Tools for e-Book Design in Early Literacy Learning
ERIC Educational Resources Information Center
Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah
2009-01-01
Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…
The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools
ERIC Educational Resources Information Center
Yang, Min; Wong, Stephen C. P.; Coid, Jeremy
2010-01-01
Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…
33 CFR 385.33 - Revisions to models and analytical tools.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...
Developing Healthcare Data Analytics APPs with Open Data Science Tools.
Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong
2017-01-01
Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. Many tools, however, require researchers to develop programs in languages such as Python and R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them easily to other researchers, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.
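A minimal sketch of the notebook-to-app pattern the abstract describes, assuming a Jupyter environment with ipywidgets installed; the data frame, column names, and threshold logic are hypothetical.

```python
# Wrap an analytics step in ipywidgets controls so other researchers can
# re-run it interactively without touching the code.
import pandas as pd
from ipywidgets import interact

df = pd.DataFrame({"age": [34, 51, 29, 63], "glucose": [5.1, 7.8, 4.9, 9.2]})

@interact(threshold=(4.0, 10.0, 0.1))
def flag_high_glucose(threshold=7.0):
    """Re-runs on every slider move, giving real-time analysis."""
    return df[df["glucose"] >= threshold]
```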
Design and Implementation of a Learning Analytics Toolkit for Teachers
ERIC Educational Resources Information Center
Dyckhoff, Anna Lea; Zielke, Dennis; Bultmann, Mareike; Chatti, Mohamed Amine; Schroeder, Ulrik
2012-01-01
Learning Analytics can provide powerful tools for teachers in order to support them in the iterative process of improving the effectiveness of their courses and to collaterally enhance their students' performance. In this paper, we present the theoretical background, design, implementation, and evaluation details of eLAT, a Learning Analytics…
Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study
ERIC Educational Resources Information Center
Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek
2013-01-01
Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…
FIA BioSum: a tool to evaluate financial costs, opportunities and effectiveness of fuel treatments.
Jeremy Fried; Glenn Christensen
2004-01-01
FIA BioSum, a tool developed by the USDA Forest Service's Forest Inventory and Analysis (FIA) Program, generates reliable cost estimates, identifies opportunities and evaluates the effectiveness of fuel treatments in forested landscapes. BioSum is an analytic framework that integrates a suite of widely used computer models with a foundation of attribute-rich,...
Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek
2016-05-01
This study presents an application of the Hasse diagram technique (HDT) as an assessment tool for selecting the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology for benzo[a]pyrene determination in sediment samples. The HDT can serve as a good decision support tool for choosing the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
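The core ordering behind the Hasse diagram technique can be sketched briefly: one procedure ranks above another only if it is at least as good on every variable. The procedures and scores below are hypothetical, and only 3 of the paper's 11 variables are shown.

```python
# Hasse-style dominance check: A covers B only if A >= B on all variables
# and A != B; otherwise the pair is incomparable. Scores are illustrative
# (higher = better after orienting all variables the same way).
procs = {
    "method A": (3, 2, 2),
    "method B": (2, 2, 1),
    "method C": (3, 1, 3),
}

def dominates(a, b):
    return all(x >= y for x, y in zip(a, b)) and a != b

for na, va in procs.items():
    for nb, vb in procs.items():
        if dominates(va, vb):
            print(f"{na} dominates {nb}")   # A dominates B; C is incomparable
```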
Technical pre-analytical effects on the clinical biochemistry of Atlantic salmon (Salmo salar L.).
Braceland, M; Houston, K; Ashby, A; Matthews, C; Haining, H; Rodger, H; Eckersall, P D
2017-01-01
Clinical biochemistry has long been utilized in human and veterinary medicine as a vital diagnostic tool, but despite occasional studies showing its usefulness in monitoring health status in Atlantic salmon (Salmo salar L.), it has not yet been widely utilized within the aquaculture industry. This is due, in part, to a lack of an agreed protocol for collection and processing of blood prior to analysis. Moreover, while the analytical phase of clinical biochemistry is well controlled, there is a growing understanding that technical pre-analytical variables can influence analyte concentrations or activities. In addition, post-analytical interpretation of treatment effects is variable in the literature, thus making the true effect of sample treatment hard to evaluate. Therefore, a number of pre-analytical treatments have been investigated to examine their effect on analyte concentrations and activities. In addition, reference ranges for salmon plasma biochemical analytes have been established to inform veterinary practitioners and the aquaculture industry of the importance of clinical biochemistry in health and disease monitoring. Furthermore, a standardized protocol for blood collection has been proposed. © 2016 The Authors Journal of Fish Diseases Published by John Wiley & Sons Ltd.
Scalable Visual Analytics of Massive Textual Datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.
2007-04-01
This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
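The abstract does not give the parallelization details; the sketch below shows the generic shard-and-reduce pattern that lets an embarrassingly parallel text engine scale nearly linearly, with Python's multiprocessing standing in for the cluster implementation and term counting standing in for the real engine.

```python
# Shard documents across workers, process shards independently, then reduce.
from multiprocessing import Pool
from collections import Counter

def process_shard(docs):
    """Stand-in for the text engine: per-shard term counting."""
    c = Counter()
    for d in docs:
        c.update(d.lower().split())
    return c

if __name__ == "__main__":
    corpus = ["Visual analytics aids analysts", "Analysts explore large text"] * 1000
    shards = [corpus[i::4] for i in range(4)]   # 4 workers
    with Pool(4) as pool:
        partials = pool.map(process_shard, shards)
    totals = sum(partials, Counter())           # reduce step
    print(totals.most_common(3))
```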
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
The generation of criteria for selecting analytical tools for landscape management
Marilyn Duffey-Armstrong
1979-01-01
This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...
ERIC Educational Resources Information Center
Padgett, Ryan D.; Salisbury, Mark H.; An, Brian P.; Pascarella, Ernest T.
2010-01-01
The sophisticated analytical techniques available to institutional researchers give them an array of procedures to estimate a causal effect using observational data. But as many quantitative researchers have discovered, access to a wider selection of statistical tools does not necessarily ensure construction of a better analytical model. Moreover,…
ERIC Educational Resources Information Center
Schoendorff, Benjamin; Steinwachs, Joanne
2012-01-01
How can therapists be effectively trained in clinical functional contextualism? In this conceptual article we propose a new way of training therapists in Acceptance and Commitment Therapy skills using tools from Functional Analytic Psychotherapy in a training context functionally similar to the therapeutic relationship. FAP has been successfully…
Spectral radiation analyses of the GOES solar illuminated hexagonal cell scan mirror back
NASA Technical Reports Server (NTRS)
Fantano, Louis G.
1993-01-01
A ray tracing analytical tool has been developed for the simulation of spectral radiation exchange in complex systems. Algorithms are used to account for heat source spectral energy, surface directional radiation properties, and surface spectral absorptivity properties. This tool has been used to calculate the effective solar absorptivity of the Geostationary Operational Environmental Satellite (GOES) scan mirror in the calibration position. The development and design of the Sounder and Imager instruments on board GOES are reviewed, and the problem of calculating the effective solar absorptivity associated with the GOES hexagonal cell configuration is presented. The analytical methodology, based on the Monte Carlo ray tracing technique, is described, and results are presented and verified by experimental measurements for selected solar incidence angles.
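A highly simplified sketch of the Monte Carlo idea: a ray bouncing inside a cavity gets repeated chances at absorption, so the effective absorptivity exceeds the surface value. The geometry is collapsed to an assumed bounce count here; the real tool traces rays through the hexagonal cell geometry.

```python
# Monte Carlo estimate of effective absorptivity for a cavity in which each
# ray makes a fixed number of wall hits (hypothetical stand-in for ray
# tracing). Each hit absorbs the ray with probability alpha.
import random

def effective_absorptivity(alpha, bounces, n_rays=100_000):
    absorbed = 0
    for _ in range(n_rays):
        for _ in range(bounces):
            if random.random() < alpha:
                absorbed += 1
                break
    return absorbed / n_rays

# Expect about 1 - (1 - 0.3)**3 = 0.657, well above the surface value 0.3.
print(effective_absorptivity(alpha=0.3, bounces=3))
```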
Laboratory, Field, and Analytical Procedures for Using ...
Regardless of the remedial technology invoked to address contaminated sediments in the environment, there is a critical need to have tools for assessing the effectiveness of the remedy. In the past, these tools have included chemical and biomonitoring of the water column and sediments, toxicity testing and bioaccumulation studies performed on site sediments, and application of partitioning, transport and fate modeling. All of these tools served as lines of evidence for making informed environmental management decisions at contaminated sediment sites. In the last ten years, a new tool for assessing remedial effectiveness has gained a great deal of attention. Passive sampling offers a tool capable of measuring the freely dissolved concentration (Cfree) of legacy contaminants in water and sediments. In addition to assessing the effectiveness of the remedy, passive sampling can be applied for a variety of other contaminated sediments site purposes involved with performing the preliminary assessment and site inspection, conducting the remedial investigation and feasibility study, preparing the remedial design, and assessing the potential for contaminant bioaccumulation. While there is a distinct need for using passive sampling at contaminated sediments sites and several previous documents and research articles have discussed various aspects of passive sampling, there has not been definitive guidance on the laboratory, field and analytical procedures for using passive sampling.
Durham, Erin-Elizabeth A; Yu, Xiaxia; Harrison, Robert W
2014-12-01
Effective machine learning handles large datasets efficiently. One key to handling large data is the use of databases such as MySQL. The freeware fuzzy decision tree induction tool, FDT, is a scalable supervised-classification software tool implementing fuzzy decision trees. It is based on an optimized fuzzy ID3 (FID3) algorithm. FDT 2.0 improves upon FDT 1.0 by bridging the gap between data science and data engineering: it combines a robust decisioning tool with data retention for future decisions, so that the tool does not need to be recalibrated from scratch every time a new decision is required. In this paper we briefly review the analytical capabilities of the freeware FDT tool and its major features and functionalities; examples of large biological datasets from HIV, microRNAs and sRNAs are included. This work shows how to integrate fuzzy decision algorithms with modern database technology.
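The combination the abstract describes, fuzzy decisioning plus database retention, can be sketched as follows; sqlite3 stands in for MySQL, and the membership function, schema, and values are illustrative, not FDT's actual design.

```python
# Fuzzy membership (the basis of fuzzy ID3) plus database retention so that
# past decisions persist and need not be recomputed from scratch.
import sqlite3

def tri_membership(x, lo, peak, hi):
    """Triangular fuzzy membership degree in [0, 1]."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (peak - lo) if x < peak else (hi - x) / (hi - peak)

mu = tri_membership(x=0.42, lo=0.0, peak=0.5, hi=1.0)  # degree of "medium"

con = sqlite3.connect("fdt_decisions.db")
con.execute("CREATE TABLE IF NOT EXISTS decisions (feature REAL, mu REAL)")
con.execute("INSERT INTO decisions VALUES (?, ?)", (0.42, mu))
con.commit()  # retained for future decisions, no recalibration from scratch
print(con.execute("SELECT * FROM decisions").fetchall())
con.close()
```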
Kangas, Michael J; Burks, Raychelle M; Atwater, Jordyn; Lukowicz, Rachel M; Garver, Billy; Holmes, Andrea E
2018-02-01
With the increasing availability of digital imaging devices, colorimetric sensor arrays are rapidly becoming a simple yet effective tool for the identification and quantification of various analytes. Colorimetric arrays combine data from many colorimetric sensors, and the multidimensional nature of the resulting data necessitates the use of chemometric analysis. Herein, an 8-sensor colorimetric array was used to analyze select acidic and basic samples (0.5-10 M) to determine which chemometric methods are best suited for classification and quantification of analytes within clusters. PCA, HCA, and LDA were used to visualize the data set. All three methods showed well-separated clusters for each of the acid or base analytes and moderate separation between analyte concentrations, indicating that the sensor array can be used to identify and quantify samples. Furthermore, PCA could be used to determine which sensors showed the most effective analyte identification. LDA, KNN, and HQI were used for identification of analytes and concentrations. HQI and KNN correctly identified the analytes in all cases, while LDA identified 95 of 96 analytes correctly. Additional studies demonstrated that controlling for solvent and image effects was unnecessary for all chemometric methods utilized in this study.
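A minimal sketch of this chemometric pipeline using scikit-learn, with random numbers standing in for the 8-sensor RGB readings; the class structure is hypothetical.

```python
# Flatten each array reading to RGB features, visualize with PCA, classify
# with KNN. Requires numpy and scikit-learn.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.random((96, 24))          # 96 samples x (8 sensors * RGB channels)
y = np.repeat(np.arange(8), 12)   # 8 analyte/concentration classes

X2 = PCA(n_components=2).fit_transform(X)   # coordinates for cluster plots
print("first two PCA coordinates:\n", X2[:2])

knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print("training accuracy:", knn.score(X, y))
```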
Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.
2016-01-01
The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research-oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision-making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.
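The value-of-information logic these decision analyses rest on can be made concrete with the expected value of perfect information (EVPI); the states, actions, and payoffs below are hypothetical.

```python
# EVPI = expected payoff if uncertainty were resolved before deciding,
# minus the best expected payoff of deciding now. Research that cannot
# change the optimal decision is worth nothing under this metric.
states = {"wet year": 0.4, "dry year": 0.6}          # P(state)
payoff = {                                           # action -> state -> value
    "restore wetland": {"wet year": 90, "dry year": 20},
    "acquire upland":  {"wet year": 50, "dry year": 60},
}

ev_now = max(sum(p * payoff[a][s] for s, p in states.items()) for a in payoff)
ev_perfect = sum(p * max(payoff[a][s] for a in payoff) for s, p in states.items())
print(f"EVPI = {ev_perfect - ev_now:.1f}")  # research is worth at most this
```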
Analytic Scattering and Refraction Models for Exoplanet Transit Spectra
NASA Astrophysics Data System (ADS)
Robinson, Tyler D.; Fortney, Jonathan J.; Hubbard, William B.
2017-12-01
Observations of exoplanet transit spectra are essential to understanding the physics and chemistry of distant worlds. The effects of opacity sources and many physical processes combine to set the shape of a transit spectrum. Two such key processes—refraction and cloud and/or haze forward-scattering—have seen substantial recent study. However, models of these processes are typically complex, which prevents their incorporation into observational analyses and standard transit spectrum tools. In this work, we develop analytic expressions that allow for the efficient parameterization of forward-scattering and refraction effects in transit spectra. We derive an effective slant optical depth that includes a correction for forward-scattered light, and present an analytic form of this correction. We validate our correction against a full-physics transit spectrum model that includes scattering, and we explore the extent to which the omission of forward-scattering effects may bias models. Also, we verify a common analytic expression for the location of a refractive boundary, which we express in terms of the maximum pressure probed in a transit spectrum. This expression is designed to be easily incorporated into existing tools, and we discuss how the detection of a refractive boundary could help indicate the background atmospheric composition by constraining the bulk refractivity of the atmosphere. Finally, we show that opacity from Rayleigh scattering and collision-induced absorption will outweigh the effects of refraction for Jupiter-like atmospheres whose equilibrium temperatures are above 400-500 K.
NASA Astrophysics Data System (ADS)
Sarni, W.
2017-12-01
Water scarcity and poor water quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. The public and private sectors need vastly improved data management and visualization tools. This is the WetDATA opportunity: to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3) global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. Roadmap: (1) a portal (www.wetdata.org) providing stakeholders with tools and resources to understand related water risks; (2) initial activities providing education, awareness, and tools that support the implementation of the Colorado State Water Plan; (3) leveraging the Western States Water Council Water Data Exchange database; (4) development of visualization, predictive analytics, and AI tools to engage stakeholders and provide actionable data and information. Tools: education, providing information on water issues and risks at the local, state, national, and global scales; visualizations, data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan; and predictive analytics, accessing publicly available water databases and using machine learning to develop water availability forecasting tools and time-lapse images to support city and urban planning.
Analytical Tools in School Finance Reform.
ERIC Educational Resources Information Center
Johns, R. L.
This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…
Analytical tools for the analysis of β-carotene and its degradation products
Stutz, H.; Bresgen, N.; Eckl, P. M.
2015-01-01
β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has centered attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects in two proband groups, that is, cigarette smokers and asbestos-exposed workers. Among other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene, which is beneficial for radical scavenging but is also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies of standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography ((U)HPLC), and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method validation. PMID:25867077
Enhance your team-based qualitative research.
Fernald, Douglas H; Duclos, Christine W
2005-01-01
Qualitative research projects often involve the collaborative efforts of a research team. Challenges inherent in teamwork include changes in membership and differences in analytical style, philosophy, training, experience, and skill. This article discusses teamwork issues and the tools and techniques used to improve team-based qualitative research. We drew on our experiences in working on numerous projects of varying size, duration, and purpose. Through trials of different tools and techniques, expert consultation, and review of the literature, we learned to improve how we build teams, manage information, and disseminate results. Attention given to team members and team processes is as important as choosing appropriate analytical tools and techniques. Attentive team leadership, commitment to early and regular team meetings, and discussion of roles, responsibilities, and expectations all help build more effective teams and establish clear norms. As data are collected and analyzed, it is important to anticipate potential problems arising from differing skills and styles and from how information and files are managed. Discuss analytical preferences and biases, and set clear guidelines and practices for how data will be analyzed and handled. As emerging ideas and findings disperse across team members, common tools (such as summary forms and data grids), coding conventions, intermediate goals or products, and regular documentation help capture essential ideas and insights. In a team setting, little should be left to chance. This article identifies ways to improve team-based qualitative research with a more considered and systematic approach. Qualitative researchers will benefit from further examination and discussion of effective, field-tested, team-based strategies.
Toward a New Approach to the Evaluation of a Digital Curriculum Using Learning Analytics
ERIC Educational Resources Information Center
Rangel, Virginia Snodgrass; Bell, Elizabeth R.; Monroy, Carlos; Whitaker, J. Reid
2015-01-01
Understanding how an educational intervention is implemented is essential to evaluating its effectiveness. With the increased use of digital tools in classrooms, however, traditional methods of measuring implementation fall short. Fortunately, there is a way to learn about the interactions that users have with digital tools that are embedded into…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasgupta, Aritra; Poco, Jorge; Bertini, Enrico
2016-01-01
The gap between the rate of large-scale data production and the rate at which data-driven scientific insights are generated has led to an analytical bottleneck in scientific domains such as climate and biology. This is primarily due to the lack of innovative analytical tools that can help scientists efficiently analyze and explore alternative hypotheses about the data and communicate their findings effectively to a broad audience. In this paper, by reflecting on a set of successful collaborative research efforts between a group of climate scientists and visualization researchers, we examine how interactive visualization can help reduce the analytical bottleneck for domain scientists.
Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier
2017-03-22
We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype open-source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drug and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
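The abstract does not specify which disproportionality statistic the tool uses; as one standard example, the proportional reporting ratio (PRR) can be computed from a 2x2 table of drug/event co-occurrence counts (hypothetical numbers below).

```python
# PRR from a 2x2 contingency table of citation counts.
a = 40      # citations: drug X with event Y
b = 960     # drug X, other events
c = 200     # other drugs, event Y
d = 98800   # other drugs, other events

prr = (a / (a + b)) / (c / (c + d))
print(f"PRR = {prr:.1f}")   # PRR >> 1 suggests a signal worth reviewer attention
```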
Analytical Chemistry in the Regulatory Science of Medical Devices.
Wang, Yi; Guan, Allan; Wickramasekara, Samanthi; Phillips, K Scott
2018-06-12
In the United States, regulatory science is the science of developing new tools, standards, and approaches to assess the safety, efficacy, quality, and performance of all Food and Drug Administration-regulated products. Good regulatory science facilitates consumer access to innovative medical devices that are safe and effective throughout the Total Product Life Cycle (TPLC). Because the need to measure things is fundamental to the regulatory science of medical devices, analytical chemistry plays an important role, contributing to medical device technology in two ways: It can be an integral part of an innovative medical device (e.g., diagnostic devices), and it can be used to support medical device development throughout the TPLC. In this review, we focus on analytical chemistry as a tool for the regulatory science of medical devices. We highlight recent progress in companion diagnostics, medical devices on chips for preclinical testing, mass spectrometry for postmarket monitoring, and detection/characterization of bacterial biofilm to prevent infections.
Is Working Memory Training Effective? A Meta-Analytic Review
ERIC Educational Resources Information Center
Melby-Lervag, Monica; Hulme, Charles
2013-01-01
It has been suggested that working memory training programs are effective both as treatments for attention-deficit/hyperactivity disorder (ADHD) and other cognitive disorders in children and as a tool to improve cognitive ability and scholastic attainment in typically developing children and adults. However, effects across studies appear to be…
A results-based process for evaluation of diverse visual analytics tools
NASA Astrophysics Data System (ADS)
Rubin, Gary; Berger, David H.
2013-05-01
With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.
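One of the metrics named above, a rank-based probability-of-detection curve, can be sketched directly; the detection scores and ground-truth labels below are synthetic.

```python
# Fraction of ground-truth objects found within the top-N ranked detections.
detections = [  # (confidence score, matches a truth object?)
    (0.95, True), (0.90, True), (0.80, False), (0.70, True),
    (0.60, False), (0.40, True), (0.30, False),
]
n_truth = 5  # ground-truth objects in the test data

detections.sort(key=lambda t: -t[0])   # rank by confidence
hits = 0
for rank, (_, is_hit) in enumerate(detections, start=1):
    hits += is_hit
    print(f"top-{rank}: Pd = {hits / n_truth:.2f}")
```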
A Progressive Approach to Teaching Analytics in the Marketing Curriculum
ERIC Educational Resources Information Center
Liu, Yiyuan; Levin, Michael A.
2018-01-01
With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…
Diving deeper into Zebrafish development of social behavior: analyzing high resolution data.
Buske, Christine; Gerlai, Robert
2014-08-30
Vertebrate model organisms have been utilized in high throughput screening, but only with substantial cost and human capital investment. The zebrafish is a vertebrate model species that is a promising and cost-effective candidate for efficient high throughput screening. Larval zebrafish have already been successfully employed in this regard (Lessman, 2011), but adult zebrafish also show great promise. High throughput screening requires the use of a large number of subjects and the collection of a substantial amount of data. Collection of data is only one of the demanding aspects of screening; in most screening approaches that involve behavioral data, the main bottleneck that slows throughput is the time-consuming analysis of the collected data. Some automated analytical tools do exist, but often they only work for one subject at a time, eliminating the possibility of fully utilizing zebrafish as a screening tool. This is a particularly important limitation for such complex phenotypes as social behavior. Testing multiple fish at a time can reveal complex social interactions, but it may also allow the identification of outliers from a group of mutagenized or pharmacologically treated fish. Here, we describe a novel method using a custom software tool developed within our laboratory, which enables tracking multiple fish, in combination with a sophisticated analytical approach for summarizing and analyzing high resolution behavioral data. This paper focuses on the latter, the analytic tool, which we have developed using the R programming language and environment for statistical computing. We argue that combining sophisticated data collection methods with appropriate analytical tools will propel zebrafish into the future of neurobehavioral genetic research. Copyright © 2014. Published by Elsevier B.V.
Median of patient results as a tool for assessment of analytical stability.
Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György
2015-06-15
In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes achieved on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
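A minimal sketch of the patient-median method, assuming pandas; the analyte, results, target, and bias limit are illustrative, not the paper's data.

```python
# Monthly medians of all patient results for one analyte, flagged against an
# allowable-bias limit derived from biological variation.
import pandas as pd

results = pd.DataFrame({
    "date": pd.to_datetime(["2015-01-05", "2015-01-20", "2015-02-03",
                            "2015-02-17", "2015-03-02", "2015-03-28"]),
    "sodium_mmol_L": [139, 141, 140, 142, 145, 146],
})
target, allowable_bias = 140.0, 0.023  # illustrative spec, not from the paper

results["month"] = results["date"].dt.to_period("M")
medians = results.groupby("month")["sodium_mmol_L"].median()
drift = (medians - target) / target
print(pd.DataFrame({"median": medians, "flagged": drift.abs() > allowable_bias}))
```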
Training the next generation analyst using red cell analytics
NASA Astrophysics Data System (ADS)
Graham, Meghan N.; Graham, Jacob L.
2016-05-01
We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and a methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools has created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized in the solution but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the areas of security, risk and intelligence training.
MemAxes: Visualization and Analytics for Characterizing Complex Memory Performance Behaviors.
Gimenez, Alfredo; Gamblin, Todd; Jusufi, Ilir; Bhatele, Abhinav; Schulz, Martin; Bremer, Peer-Timo; Hamann, Bernd
2018-07-01
Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Existing tools target only specific factors of memory performance, such as hardware layout, allocations, or access instructions. However, today's tools do not suffice to characterize the complex relationships between these factors. Further, they require advanced expertise to be used effectively. We present MemAxes, a tool based on a novel approach for analytic-driven visualization of memory performance data. MemAxes uniquely allows users to analyze the different aspects related to memory performance by providing multiple visual contexts for a centralized dataset. We define mappings of sampled memory access data to new and existing visual metaphors, each of which enables a user to perform different analysis tasks. We present methods to guide user interaction by scoring subsets of the data based on known performance problems. This scoring is used to provide visual cues and automatically extract clusters of interest. We designed MemAxes in collaboration with experts in HPC and demonstrate its effectiveness in case studies.
Reaction-based small-molecule fluorescent probes for chemoselective bioimaging
Chan, Jefferson; Dodani, Sheel C.; Chang, Christopher J.
2014-01-01
The dynamic chemical diversity of elements, ions and molecules that form the basis of life offers both a challenge and an opportunity for study. Small-molecule fluorescent probes can make use of selective, bioorthogonal chemistries to report on specific analytes in cells and in more complex biological specimens. These probes offer powerful reagents to interrogate the physiology and pathology of reactive chemical species in their native environments with minimal perturbation to living systems. This Review presents a survey of tools and tactics for using such probes to detect biologically important chemical analytes. We highlight design criteria for effective chemical tools for use in biological applications as well as gaps for future exploration. PMID:23174976
2005-01-01
Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQL Server 2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments are stored in the soybean genomics and microarray database (SGMD). A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB. PMID:16046824
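The cube itself was built in Analysis Services 2000; as a loose, hedged analog of the OLAP roll-up operations described, a pivot over a hypothetical long-format expression table might look like this (all column names and values invented):

```python
import pandas as pd

# Hypothetical long-format expression measurements from a time-series experiment
df = pd.DataFrame({
    "gene":      ["g1", "g1", "g2", "g2"],
    "timepoint": [6, 24, 6, 24],            # hours post-infection
    "genotype":  ["resistant", "resistant", "susceptible", "susceptible"],
    "log_ratio": [1.8, 2.4, -0.3, 0.1],
})

# OLAP-style roll-up: genes x (genotype, timepoint), aggregated by mean
cube = df.pivot_table(index="gene", columns=["genotype", "timepoint"],
                      values="log_ratio", aggfunc="mean")
print(cube)
```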
Experimental and analytical tools for evaluation of Stirling engine rod seal behavior
NASA Technical Reports Server (NTRS)
Krauter, A. I.; Cheng, H. S.
1979-01-01
The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.
Analytics for Cyber Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plantenga, Todd; Kolda, Tamara Gibson
2011-06-01
This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.
Solar Data and Tools: Resources for Researchers, Industry, and Developers
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-04-01
In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.
This is the first phase of a potentially multi-phase project aimed at identifying scientific methodologies that will lead to the development of innovative analytical tools supporting the analysis of control strategy effectiveness, namely, accountability. Significant reductions i...
Safety management of a complex R&D ground operating system
NASA Technical Reports Server (NTRS)
Connors, J.; Mauer, R. A.
1975-01-01
Report discusses safety program implementation for large R&D operating system. Analytical techniques are defined and suggested as tools for identifying potential hazards and determining means to effectively control or eliminate hazards.
NASA Astrophysics Data System (ADS)
Groeneweg, John F.; Sofrin, Thomas G.; Rice, Edward J.; Gliebe, Phillip R.
1991-08-01
Summarized here are key advances in experimental techniques and theoretical applications which point the way to a broad understanding and control of turbomachinery noise. On the experimental side, the development of effective inflow control techniques makes it possible to conduct, in ground-based facilities, definitive experiments in internally controlled blade row interactions. Results can now be valid indicators of flight behavior and can provide a firm base for comparison with analytical results. Inflow control coupled with detailed diagnostic tools such as blade pressure measurements can be used to uncover the more subtle mechanisms, such as rotor-strut interaction, which can set tone levels for some engine configurations. Initial mappings of rotor wake-vortex flow fields have provided a data base for a first-generation semiempirical flow disturbance model. Laser velocimetry offers a nonintrusive method for validating and improving the model. Digital data systems and signal processing algorithms are bringing mode measurement closer to a working tool that can be frequently applied to a real machine such as a turbofan engine. On the analytical side, models of most of the links in the chain from turbomachine blade source to far-field observation point have been formulated. Three-dimensional lifting surface theory for blade rows, including source noncompactness and cascade effects, blade row transmission models incorporating mode and frequency scattering, and modal radiation calculations, including hybrid numerical-analytical approaches, are tools which await further application.
Focal Event, Contextualization, and Effective Communication in the Mathematics Classroom
ERIC Educational Resources Information Center
Nilsson, Per; Ryve, Andreas
2010-01-01
The aim of this article is to develop analytical tools for studying mathematical communication in collaborative activities. The theoretical construct of contextualization is elaborated methodologically in order to study diversity in individual thinking in relation to effective communication. The construct of contextualization highlights issues of…
Mining Mathematics in Textbook Lessons
ERIC Educational Resources Information Center
Ronda, Erlina; Adler, Jill
2017-01-01
In this paper, we propose an analytic tool for describing the mathematics made available to learn in a "textbook lesson". The tool is an adaptation of the Mathematics Discourse in Instruction (MDI) analytic tool that we developed to analyze what is made available to learn in teachers' lessons. Our motivation to adapt the use of the MDI…
Fire behavior modeling-a decision tool
Jack Cohen; Bill Bradshaw
1986-01-01
The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...
Guidance for the Design and Adoption of Analytic Tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bandlow, Alisa
2015-12-01
The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.
NASA Technical Reports Server (NTRS)
Liu, F. C.
1986-01-01
The objective of this investigation is to analytically determine the acceleration produced by crew motion in an orbiting space station and to define design parameters for the suspension system of microgravity experiments. A simple structural model for simulation of the IOC space station is proposed. Mathematical formulation of this model provides engineers with a simple and direct tool for designing an effective suspension system.
Single Cell Proteomics in Biomedicine: High-dimensional Data Acquisition, Visualization and Analysis
Su, Yapeng; Shi, Qihui; Wei, Wei
2017-01-01
New insights into cellular heterogeneity in the last decade have provoked the development of a variety of single-cell omics tools at a lightning pace. The resultant high-dimensional single-cell data generated by these tools require new theoretical approaches and analytical algorithms for effective visualization and interpretation. In this review, we briefly survey the state-of-the-art single-cell proteomic tools with a particular focus on data acquisition and quantification, followed by an elaboration of a number of statistical and computational approaches developed to date for dissecting the high-dimensional single-cell data. The underlying assumptions, unique features and limitations of the analytical methods, together with the designated biological questions they seek to answer, will be discussed. Particular attention is given to those information-theoretical approaches that are anchored in a set of first principles of physics and can yield detailed (and often surprising) predictions. PMID:28128880
Consequences of Base Time for Redundant Signals Experiments
Townsend, James T.; Honey, Christopher
2007-01-01
We report analytical and computational investigations into the effects of base time on the diagnosticity of two popular theoretical tools in the redundant signals literature: (1) the race model inequality and (2) the capacity coefficient. We show analytically, and without distributional assumptions, that the presence of base time decreases the sensitivity of both of these measures to model violations. We further use simulations to investigate the statistical power of model selection tools based on the race model inequality, both with and without base time. Base time decreases statistical power and biases the race model test toward conservatism. The magnitude of this biasing effect increases as we increase the proportion of total reaction time variance contributed by base time. We marshal empirical evidence to suggest that the proportion of reaction time variance contributed by base time is relatively small, and that the effects of base time on the diagnosticity of our model-selection tools are therefore likely to be minor. However, uncertainty remains concerning the magnitude and even the definition of base time. Experimentalists should continue to be alert to situations in which base time may contribute a large proportion of the total reaction time variance. PMID:18670591
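A hedged simulation of the effect described: a coactive redundant channel produces violations of Miller's race model inequality, F_red(t) ≤ F_A(t) + F_B(t), and inflating base-time variance blurs them. All distributions and parameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

def ecdf(samples, grid):
    return (samples[:, None] <= grid).mean(axis=0)

def max_violation(base_sd):
    """Largest positive value of F_red(t) - [F_A(t) + F_B(t)] on a grid."""
    base = lambda: rng.normal(300.0, base_sd, n)   # shared base time, ms
    rt_a   = rng.exponential(120.0, n) + base()
    rt_b   = rng.exponential(120.0, n) + base()
    rt_red = rng.exponential(40.0,  n) + base()    # coactivation: fast redundant RTs
    grid = np.linspace(280.0, 450.0, 100)
    v = ecdf(rt_red, grid) - (ecdf(rt_a, grid) + ecdf(rt_b, grid))
    return max(v.max(), 0.0)

for sd in (5.0, 40.0, 80.0):   # growing base-time variance washes out the violation
    print(f"base-time sd = {sd:>4.0f} ms -> max violation = {max_violation(sd):.3f}")
```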
SAM Radiochemical Methods Query
Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.
ERIC Educational Resources Information Center
Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.
2013-01-01
The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…
Ferranti, Jeffrey M; Langman, Matthew K; Tanaka, David; McCall, Jonathan; Ahmad, Asif
2010-01-01
Healthcare is increasingly dependent upon information technology (IT), but the accumulation of data has outpaced our capacity to use it to improve operating efficiency, clinical quality, and financial effectiveness. Moreover, hospitals have lagged in adopting thoughtful analytic approaches that would allow operational leaders and providers to capitalize upon existing data stores. In this manuscript, we propose a fundamental re-evaluation of strategic IT investments in healthcare, with the goal of increasing efficiency, reducing costs, and improving outcomes through the targeted application of health analytics. We also present three case studies that illustrate the use of health analytics to leverage pre-existing data resources to support improvements in patient safety and quality of care, to increase the accuracy of billing and collection, and support emerging health issues. We believe that such active investment in health analytics will prove essential to realizing the full promise of investments in electronic clinical systems. PMID:20190055
Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data
NASA Astrophysics Data System (ADS)
Jern, Mikael
Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels aimed at measuring and collaborating, through statistical indicators, on economic, social and environmental developments, and at engaging both statisticians and the public in such activities. Given the global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use GeoAnalytics collaborative tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet: for example, dynamic web-enabled animation that enables statisticians to explore temporal, spatial and multivariate demographics data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of expert domains and are precious in a creative analytics reasoning process. In this context, we introduce a demonstrator, “OECD eXplorer”, a customized tool for interactively analyzing, and collaborating on, gained insights and discoveries, based on a novel story mechanism that captures, re-uses and shares task-related explorative events.
DOT National Transportation Integrated Search
2015-07-01
Sustainability means providing for the necessities of today without endangering the necessities of tomorrow within the technical, environmental, economic, social/cultural, and individual contexts. However, the assessment tools available to study the ...
Cretini, K.F.; Steyer, G.D.
2011-01-01
The Coastwide Reference Monitoring System (CRMS) program was established to assess the effectiveness of individual coastal restoration projects and the cumulative effects of multiple projects at regional and coastwide scales. In order to make these assessments, analytical teams have been assembled for each of the primary data types sampled under the CRMS program, including vegetation, hydrology, landscape, and soils. These teams consist of scientists and support staff from the U.S. Geological Survey and other Federal agencies, the Louisiana Office of Coastal Protection and Restoration, and university academics. Each team is responsible for developing or identifying parameters, indices, or tools that can be used to assess coastal wetlands at various scales. The CRMS Vegetation Analytical Team has developed a Floristic Quality Index for coastal Louisiana to determine the quality of a wetland based on its plant species composition and abundance.
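The Louisiana index itself is specified in the report; for orientation, the classic floristic quality computation scales the mean coefficient of conservatism by the square root of species richness. A minimal sketch with invented species and C-values (the CRMS version also accounts for abundance):

```python
import math

# Hypothetical coefficients of conservatism (0-10) assigned by a botanist panel
c_values = {"Spartina alterniflora": 8, "Phragmites australis": 1, "Juncus roemerianus": 7}

def floristic_quality_index(species_present):
    """Classic FQI: mean C of the species present times sqrt(richness)."""
    cs = [c_values[s] for s in species_present]
    return (sum(cs) / len(cs)) * math.sqrt(len(cs))

print(round(floristic_quality_index(["Spartina alterniflora", "Juncus roemerianus"]), 2))
```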
Charmaz, Kathy
2015-12-01
This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.
Code of Federal Regulations, 2010 CFR
2010-07-01
Section 102-80.120: What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...), should be used to support the life safety equivalency evaluation.
Code of Federal Regulations, 2011 CFR
2011-01-01
Section 102-80.120: What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...), should be used to support the life safety equivalency evaluation.
EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.
A graph algebra for scalable visual analytics.
Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V
2012-01-01
Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
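A minimal sketch of what selection and aggregation operators over an attributed graph might look like; the API below is our guess, not the paper's formalism:

```python
import networkx as nx

def select(g: nx.Graph, pred) -> nx.Graph:
    """Selection: induced subgraph of nodes whose attributes satisfy a predicate."""
    return g.subgraph([n for n, a in g.nodes(data=True) if pred(a)]).copy()

def aggregate(g: nx.Graph, key) -> nx.Graph:
    """Aggregation: collapse nodes sharing key(attrs) into supernodes."""
    group = {n: key(a) for n, a in g.nodes(data=True)}
    agg = nx.Graph()
    agg.add_nodes_from(set(group.values()))
    agg.add_edges_from((group[u], group[v])
                       for u, v in g.edges() if group[u] != group[v])
    return agg

g = nx.Graph()
g.add_nodes_from([(1, {"kind": "a"}), (2, {"kind": "b"}), (3, {"kind": "a"})])
g.add_edges_from([(1, 2), (2, 3)])
print(select(g, lambda a: a["kind"] == "a").nodes())
print(aggregate(g, key=lambda a: a["kind"]).edges())
```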
Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham
2007-01-01
This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable, optimized pre-analytical phase. The aim of the comparison was (a) to identify those medical practices whose mean/median sample values deviate significantly from those of the control situation in the hospital laboratory, due to possible problems in the pre-analytical phase, and (b) to aid these practices in rectifying those problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above-mentioned problems. It has been tested on serum potassium, which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.
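A hedged sketch of the comparison the PAS tool automates in Excel, here in Python with invented potassium values; the deviation limit is illustrative:

```python
import pandas as pd

# Hypothetical serum potassium results (mmol/L) keyed by sending practice
data = pd.DataFrame({
    "sender": ["hospital"] * 4 + ["practice_A"] * 4 + ["practice_B"] * 4,
    "K":      [4.1, 4.0, 4.2, 4.1,  4.2, 4.0, 4.1, 4.3,  5.0, 5.2, 4.9, 5.1],
})

reference = data.loc[data["sender"] == "hospital", "K"].median()
deviation = (data.groupby("sender")["K"].median() - reference).abs()
# Flag senders whose median deviates beyond an illustrative limit; elevated
# potassium is a classic signature of delayed centrifugation.
print(deviation[deviation > 0.4])
```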
EPA’s Environmental Sampling and Analytical Methods (ESAM) is a website tool that supports the entire environmental characterization process from collection of samples all the way to their analyses.
Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Bonvini, Marco; Piette, Mary Ann
Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
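One hedged sketch of the hybrid idea: compare metered chiller power against a physics-based expectation from an engineering efficiency (COP) curve and flag large residuals as candidate faults. Curve coefficients and readings are invented:

```python
import numpy as np

# Illustrative part-load COP curve (quadratic fit from design data; invented numbers)
cop_curve = np.poly1d([-1.2e-6, 2.4e-3, 3.0])   # COP as a function of load (kW)

def predicted_power(load_kw):
    """Physics-based expectation: chiller electric power from the COP curve."""
    return load_kw / cop_curve(load_kw)

load     = np.array([400.0, 500.0, 600.0])      # cooling load, kW
measured = np.array([95.0, 118.0, 160.0])       # metered compressor power, kW

residual = measured - predicted_power(load)
fault = np.abs(residual) > 0.10 * predicted_power(load)   # 10% tolerance band
print(list(zip(load, residual.round(1), fault)))
```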
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hayden, D. W.
This project will develop an analytical tool to calculate the performance of HMX-based PBXs in the skid test. The skid test is used as a means to measure sensitivity for large charges in handling situations. Each series of skid tests requires dozens of drops of large billets. It is proposed that the reaction (or lack of one) of PBXs in the skid test is governed by the mechanical properties of the binder. If true, one might be able to develop an analytical tool to estimate skid test behavior for new PBX formulations. Others over the past 50 years have tried to develop similar models. This project will research and summarize the works of others and couple the work of three into an analytical tool that can be run on a PC to calculate the drop height of HMX-based PBXs. Detonation due to dropping a billet is argued to be a dynamic thermal event. To avoid detonation, the heat created due to friction at impact must be conducted into the charge or the target faster than the chemical kinetics can create additional energy. The methodology will involve numerically solving the Frank-Kamenetskii equation in one dimension. The analytical problem needs to be bounded in terms of how much heat is introduced to the billet and for how long. Assuming an inelastic collision with no rebound, the billet will be in contact with the target for a short duration determined by the equations of motion. For the purposes of the calculations, it will be assumed that if a detonation is to occur, it will transpire within that time. The surface temperature will be raised according to the friction created, using the equations of motion of dropping the billet on a rigid surface. The study will connect the works of Charles Anderson, Alan Randolph, Larry Hatler, Alfonse Popolato, and Charles Mader into a single PC-based analytic tool. Anderson's equations of motion will be used to calculate the temperature rise upon impact, the time this temperature is maintained (contact time) will be obtained from the work of Hatler et al., and the reactive temperature rise will be obtained from Mader's work. Finally, the assessment of when a detonation occurs will be derived from Bowden and Yoffe's thermal explosion theory (hot spot).
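A heavily hedged numerical sketch of the core calculation: one-dimensional transient conduction with an Arrhenius heat source (a Frank-Kamenetskii-type problem), driven by a friction-heated surface for the duration of contact. Every property value below is a placeholder, not HMX or PBX data:

```python
import numpy as np

n, L, t_end = 101, 1.0e-3, 2.0e-3       # nodes, 1 mm slab, 2 ms contact time
alpha = 2.0e-7                           # thermal diffusivity, m^2/s
A, E, R = 5.0e19, 2.2e5, 8.314           # Arrhenius prefactor (1/s), E (J/mol), R
q = 1.1e3                                # heat of reaction / heat capacity, K
dx = L / (n - 1)
dt = 0.2 * dx * dx / alpha               # stable explicit time step
T = np.full(n, 300.0)
T[0] = 700.0                             # friction-heated contact surface, K

ignited = False
for _ in range(int(t_end / dt)):
    lap = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt * (alpha * lap + q * A * np.exp(-E / (R * T[1:-1])))
    T[-1] = T[-2]                        # insulated far boundary
    if T[1:].max() > 1500.0:             # crude thermal-runaway criterion
        ignited = True
        break
print("ignition" if ignited else f"no ignition; peak interior T = {T[1:].max():.0f} K")
```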
Development and Calibration of Highway Safety Manual Equations for Florida Conditions
DOT National Transportation Integrated Search
2011-08-31
The Highway Safety Manual (HSM) provides statistically-valid analytical tools and techniques for quantifying the potential effects on crashes as a result of decisions made in planning, design, operations, and maintenance. Implementation of the new te...
In-flight Evaluation of Aerodynamic Predictions of an Air-launched Space Booster
NASA Technical Reports Server (NTRS)
Curry, Robert E.; Mendenhall, Michael R.; Moulton, Bryan
1992-01-01
Several analytical aerodynamic design tools that were applied to the Pegasus (registered trademark) air-launched space booster were evaluated using flight measurements. The study was limited to existing codes and was conducted with limited computational resources. The flight instrumentation was constrained to have minimal impact on the primary Pegasus missions. Where appropriate, the flight measurements were compared with computational data. Aerodynamic performance and trim data from the first two flights were correlated with predictions. Local measurements in the wing and wing-body interference region were correlated with analytical data. This complex flow region includes the effect of aerothermal heating magnification caused by the presence of a corner vortex and interaction of the wing leading edge shock and fuselage boundary layer. The operation of the first two missions indicates that the aerodynamic design approach for Pegasus was adequate, and data show that acceptable margins were available. Additionally, the correlations provide insight into the capabilities of these analytical tools for more complex vehicles in which the design margins may be more stringent.
Total Quality Management (TQM), an Overview
1991-09-01
This report provides an overview of Total Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring quality.
Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth
NASA Technical Reports Server (NTRS)
Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)
2001-01-01
The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Elastic-plastic fracture analysis is a regime that is currently empirical for the Space Shuttle External Tank (ET) and is handled by simulated service testing of pre-cracked panels; analytical tools are needed specifically for this regime.
ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.
Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa
2016-05-01
The automated analysis of indirect immunofluorescence images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of the intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. Automatization of the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the work-flow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, that categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, that splits the input slide into individual HEp-2 cells; (iii) a Pattern Classifier module, that determines the fluorescent pattern of the slide based on the patterns of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracies for fluorescent intensity and pattern classification of around 85% and above 90%, respectively. We assessed all results by comparisons with some of the most representative state-of-the-art works. Unlike most of the other works in the recent literature, ANAlyte aims at the automatization of all the major steps of ANA image analysis. Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of intensity and fluorescent pattern with accuracy better than or comparable with state-of-the-art techniques, even when such techniques are run on manually segmented cells. Hence, ANAlyte can be proposed as a valid solution to the problem of ANA testing automatization. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
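The slide-level pattern call from per-cell labels (module iii) can be as simple as a majority vote; a hedged sketch, since the paper's exact aggregation rule may differ:

```python
from collections import Counter

def slide_pattern(cell_patterns):
    """Slide-level fluorescent pattern as the majority vote over per-cell labels,
    with the vote share as a crude confidence measure."""
    winner, count = Counter(cell_patterns).most_common(1)[0]
    return winner, count / len(cell_patterns)

print(slide_pattern(["speckled", "speckled", "homogeneous", "speckled"]))
```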
Process Damping and Cutting Tool Geometry in Machining
NASA Astrophysics Data System (ADS)
Taylor, C. M.; Sims, N. D.; Turner, S.
2011-12-01
Regenerative vibration, or chatter, limits the performance of machining processes. Consequences of chatter include tool wear and poor machined surface finish. Process damping by tool-workpiece contact can reduce chatter effects and improve productivity. Process damping occurs when the flank (also known as the relief face) of the cutting tool makes contact with waves on the workpiece surface created by chatter motion. Tool edge features can act to increase the damping effect. This paper examines how a tool's edge condition combines with the relief angle to affect process damping. An analytical model of cutting with chatter leads to a two-section curve describing how process-damped vibration amplitude changes with surface speed for radiussed tools. The tool edge dominates the process damping effect at the lowest surface speeds, with the flank dominating at higher speeds. A similar curve is then proposed for tools with worn edges. Experimental data support the notion of the two-section curve. A rule of thumb is proposed which could be useful to machine operators regarding tool wear and process damping. Finally, the question is addressed of whether a tool of a given geometry, used for a given application, should be considered sharp, radiussed or worn with respect to process damping.
Explorative visual analytics on interval-based genomic data and their metadata.
Jalili, Vahid; Matteucci, Matteo; Masseroli, Marco; Ceri, Stefano
2017-12-04
With the spread of public repositories of NGS processed data, the availability of user-friendly and effective tools for data exploration, analysis and visualization is becoming very relevant. These tools enable interactive analytics, an exploratory approach for the seamless "sense-making" of data through on-the-fly integration of analysis and visualization phases, suggested not only for evaluating processing results, but also for designing and adapting NGS data analysis pipelines. This paper presents abstractions for supporting the early analysis of NGS processed data and their implementation in an associated tool, named GenoMetric Space Explorer (GeMSE). This tool serves the needs of the GenoMetric Query Language, an innovative cloud-based system for computing complex queries over heterogeneous processed data. It can also be used starting from any text files in standard BED, BroadPeak, NarrowPeak, GTF, or general tab-delimited format, containing numerical features of genomic regions; metadata can be provided as text files in tab-delimited attribute-value format. GeMSE allows interactive analytics, consisting of on-the-fly cycling among steps of data exploration, analysis and visualization that help biologists and bioinformaticians in making sense of heterogeneous genomic datasets. By means of an explorative interaction support, users can trace past activities and quickly recover their results, seamlessly going backward and forward in the analysis steps and comparative visualizations of heatmaps. GeMSE's effective application and practical usefulness are demonstrated through significant use cases of biological interest. GeMSE is available at http://www.bioinformatics.deib.polimi.it/GeMSE/ , and its source code is available at https://github.com/Genometric/GeMSE under the GPLv3 open-source license.
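A minimal sketch of the select-then-summarize exploration step GeMSE supports over BED-style interval data; the table is inlined and invented so the example is self-contained:

```python
import pandas as pd

# Minimal BED-style interval table (chrom, start, end, score)
regions = pd.DataFrame(
    [("chr1", 100, 700, 810), ("chr1", 900, 1400, 320), ("chr2", 50, 300, 650)],
    columns=["chrom", "start", "end", "score"],
)

# Select high-scoring chr1 regions, then summarize a numeric feature
selected = regions.query("chrom == 'chr1' and score > 500")
print((selected["end"] - selected["start"]).describe())  # region-length statistics
```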
e-Healthcare in India: critical success factors for sustainable health systems.
Taneja, Udita; Sushil
2007-01-01
As healthcare enterprises seek to move towards an integrated, sustainable healthcare delivery model, an IT-enabled or e-Healthcare strategy is increasingly being adopted. In this study we identified the critical success factors influencing the effectiveness of an e-Healthcare strategy in India. The performance assessment criteria used to measure effectiveness were increasing the reach and reducing the cost of healthcare delivery. A survey of healthcare providers was conducted. Analytic Hierarchy Process (AHP) and Interpretive Structural Modeling (ISM) were the analytical tools used to determine the relative importance of the critical success factors in influencing the effectiveness of e-Healthcare and their interplay with each other. To succeed in e-Healthcare initiatives, the critical success factors that need to be in place are appropriate government policies, literacy levels, and telecommunications and power infrastructure in the country. The focus should not be on IT tools and biomedical engineering technologies, as is most often the case. Instead, non-technology factors such as healthcare provider and consumer mindsets should be addressed to increase acceptance of, and enhance the effectiveness of, sustainable e-Healthcare services.
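For orientation, AHP derives factor weights as the principal eigenvector of a pairwise comparison matrix. A minimal sketch with an invented 3x3 matrix (the study's factors and judgments are not reproduced here):

```python
import numpy as np

# Illustrative Saaty-scale pairwise comparisons for three hypothetical factors:
# government policy, literacy levels, telecom/power infrastructure
A = np.array([[1.0, 3.0, 2.0],
              [1/3, 1.0, 0.5],
              [0.5, 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                          # AHP priority vector
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)    # consistency index
print("priorities:", weights.round(3), " CI:", round(ci, 3))
```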
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillen, David S.
Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation, to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies where known patterns can be mined for. With a human in the loop, these tools can also bring in domain knowledge and subject matter expertise. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose to use the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.
Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F
2016-01-01
Background: The complexity of current injury-related health issues demands the usage of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity for shaping the next generation of injury surveillance. Objective: To introduce data visualisation conceptual bases, and propose a visual analytic and visualisation platform in public health surveillance for injury prevention and control. Methods: The paper introduces data visualisation conceptual bases, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. Results: Application of the visual analytic and visualisation platform is presented as a solution for improved access to heterogeneous data sources, enhanced data exploration and analysis, effective data communication, and decision-making support. Conclusions: Applications of data visualisation concepts and a visual analytic platform could play a key role in shaping the next generation of injury surveillance. A visual analytic and visualisation platform could improve data use, analytic capacity, and the ability to effectively communicate findings and key messages. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. PMID:26728006
Davis, Thomas D
2017-01-01
Practice evaluation strategies range in style from the formal-analytic tools of single-subject designs, rapid assessment instruments, algorithmic steps in evidence-informed practice, and computer software applications, to the informal-interactive tools of clinical supervision, consultation with colleagues, use of client feedback, and clinical experience. The purpose of this article is to provide practice researchers in social work with an evidence-informed theory that is capable of explaining both how and why social workers use practice evaluation strategies to self-monitor the effectiveness of their interventions in terms of client change. The author delineates the theoretical contours and consequences of what is called dual-process theory. Drawing on evidence-informed advances in the cognitive and social neurosciences, the author identifies among everyday social workers a theoretically stable, informal-interactive tool preference that is a cognitively necessary, sufficient, and stand-alone preference that requires neither the supplementation nor balance of formal-analytic tools. The author's delineation of dual-process theory represents a theoretical contribution in the century-old attempt to understand how and why social workers evaluate their practice the way they do.
Pilot testing of SHRP 2 reliability data and analytical products: Minnesota.
DOT National Transportation Integrated Search
2015-01-01
The Minnesota pilot site has undertaken an effort to test data and analytical tools developed through the Strategic Highway Research Program (SHRP) 2 Reliability focus area. The purpose of these tools is to facilitate the improvement of travel time r...
openECA Platform and Analytics Alpha Test Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Russell
The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.
openECA Platform and Analytics Beta Demonstration Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Russell
The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.
Effect-directed analysis supporting monitoring of aquatic environments--An in-depth overview.
Brack, Werner; Ait-Aissa, Selim; Burgess, Robert M; Busch, Wibke; Creusot, Nicolas; Di Paolo, Carolina; Escher, Beate I; Mark Hewitt, L; Hilscherova, Klara; Hollender, Juliane; Hollert, Henner; Jonker, Willem; Kool, Jeroen; Lamoree, Marja; Muschket, Matthias; Neumann, Steffen; Rostkowski, Pawel; Ruttkies, Christoph; Schollee, Jennifer; Schymanski, Emma L; Schulze, Tobias; Seiler, Thomas-Benjamin; Tindall, Andrew J; De Aragão Umbuzeiro, Gisela; Vrana, Branislav; Krauss, Martin
2016-02-15
Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone but tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and faces increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxicant identification. The latest approaches, tools, software and databases for target-, suspect and non-target screening as well as unknown identification are discussed together with analytical and toxicological confirmation approaches. A better understanding of optimal use and combination of EDA tools will help to design efficient and successful toxicant identification studies in the context of quality monitoring in multiply stressed environments. Copyright © 2015 Elsevier B.V. All rights reserved.
Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data
NASA Astrophysics Data System (ADS)
Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.
2014-12-01
Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). In order for ASDC data scientists to effectively implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with the data contained in the datasets with which they are working. Key technologies that are critical components of the background of ASDC data scientists include: large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; and processing algorithms. However, as the prices of data storage and processing decrease, sources of data increase, and technologies advance, granting more people access to data at real or near-real time, data scientists are being pressured to accelerate their ability to identify and analyze vast amounts of data. With existing tools this is becoming exceedingly challenging to accomplish. For example, NASA Earth Science Data and Information System (ESDIS) holdings alone grew from just over 4 PB of data in 2009 to nearly 6 PB in 2011, and then to roughly 10 PB in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater to business data, which is predominantly unstructured. There are very few known analytics tools that interface well with archived Earth science data, which is predominantly heterogeneous and structured. This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify specific tools that may be able to address those challenges.
Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation
Perspectives on making big data analytics work for oncology.
El Naqa, Issam
2016-12-01
Oncology, with its unique combination of clinical, physical, technological, and biological data, provides an ideal case study for applying big data analytics to improve cancer treatment safety and outcomes. An oncology treatment course such as chemoradiotherapy can generate a large pool of information carrying the 5V hallmarks of big data. These data comprise a heterogeneous mixture of patient demographics, radiation/chemo dosimetry, multimodality imaging features, and biological markers generated over a treatment period that can span a few days to several weeks. Efforts using commercial and in-house tools are underway to facilitate data aggregation, ontology creation, sharing, visualization and varying analytics in a secure environment. However, open questions related to proper data structure representation and effective analytics tools to support oncology decision-making need to be addressed. It is recognized that oncology data constitute a mix of structured (tabulated) and unstructured (electronic documents) data that need to be processed to facilitate searching and subsequent knowledge discovery from relational or NoSQL databases. In this context, methods based on advanced analytics and image feature extraction for oncology applications will be discussed. On the other hand, the classical p (variables) ≫ n (samples) inference problem of statistical learning is challenged in the big data realm, and this is particularly true for oncology applications where p-omics is witnessing exponential growth while the number of cancer incidences has generally plateaued over the past 5 years, leading to a quasi-linear growth in samples per patient. Within the big data paradigm, this kind of phenomenon may yield undesirable effects such as echo chamber anomalies, the Yule-Simpson reversal paradox, or misleading ghost analytics. In this work, we present these effects as they pertain to oncology and engage small-thinking methodologies to counter them, ranging from incorporating prior knowledge and using information-theoretic techniques to modern ensemble machine learning approaches, or combinations of these. We particularly discuss the pros and cons of different approaches to improve the mining of big data in oncology. Copyright © 2016 Elsevier Inc. All rights reserved.
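To make the Yule-Simpson reversal mentioned above concrete, here is a toy, invented example in which a treatment looks worse in the pooled data yet better within every stratum:

```python
import pandas as pd

df = pd.DataFrame({
    "stage":   ["early", "early", "late", "late"],
    "arm":     ["new", "standard", "new", "standard"],
    "treated": [100, 700, 700, 100],
    "success": [95, 630, 420, 55],
})
df["rate"] = df["success"] / df["treated"]
print(df.pivot(index="stage", columns="arm", values="rate"))  # new wins per stratum
pooled = df.groupby("arm")[["success", "treated"]].sum()
print(pooled["success"] / pooled["treated"])                  # standard wins pooled
```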
Experiments with Analytic Centers: A confluence of data, tools and help in using them.
NASA Astrophysics Data System (ADS)
Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.
2017-12-01
Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged that indicates different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of that particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.
Waaijer, Cathelijn J F; Palmblad, Magnus
2015-01-01
In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.
High precision analytical description of the allowed β spectrum shape
NASA Astrophysics Data System (ADS)
Hayen, Leendert; Severijns, Nathal; Bodek, Kazimierz; Rozpedzik, Dagmara; Mougeot, Xavier
2018-01-01
A fully analytical description of the allowed β spectrum shape is given in view of ongoing and planned measurements. Its study forms an invaluable tool in the search for physics beyond the standard electroweak model and the weak magnetism recoil term. Contributions stemming from finite size corrections, mass effects, and radiative corrections are reviewed. Particular focus is placed on atomic and chemical effects, where the existing description is extended and analytically provided. The effects of QCD-induced recoil terms are discussed, and cross-checks were performed for different theoretical formalisms. Special attention was given to a comparison of the treatment of nuclear structure effects in different formalisms. Corrections were derived for both Fermi and Gamow-Teller transitions, and methods of analytical evaluation thoroughly discussed. In its integrated form, calculated f values were in agreement with the most precise numerical results within the aimed for precision. The need for an accurate evaluation of weak magnetism contributions was stressed, and the possible significance of the oft-neglected induced pseudoscalar interaction was noted. Together with improved atomic corrections, an analytical description was presented of the allowed β spectrum shape accurate to a few parts in 10-4 down to 1 keV for low to medium Z nuclei, thereby extending the work by previous authors by nearly an order of magnitude.
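For orientation, the leading-order allowed spectrum that the reviewed corrections multiply has the standard textbook form (our notation, not necessarily the paper's):

```latex
% W = total electron energy, W_0 = endpoint, p = electron momentum
% (natural units, m_e = c = 1); F(Z, W) is the Fermi function and C(Z, W)
% collects the shape-sensitive corrections (finite size, weak magnetism, ...).
\begin{equation}
  N(W)\,\mathrm{d}W \;\propto\; p\, W\, (W_0 - W)^2\, F(Z, W)\, C(Z, W)\,\mathrm{d}W
\end{equation}
```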
Size, weight and position: ion mobility spectrometry and imaging MS combined.
Kiss, András; Heeren, Ron M A
2011-03-01
Size, weight and position are three of the most important parameters that describe a molecule in a biological system. Ion mobility spectrometry is capable of separating molecules on the basis of their size or shape, whereas imaging mass spectrometry is an effective tool to measure the molecular weight and spatial distribution of molecules. Recent developments in both fields enabled the combination of the two technologies. As a result, ion-mobility-based imaging mass spectrometry is gaining more and more popularity as a (bio-)analytical tool enabling the determination of the size, weight and position of several molecules simultaneously on biological surfaces. This paper reviews the evolution of ion-mobility-based imaging mass spectrometry and provides examples of its application in analytical studies of biological surfaces.
Stochastic modelling of the hydrologic operation of rainwater harvesting systems
NASA Astrophysics Data System (ADS)
Guo, Rui; Guo, Yiping
2018-07-01
Rainwater harvesting (RWH) systems are an effective low impact development practice that provides both water supply and runoff reduction benefits. A stochastic modelling approach is proposed in this paper to quantify the water supply reliability and stormwater capture efficiency of RWH systems. The input rainfall series is represented as a marked Poisson process and two typical water use patterns are analytically described. The stochastic mass balance equation is solved analytically, and based on this, explicit expressions relating system performance to system characteristics are derived. The performances of a wide variety of RWH systems located in five representative climatic regions of the United States are examined using the newly derived analytical equations. Close agreements between analytical and continuous simulation results are shown for all the compared cases. In addition, an analytical equation is obtained expressing the required storage size as a function of the desired water supply reliability, average water use rate, as well as rainfall and catchment characteristics. The equations developed herein constitute a convenient and effective tool for sizing RWH systems and evaluating their performances.
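A minimal sketch of the continuous-simulation counterpart against which such analytical results are typically checked: rainfall as a marked Poisson process (random storm arrivals with random depth marks) driving a daily tank mass balance under a constant water-use pattern. All parameter values are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)
days = 3650
storage, cap = 0.0, 5.0            # tank state and capacity, m^3 (assumed)
demand = 0.12                      # constant daily water use, m^3 (assumed)
area, runoff_coeff = 100.0, 0.9    # roof catchment, m^2 (assumed)
lam, mean_depth = 0.3, 8.0         # storm rate (1/day), mean depth (mm)

supplied = need = 0.0
for _ in range(days):
    n_storms = rng.poisson(lam)                              # Poisson arrivals
    depth_mm = rng.exponential(mean_depth, n_storms).sum()   # random marks
    inflow = runoff_coeff * area * depth_mm / 1000.0         # m^3 of runoff
    storage = min(cap, storage + inflow)   # excess spills past capacity
    draw = min(storage, demand)            # mass balance: supply what is stored
    storage -= draw
    supplied += draw; need += demand

print(f"water supply reliability ~ {supplied / need:.2f}")
```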
Laboratories measuring target pathogen analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select pathogens.
The role of analytical chemistry in Niger Delta petroleum exploration: a review.
Akinlua, Akinsehinwa
2012-06-12
Petroleum and the organic matter from which it is derived are composed of organic compounds with some trace elements. These compounds give insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Analytical techniques are the main tools for acquiring these geochemical data. Owing to progress in the development of new analytical techniques, many hitherto unresolved petroleum exploration problems have been addressed. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped in understanding the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding the Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.
Control/structure interaction conceptual design tool
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
1990-01-01
The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three-dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object-oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.
Active controls: A look at analytical methods and associated tools
NASA Technical Reports Server (NTRS)
Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.
1984-01-01
A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.
Laboratories measuring target biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select biotoxins.
Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery
Using Learning Analytics to Support Engagement in Collaborative Writing
ERIC Educational Resources Information Center
Liu, Ming; Pardo, Abelardo; Liu, Li
2017-01-01
Online collaborative writing tools provide an efficient way to complete a writing task. However, existing tools only focus on technological affordances and ignore the importance of social affordances in a collaborative learning environment. This article describes a learning analytic system that analyzes writing behaviors, and creates…
Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis
ERIC Educational Resources Information Center
Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay
2018-01-01
Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…
Su, Yapeng; Shi, Qihui; Wei, Wei
2017-02-01
New insights into cellular heterogeneity over the last decade have provoked the development of a variety of single cell omics tools at a lightning pace. The resultant high-dimensional single cell data generated by these tools require new theoretical approaches and analytical algorithms for effective visualization and interpretation. In this review, we briefly survey the state-of-the-art single cell proteomic tools with a particular focus on data acquisition and quantification, followed by an elaboration of a number of statistical and computational approaches developed to date for dissecting the high-dimensional single cell data. The underlying assumptions, unique features, and limitations of the analytical methods, together with the designated biological questions they seek to answer, will be discussed. Particular attention will be given to those information-theoretic approaches that are anchored in a set of first principles of physics and can yield detailed (and often surprising) predictions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
ERIC Educational Resources Information Center
Kelly, Nick; Thompson, Kate; Yeoman, Pippa
2015-01-01
This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ragan, Eric D; Goodall, John R
2014-01-01
Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
[Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].
Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Leiz; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou
2014-08-01
In the present paper, the apparatus and theory of surface analysis are introduced, and progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools, such as SEM/EDS (scanning electron microscopy/energy-dispersive spectroscopy), EPMA (electron probe microanalysis), AES (Auger electron spectroscopy), etc., its advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis and high sensitivity. It is now a powerful complementary method to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analytical workers will use this powerful tool, and LA-ICP-MS will become a superstar in the elemental analysis field, just like LIBS (laser-induced breakdown spectroscopy).
Eric J. Gustafson; L. Jay Roberts; Larry A. Leefers
2006-01-01
Forest management planners require analytical tools to assess the effects of alternative strategies on the sometimes disparate benefits from forests such as timber production and wildlife habitat. We assessed the spatial patterns of alternative management strategies by linking two models that were developed for different purposes. We used a linear programming model (...
Effect of Virtual Analytical Chemistry Laboratory on Enhancing Student Research Skills and Practices
ERIC Educational Resources Information Center
Bortnik, Boris; Stozhko, Natalia; Pervukhina, Irina; Tchernysheva, Albina; Belysheva, Galina
2017-01-01
This article aims to determine the effect of a virtual chemistry laboratory on university student achievement. The article describes a model of a laboratory course that includes a virtual component. This virtual component is viewed as a tool of student pre-lab autonomous learning. It presents electronic resources designed for a virtual laboratory…
Environmental corrections of a dual-induction logging while drilling tool in vertical wells
NASA Astrophysics Data System (ADS)
Kang, Zhengming; Ke, Shizhen; Jiang, Ming; Yin, Chengfang; Li, Anzong; Li, Junjian
2018-04-01
With the development of Logging While Drilling (LWD) technology, dual-induction LWD logging is not only widely applied in deviated and horizontal wells but is also commonly used in vertical wells. Accordingly, it is necessary to simulate the response of LWD tools in vertical wells for logging interpretation. In this paper, the investigation characteristics, the effects of the tool structure, the skin effect and the drilling environment of a dual-induction LWD tool are simulated by the three-dimensional (3D) finite element method (FEM). In order to closely simulate the actual situation, the real structure of the tool is taken into account. The results demonstrate that the influence of the background value of the tool structure can be eliminated: after deducting the tool-structure background, the computed values agree quantitatively with the analytical solution in homogeneous formations. The effect of measurement frequency can be effectively eliminated by a skin-effect correction chart. In addition, the measurement environment (borehole size, mud resistivity, shoulder bed, layer thickness and invasion) has an effect on the true resistivity. To eliminate these effects, borehole correction charts, shoulder bed correction charts and tornado charts are computed based on the real tool structure. Based on these correction charts, well logging data can be corrected automatically by a suitable interpolation method, which is convenient and fast. Verified against actual logging data in vertical wells, this method can obtain the true resistivity of the formation.
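A minimal sketch of the final correction step described here: given a precomputed correction chart, a logging reading is corrected automatically by interpolation. The chart values below are invented for illustration; real charts come from the FEM modelling described in the abstract.

```python
import numpy as np

# hypothetical borehole-correction chart: correction factor vs. borehole size
diam_in = np.array([6.0, 8.0, 10.0, 12.0, 14.0])    # borehole diameter, inches
factor  = np.array([1.00, 1.03, 1.08, 1.15, 1.24])  # multiplicative correction

def correct(apparent_res, borehole_diam):
    """Interpolate the chart and apply the correction to one reading."""
    f = np.interp(borehole_diam, diam_in, factor)
    return apparent_res * f

print(correct(20.0, 9.0))   # corrected resistivity for a 9-inch borehole, ohm-m
```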
van Rhee, Henk; Hak, Tony
2017-01-01
We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
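For readers unfamiliar with the estimators named here, a compact sketch of DerSimonian-Laird random-effects pooling with the Knapp-Hartung adjusted confidence interval, on toy effect sizes. Meta-Essentials itself is a spreadsheet workbook; this Python sketch only mirrors the calculation.

```python
import numpy as np
from scipy import stats

def dl_hksj(y, v):
    """Random-effects meta-analysis: DerSimonian-Laird tau^2 with a
    Knapp-Hartung adjusted 95% confidence interval."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    k = len(y)
    w = 1.0 / v                                  # fixed-effect weights
    yw = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - yw) ** 2)                # Cochran's Q statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / c)           # DL between-study variance
    w_re = 1.0 / (v + tau2)
    mu = np.sum(w_re * y) / np.sum(w_re)         # pooled random-effects estimate
    var_kh = np.sum(w_re * (y - mu) ** 2) / ((k - 1) * np.sum(w_re))
    half = stats.t.ppf(0.975, k - 1) * np.sqrt(var_kh)
    return mu, (mu - half, mu + half)

# toy study effect sizes and their sampling variances
print(dl_hksj([0.3, 0.5, 0.1, 0.4], [0.02, 0.03, 0.05, 0.02]))
```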
Trends in Process Analytical Technology: Present State in Bioprocessing.
Jenzsch, Marco; Bell, Christian; Buziol, Stefan; Kepert, Felix; Wegele, Harald; Hakemeyer, Christian
2017-08-04
Process analytical technology (PAT), the regulatory initiative for incorporating quality into pharmaceutical manufacturing, is an area of intense research and interest. If PAT is effectively applied to bioprocesses, it can increase process understanding and control, and mitigate the risk from substandard drug products to both manufacturer and patient. To optimize the benefits of PAT, the entire PAT framework must be considered and each element of PAT must be carefully selected, including sensor and analytical technology, data analysis techniques, control strategies and algorithms, and process optimization routines. This chapter discusses the current state of PAT in the biopharmaceutical industry, including several case studies demonstrating the degree of maturity of various PAT tools. Graphical Abstract: Hierarchy of QbD components.
Mechanical and Electronic Approaches to Improve the Sensitivity of Microcantilever Sensors
Mutyala, Madhu Santosh Ku; Bandhanadham, Deepika; Pan, Liu; Pendyala, Vijaya Rohini; Ji, Hai-Feng
2010-01-01
Advances in the field of Micro Electro Mechanical Systems (MEMS) and their uses now offer unique opportunities in the design of ultrasensitive analytical tools. The analytical community continues to search for cost-effective, reliable, and even portable analytical techniques that can give reliable and fast response results for a variety of chemicals and biomolecules. Microcantilevers (MCLs) have emerged as a unique platform for label-free biosensor or bioassay. Several electronic designs, including piezoresistive, piezoelectric, and capacitive approaches, have been applied to measure the bending or frequency change of the MCLs upon exposure to chemicals. This review summarizes mechanical, fabrication, and electronics approaches to increase the sensitivity of microcantilever (MCL) sensors. PMID:20975987
Atkinson, Jo-An; Page, Andrew; Wells, Robert; Milat, Andrew; Wilson, Andrew
2015-03-03
In the design of public health policy, a broader understanding of risk factors for disease across the life course, and an increasing awareness of the social determinants of health, has led to the development of more comprehensive, cross-sectoral strategies to tackle complex problems. However, comprehensive strategies may not represent the most efficient or effective approach to reducing disease burden at the population level. Rather, they may act to spread finite resources less intensively over a greater number of programs and initiatives, diluting the potential impact of the investment. While analytic tools are available that use research evidence to help identify and prioritise disease risk factors for public health action, they are inadequate to support more targeted and effective policy responses for complex public health problems. This paper discusses the limitations of analytic tools that are commonly used to support evidence-informed policy decisions for complex problems. It proposes an alternative policy analysis tool which can integrate diverse evidence sources and provide a platform for virtual testing of policy alternatives in order to design solutions that are efficient, effective, and equitable. The case of suicide prevention in Australia is presented to demonstrate the limitations of current tools to adequately inform prevention policy and discusses the utility of the new policy analysis tool. In contrast to popular belief, a systems approach takes a step beyond comprehensive thinking and seeks to identify where best to target public health action and resources for optimal impact. It is concerned primarily with what can be reasonably left out of strategies for prevention and can be used to explore where disinvestment may occur without adversely affecting population health (or equity). Simulation modelling used for policy analysis offers promise in being able to better operationalise research evidence to support decision making for complex problems, improve targeting of public health policy, and offers a foundation for strengthening relationships between policy makers, stakeholders, and researchers.
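A deliberately toy sketch of the kind of simulation-based policy testing the paper advocates: a two-stock model in which an intervention coverage parameter can be varied to compare policy alternatives virtually. The structure, rates and numbers are all invented for illustration and carry no epidemiological meaning.

```python
def simulate(coverage, years=10, dt=0.1):
    """Toy stock-and-flow model: people move from 'at risk' to 'in crisis';
    an intervention with the given coverage reduces the onset flow."""
    at_risk, in_crisis, crises = 10000.0, 0.0, 0.0
    onset, recovery = 0.05, 0.5          # illustrative annual rates (assumed)
    for _ in range(int(years / dt)):
        new = onset * (1 - 0.6 * coverage) * at_risk * dt   # intervention effect
        rec = recovery * in_crisis * dt
        at_risk += rec - new
        in_crisis += new - rec
        crises += new
    return crises

for cov in (0.0, 0.3, 0.9):   # virtually test alternative coverage policies
    print(f"coverage {cov:.0%}: cumulative crises ~ {simulate(cov):,.0f}")
```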
NASA Astrophysics Data System (ADS)
Bradie, Johanna; Gianoli, Claudio; He, Jianjun; Lo Curto, Alberto; Stehouwer, Peter; Veldhuis, Marcel; Welschmeyer, Nick; Younan, Lawrence; Zaake, André; Bailey, Sarah
2018-03-01
Non-indigenous species seriously threaten native biodiversity. To reduce establishments, the International Maritime Organization established the Convention for the Control and Management of Ships' Ballast Water and Sediments which limits organism concentrations at discharge under regulation D-2. Most ships will comply by using on-board treatment systems to disinfect their ballast water. Port state control officers will need simple, rapid methods to detect compliance. Appropriate monitoring methods may be dependent on treatment type, since different treatments will affect organisms by a variety of mechanisms. Many indicative tools have been developed, but must be examined to ensure the measured variable is an appropriate signal for the response of the organisms to the applied treatment. We assessed the abilities of multiple analytic tools to rapidly detect the effects of a ballast water treatment system based on UV disinfection. All devices detected a large decrease in the concentrations of vital organisms ≥ 50 μm and organisms < 10 μm (mean 82.7-99.7% decrease across devices), but results were more variable for the ≥ 10 to < 50 μm size class (mean 9.0-99.9% decrease across devices). Results confirm the necessity to choose tools capable of detecting the damage inflicted on living organisms, as examined herein for UV-C treatment systems.
Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F
2016-04-01
The complexity of current injury-related health issues demands the use of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity to shape the next generation of injury surveillance. The objective is to introduce the conceptual bases of data visualisation and to propose a visual analytic and visualisation platform for public health surveillance in injury prevention and control. The paper introduces the conceptual bases of data visualisation, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. The visual analytic and visualisation platform is presented as a solution for improving access to heterogeneous data sources, enhancing data exploration and analysis, communicating data effectively, and supporting decision-making. Applications of data visualisation concepts and visual analytic platforms could play a key role in shaping the next generation of injury surveillance. A visual analytic and visualisation platform could improve data use, analytic capacity, and the ability to effectively communicate findings and key messages. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Total analysis systems with Thermochromic Etching Discs technology.
Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel
2014-12-16
A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED technology comprises a number of attractive features, such as track independence, selective irradiation, a high-power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities for designing new compact disc-based total analysis systems applicable in chemistry and the life sciences. In this paper, the analytical implementation of TED is described and discussed, and its analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is demonstrated by describing how to exploit this tool to develop truly integrated analytical systems that provide solutions within the point-of-care framework.
UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.
Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel
2013-09-01
In this short communication, UV/Vis spectroscopy was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical system that could serve as a Process Analytical Tool (PAT) in biorefineries using steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
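A minimal sketch of the underlying calibration logic, assuming Beer-Lambert linearity between absorbance and lignin concentration at one of the quoted wavelengths; the standards and readings below are invented for illustration.

```python
import numpy as np

# hypothetical calibration standards measured at 280 nm
conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80])    # lignin, g/L
absorb = np.array([0.11, 0.22, 0.45, 0.88, 1.74])  # measured absorbance

slope, intercept = np.polyfit(conc, absorb, 1)     # Beer-Lambert: A = k*c + b
r = np.corrcoef(conc, absorb)[0, 1]
print(f"A = {slope:.2f}*c + {intercept:.3f}, r^2 = {r**2:.3f}")

unknown_abs = 0.60                                 # reading for an unknown sample
print(f"estimated lignin ~ {(unknown_abs - intercept) / slope:.2f} g/L")
```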
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine the small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points of the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time-consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the steps of the method procedure with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
Xu, Wei
2007-12-01
This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.
Deriving Earth Science Data Analytics Requirements
NASA Technical Reports Server (NTRS)
Kempler, Steven J.
2015-01-01
Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sources has become more prevalent, ushered in by the plethora of Earth science data generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of requirements for data analytics tools/techniques that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Visualization techniques for computer network defense
NASA Astrophysics Data System (ADS)
Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew
2011-06-01
Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.
Coorssen, Jens R; Yergey, Alfred L
2015-12-03
Molecular mechanisms underlying health and disease function at least in part based on the flexibility and fine-tuning afforded by protein isoforms and post-translational modifications. The ability to effectively and consistently resolve these protein species or proteoforms, as well as assess quantitative changes is therefore central to proteomic analyses. Here we discuss the pros and cons of currently available and developing analytical techniques from the perspective of the full spectrum of available tools and their current applications, emphasizing the concept of fitness-for-purpose in experimental design based on consideration of sample size and complexity; this necessarily also addresses analytical reproducibility and its variance. Data quality is considered the primary criterion, and we thus emphasize that the standards of Analytical Chemistry must apply throughout any proteomic analysis.
NASA Astrophysics Data System (ADS)
Wang, Jindong; Chen, Peng; Deng, Yufen; Guo, Junjie
2018-01-01
As a three-dimensional measuring instrument, the laser tracker is widely used in industrial measurement. To avoid the influence of angle measurement error on the overall measurement accuracy, multi-station and time-sharing measurement with a laser tracker is introduced on the basis of the global positioning system (GPS) principle in this paper. For the proposed method, accurately determining the coordinates of each measuring point from a large amount of measured data is a critical issue. Taking the detection of motion error of a numerical control machine tool as an example, the corresponding measurement algorithms are investigated thoroughly. By establishing the mathematical model of detecting the motion error of a machine tool with this method, an analytical algorithm for base station calibration and measuring point determination is deduced that requires no initial iterative value. However, when the motion area of the machine tool lies in a 2D plane, the coefficient matrix of the base station calibration is singular, which distorts the result. To overcome this limitation of the original algorithm, an improved analytical algorithm is also derived. Meanwhile, the calibration accuracy of the base station with the improved algorithm is compared with that of the original analytical algorithm and of iterative algorithms such as the Gauss-Newton and Levenberg-Marquardt algorithms. The experiment further verifies the feasibility and effectiveness of the improved algorithm. In addition, the different motion areas of the machine tool have a certain influence on the calibration accuracy of the base station, and the corresponding influence of measurement error on the calibration result, which depends on the condition number of the coefficient matrix, is analyzed.
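A minimal sketch of the underlying multilateration problem: recovering a point's coordinates from distance-only readings to known base stations, in the spirit of the GPS principle. For brevity this uses a generic Gauss-Newton iteration, one of the iterative algorithms the paper compares against; the paper's own contribution is a non-iterative analytical solution and an improved variant for planar motion. All coordinates below are invented.

```python
import numpy as np

stations = np.array([[0.0, 0.0, 0.0],
                     [2.0, 0.0, 0.2],
                     [0.0, 2.0, 0.4],
                     [2.0, 2.0, 1.0]])         # calibrated base positions, m
true_pt = np.array([0.7, 1.1, 0.3])
d = np.linalg.norm(stations - true_pt, axis=1)  # laser-tracker range readings

x = np.array([1.0, 1.0, 0.5])                   # initial guess
for _ in range(20):                             # Gauss-Newton iterations
    diff = x - stations
    r = np.linalg.norm(diff, axis=1)
    J = diff / r[:, None]                       # Jacobian of ranges w.r.t. x
    dx, *_ = np.linalg.lstsq(J, d - r, rcond=None)
    x += dx
print(x)   # converges to the measured point's coordinates
```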
Deshmukh, Rupesh K.; Sonah, Humira; Bélanger, Richard R.
2016-01-01
Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of the AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation to recent advancements in analytical tools, and their availability and the technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is offered as a resource for AQP research. PMID:28066459
Code of Federal Regulations, 2014 CFR
2014-01-01
...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...
Code of Federal Regulations, 2013 CFR
2013-07-01
...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...
Code of Federal Regulations, 2012 CFR
2012-01-01
...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...
(Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research
ERIC Educational Resources Information Center
Quiñones, Sandra
2016-01-01
Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…
NASA Technical Reports Server (NTRS)
Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.
1973-01-01
Detailed drawings of the five-year tape transport are presented. Analytical tools used in the various analyses are described. These analyses include: tape guidance, tape stress over crowned rollers, the tape pack stress program, the response (computer) program, and the control system electronics description.
Challenges and Opportunities in Analysing Students Modelling
ERIC Educational Resources Information Center
Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín
2017-01-01
Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them--the model of modelling diagram (MMD)--as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the…
ERIC Educational Resources Information Center
Reinholz, Daniel L.; Shah, Niral
2018-01-01
Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…
Managing Transnational Education: Does National Culture Really Matter?
ERIC Educational Resources Information Center
Eldridge, Kaye; Cranston, Neil
2009-01-01
This article reports on an exploratory study that examined the effect of national culture upon the management of Australia's provision of transnational higher education in Thailand. In particular, using Hofstede's national cultural value dimensions as an analytical tool, interviews with managers responsible for Australia's provision of…
Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon
2015-01-01
Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze the aminoacylation of tRNAs during protein synthesis. Despite their housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, data resources and analytical tools for examining the disease associations of ARS/AIMPs have been lacking. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARS/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation and phosphorylation data of ARS/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploring the disease associations of ARS/AIMPs, identifying disease-associated ARS/AIMP interactors and reconstructing ARS-dependent disease-perturbed network models. Therefore, IDA provides both comprehensive data resources and analytical tools for understanding the potential roles of ARS/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651
NASA Astrophysics Data System (ADS)
Qiao, Yaojun; Li, Ming; Yang, Qiuhong; Xu, Yanfei; Ji, Yuefeng
2015-01-01
Closed-form expressions for the nonlinear interference of dense wavelength-division-multiplexed (WDM) systems with dispersion-managed transmission (DMT) are derived. We carry out a simulative validation by addressing an ample and significant set of Nyquist-WDM systems based on polarization-multiplexed quadrature phase-shift keying (PM-QPSK) subcarriers operating at 32 Gbaud per channel. Simulation results show that the simple closed-form analytical expressions provide an effective tool for quick and accurate prediction of system performance in DMT coherent optical systems.
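A rough numeric sketch of how such closed-form expressions are used in practice, assuming the generic GN-model-style cubic scaling of nonlinear interference power with per-channel launch power; the coefficient values below are arbitrary placeholders, not the paper's derived expressions.

```python
import numpy as np

eta = 8.0e-4          # illustrative NLI efficiency, 1/mW^2 (assumed)
p_ase_mw = 0.002      # accumulated amplifier (ASE) noise power, mW (assumed)

p_ch = np.linspace(0.1, 4.0, 40)    # launch power per channel, mW
p_nli = eta * p_ch**3               # closed-form cubic NLI scaling
snr_db = 10 * np.log10(p_ch / (p_ase_mw + p_nli))

best = p_ch[np.argmax(snr_db)]      # quick prediction of the optimum
print(f"optimum launch power ~ {best:.2f} mW, SNR ~ {snr_db.max():.1f} dB")
```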
ERIC Educational Resources Information Center
Aleid, Alkhamsah Saleh
2016-01-01
This study aimed to examine the effectiveness of student extracurricular activities in evaluating violent behavior among students in the preparatory year at Hail University. The researcher used the descriptive analytical method and two tools for the purpose of the study. The study sample consisted of 104 (violent) female students from the…
Experimental evaluation of tool run-out in micro milling
NASA Astrophysics Data System (ADS)
Attanasio, Aldo; Ceretti, Elisabetta
2018-05-01
This paper deals with the micro milling cutting process, focusing on tool run-out measurement. In fact, among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research aims at developing an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges. The cutting edge phase measurement is based on force signal analysis. The developed procedure has been tested on data from micro milling experimental tests performed on a Ti6Al4V sample. The results showed that the developed procedure can be successfully used for tool run-out estimation.
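A minimal sketch of the geometric intuition behind the measurement: with run-out, the most-offset cutting edge sweeps an envelope larger than the nominal tool diameter, so a first-order run-out estimate follows from the machined channel width. This simplified relation is for illustration only and is not the paper's full analytical model, which also uses the phase angle between cutting edges.

```python
def runout_radius(channel_width_um, tool_diameter_um):
    """Simplified estimate: the channel is swept by the most-offset flute,
    so channel width = tool diameter + 2 * run-out radius."""
    return (channel_width_um - tool_diameter_um) / 2.0

# hypothetical readings: 500 um tool cutting a 512 um wide channel
print(runout_radius(channel_width_um=512.0, tool_diameter_um=500.0))  # ~6 um
```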
Quality Indicators for Learning Analytics
ERIC Educational Resources Information Center
Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus
2014-01-01
This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a mean to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…
Challa, Shruthi; Potumarthi, Ravichandra
2013-01-01
Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destruction of the sample. However, to successfully adapt PAT tools to a pharmaceutical or biopharmaceutical environment, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
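A minimal sketch of a typical chemometric pairing, assuming partial least squares (PLS) regression on spectroscopic PAT data; the spectra are synthetic and the wavelength count, noise level, and component number are all illustrative choices.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n_samples, n_wavelengths = 60, 300
concentration = rng.uniform(0.1, 1.0, n_samples)   # critical quality attribute
peak = np.exp(-0.5 * ((np.arange(n_wavelengths) - 150) / 12) ** 2)
spectra = (concentration[:, None] * peak           # analyte absorption band...
           + 0.02 * rng.standard_normal((n_samples, n_wavelengths)))  # ...plus noise

pls = PLSRegression(n_components=3)                # latent-variable regression
score = cross_val_score(pls, spectra, concentration, cv=5).mean()
print(f"cross-validated R^2 ~ {score:.2f}")
```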
Scattering from phase-separated vesicles. I. An analytical form factor for multiple static domains
Heberle, Frederick A.; Anghel, Vinicius N. P.; Katsaras, John
2015-08-18
This is the first in a series of studies considering elastic scattering from laterally heterogeneous lipid vesicles containing multiple domains. Unique among biophysical tools, small-angle neutron scattering can in principle give detailed information about the size, shape and spatial arrangement of domains. A general theory for scattering from laterally heterogeneous vesicles is presented, and the analytical form factor for static domains with arbitrary spatial configuration is derived, including a simplification for uniformly sized round domains. The validity of the model, including series truncation effects, is assessed by comparison with simulated data obtained from a Monte Carlo method. Several aspects of the analytical solution for scattering intensity are discussed in the context of small-angle neutron scattering data, including the effect of varying domain size and number, as well as solvent contrast. Finally, the analysis indicates that effects of domain formation are most pronounced when the vesicle's average scattering length density matches that of the surrounding solvent.
Big Data Management in US Hospitals: Benefits and Barriers.
Schaeffer, Chad; Booton, Lawrence; Halleck, Jamey; Studeny, Jana; Coustasse, Alberto
Big data has been considered an effective tool for reducing health care costs by eliminating adverse events and reducing readmissions to hospitals. The purposes of this study were to examine the emergence of big data in the US health care industry, to evaluate hospitals' ability to effectively use complex information, and to predict the potential benefits that hospitals might realize if they are successful in using big data. The findings of the research suggest that hospitals expect a number of benefits from big data analytics, including cost savings and business intelligence. In using big data, many hospitals have recognized challenges, including lack of experience and the cost of developing the analytics. Many hospitals will need to invest in acquiring personnel with experience in big data analytics and data integration. The findings of this study suggest that the adoption, implementation, and utilization of big data technology will have a profound positive effect among health care providers.
Self-Directed Learning: A Tool for Lifelong Learning
ERIC Educational Resources Information Center
Boyer, Stefanie L.; Edmondson, Diane R.; Artis, Andrew B.; Fleming, David
2014-01-01
A meta-analytic review of self-directed learning (SDL) research over 30 years, five countries, and across multiple academic disciplines is used to explore its relationships with five key nomologically related constructs for effective workplace learning. The meta-analysis revealed positive relationships between SDL and internal locus of control,…
Corpus Use in Language Learning: A Meta-Analysis
ERIC Educational Resources Information Center
Boulton, Alex; Cobb, Tom
2017-01-01
This study applied systematic meta-analytic procedures to summarize findings from experimental and quasi-experimental investigations into the effectiveness of using the tools and techniques of corpus linguistics for second language learning or use, here referred to as data-driven learning (DDL). Analysis of 64 separate studies representing 88…
ERIC Educational Resources Information Center
Samsa, Gregory P.
2018-01-01
Collaborative biostatistics is the creative application of statistical tools to biomedical problems. The relatively modest literature about the traits of effective collaborative biostatisticians focuses on four core competencies: (a) technical and analytical; (b) substance-matter knowledge; (c) communication; and (d) problem solving and problem…
Study of high altitude plume impingement
NASA Technical Reports Server (NTRS)
Wojciechowski, C. J.; Penny, M. M.; Prozan, R. J.; Seymour, D.; Greenwood, T. F.
1972-01-01
A computer program has been developed as an analytical tool to predict the severity of the effects of rocket engine exhaust on adjacent spacecraft surfaces. The program computes forces, moments, pressures, and heating rates on surfaces immersed in or subjected to exhaust plume environments. These predictions will be useful in the design of systems where such problems are anticipated.
Implementation and Use of the Reference Analytics Module of LibAnswers
ERIC Educational Resources Information Center
Flatley, Robert; Jensen, Robert Bruce
2012-01-01
Academic libraries have traditionally collected reference statistics using hash marks on paper. Although efficient and simple, this method is not an effective way to capture the complexity of reference transactions. Several electronic tools are now available to assist libraries with collecting often elusive reference data--among them homegrown…
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-06-08
T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data. It also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source for further analysis in other analytic tools.
ERIC Educational Resources Information Center
Davis, Gary Alan; Woratschek, Charles R.
2015-01-01
Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…
Conceptual Design Study on Bolts for Self-Loosing Preventable Threaded Fasteners
NASA Astrophysics Data System (ADS)
Noma, Atsushi; He, Jianmei
2017-11-01
Threaded fasteners using bolts are widely applied in industrial and many other fields. However, bolted fasteners are prone to self-loosening, which causes many accidents. The purpose of this study is to obtain self-loosening-preventable threaded fasteners by applying spring-characteristic effects to bolt structures. Bolt structures with applied helical cutting are introduced through three-dimensional (3D) CAD modeling tools. Analytical evaluations of the spring-characteristic effects of the helical-cutting bolt structures and of the self-loosening prevention performance of the threaded fasteners were performed using the finite element method, and the results are reported. Comparison of slackness test results with the analytical results and more detailed evaluation of mechanical properties will be carried out in a future study.
Lu, Ning; Ge, Shemin
1996-01-01
By including the constant flow of heat and fluid in the horizontal direction, we develop an analytical solution for the vertical temperature distribution within the semiconfining layer of a typical aquifer system. The solution is an extension of the previous one-dimensional theory by Bredehoeft and Papadopulos [1965]. It provides a quantitative tool for analyzing the uncertainty of the horizontal heat and fluid flow. The analytical results demonstrate that horizontal flow of heat and fluid, if at values much smaller than those of the vertical, has a negligible effect on the vertical temperature distribution but becomes significant when it is comparable to the vertical.
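The extended solution itself is given only in words here; for orientation, below is a minimal Python sketch of the one-dimensional Bredehoeft-Papadopulos profile that it generalizes, in which the dimensionless profile is (T(z) - T0)/(TL - T0) = (exp(beta z/L) - 1)/(exp(beta) - 1) with thermal Peclet number beta. All parameter values are hypothetical.

```python
import numpy as np

def bp_profile(z, L, T0, TL, beta):
    """1-D Bredehoeft-Papadopulos temperature profile.

    T(z) within a semiconfining layer of thickness L with boundary
    temperatures T0 (top, z=0) and TL (bottom, z=L); beta is the thermal
    Peclet number c_f*rho_f*q_z*L/kappa for vertical fluid flux q_z.
    """
    if abs(beta) < 1e-12:                  # pure conduction -> linear profile
        return T0 + (TL - T0) * z / L
    return T0 + (TL - T0) * np.expm1(beta * z / L) / np.expm1(beta)

z = np.linspace(0.0, 50.0, 6)              # depths within a 50 m layer
print(bp_profile(z, L=50.0, T0=12.0, TL=20.0, beta=2.0))
```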
2015-01-01
Background: Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results: In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions: Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893
Semantic Interaction for Sensemaking: Inferring Analytical Reasoning for Model Steering.
Endert, A; Fiaux, P; North, C
2012-12-01
Visual analytic tools aim to support the cognitively demanding task of sensemaking. Their success often depends on the ability to leverage capabilities of mathematical models, visualization, and human intuition through flexible, usable, and expressive interactions. Spatially clustering data is one effective metaphor for users to explore similarity and relationships between information, adjusting the weighting of dimensions or characteristics of the dataset to observe the change in the spatial layout. Semantic interaction is an approach to user interaction in such spatializations that couples these parametric modifications of the clustering model with users' analytic operations on the data (e.g., direct document movement in the spatialization, highlighting text, search, etc.). In this paper, we present results of a user study exploring the ability of semantic interaction in a visual analytic prototype, ForceSPIRE, to support sensemaking. We found that semantic interaction captures the analytical reasoning of the user through keyword weighting, and aids the user in co-creating a spatialization based on the user's reasoning and intuition.
Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra
2018-02-01
The requirement for correct evaluation of forensic toxicological results in daily routine work and scientific studies is reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate the efficacy and reliability of the analytical method. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. To date, there have been no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances, although endogenous analytes can be important in forensic toxicology (alcohol consumption markers, congener alcohols, gamma-hydroxybutyric acid, human insulin and C-peptide, creatinine, postmortem clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances are discussed, which may be used as guidance for scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. In particular, the validation parameters calibration model, analytical limits, accuracy (bias and precision), and matrix effects and recovery have to be approached differently. The highest attention should be paid to selectivity experiments. Copyright © 2017 Elsevier B.V. All rights reserved.
Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew
2015-01-01
Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed by leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
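The framework itself is not reproduced in the abstract; as a toy illustration of the map/reduce pattern it describes, the pure-Python sketch below aggregates hypothetical gridded values (tile names and numbers are invented, and a real deployment would run the same two functions across HBase-backed workers).

```python
from functools import reduce

# Hypothetical gridded temperature chunks, as a distributed store would
# deliver them (e.g. rows of an HBase table keyed by tile id):
chunks = {"tile-1": [14.2, 15.1, 13.8], "tile-2": [21.0, 19.5], "tile-3": [17.3]}

def mapper(item):
    # Map: emit per-chunk partial (sum, count) so each worker
    # touches only its own tile.
    key, values = item
    return (sum(values), len(values))

def reducer(a, b):
    # Reduce: merge partial results into a global (sum, count) pair.
    return (a[0] + b[0], a[1] + b[1])

total, count = reduce(reducer, map(mapper, chunks.items()))
print("global mean temperature:", round(total / count, 2))
```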
NASA Astrophysics Data System (ADS)
Filippov, A. V.; Tarasov, S. Yu; Podgornyh, O. A.; Shamarin, N. N.; Filippova, E. O.
2017-01-01
Automation of engineering processes requires developing the relevant mathematical support and computer software. Analysis of metal cutting kinematics and tool geometry is a necessary key task at the preproduction stage. This paper is focused on developing a procedure for determining the geometry of lathe machining with an oblique peakless round-nose tool by means of vector/matrix transformations. Such an approach allows integration into modern mathematical software packages, in contrast to the traditional analytic description. Such an advantage is very promising for developing automated control of the preproduction process. A kinematic criterion for the applicable tool geometry has been developed from the results of this study. The effect of tool blade inclination and curvature on the geometry-dependent process parameters was evaluated.
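The paper's actual transformation chain is not given in the abstract; the following is only a generic illustration of describing tool orientation with composed rotation matrices, with both angles hypothetical.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Hypothetical blade inclination (lam) and rotation in the base plane (psi):
lam, psi = np.radians(10.0), np.radians(30.0)

# Transform the rake-face normal from the tool frame to the machine frame
# by composing the two elementary rotations.
n_tool = np.array([0.0, 0.0, 1.0])            # rake-face normal, tool frame
n_machine = rot_z(psi) @ rot_x(lam) @ n_tool

# Tilt of the rake face relative to the machine z-axis:
tilt = np.degrees(np.arccos(n_machine @ np.array([0.0, 0.0, 1.0])))
print(f"rake-face tilt in machine frame: {tilt:.2f} deg")
```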
IBM's Health Analytics and Clinical Decision Support.
Kohn, M S; Sun, J; Knoop, S; Shabo, A; Carmeli, B; Sow, D; Syed-Mahmood, T; Rapp, W
2014-08-15
This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation.
Krumholz, Harlan M
2014-07-01
Big data in medicine--massive quantities of health care data accumulating from patients and populations and the advanced analytics that can give those data meaning--hold the prospect of becoming an engine for the knowledge generation that is necessary to address the extensive unmet information needs of patients, clinicians, administrators, researchers, and health policy makers. This article explores the ways in which big data can be harnessed to advance prediction, performance, discovery, and comparative effectiveness research to address the complexity of patients, populations, and organizations. Incorporating big data and next-generation analytics into clinical and population health research and practice will require not only new data sources but also new thinking, training, and tools. Adequately utilized, these reservoirs of data can be a practically inexhaustible source of knowledge to fuel a learning health care system. Project HOPE—The People-to-People Health Foundation, Inc.
Harnessing Scientific Literature Reports for Pharmacovigilance
Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier
2017-01-01
Objectives: We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods: A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results: All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions: Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
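The tool's scoring method is described only as statistical disproportionality data mining; one standard disproportionality measure used in pharmacovigilance is the proportional reporting ratio (PRR), sketched below on hypothetical 2x2 counts. This is not claimed to be the tool's actual algorithm.

```python
import math

def prr(a, b, c, d):
    """Proportional reporting ratio with an approximate 95% CI.

    a: citations mentioning the drug and the adverse event
    b: citations mentioning the drug without the event
    c: citations mentioning the event without the drug
    d: citations mentioning neither
    """
    ratio = (a / (a + b)) / (c / (c + d))
    # Common log-scale standard error approximation for the PRR:
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(ratio) - 1.96 * se)
    hi = math.exp(math.log(ratio) + 1.96 * se)
    return ratio, (lo, hi)

# Hypothetical counts for one drug-event pair:
print(prr(a=12, b=988, c=40, d=99_960))
```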
ERIC Educational Resources Information Center
Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin
2016-01-01
In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…
Adequacy of surface analytical tools for studying the tribology of ceramics
NASA Technical Reports Server (NTRS)
Sliney, H. E.
1986-01-01
Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are lesser-known techniques that may also prove useful.
ERIC Educational Resources Information Center
Kuby, Candace R.
2014-01-01
An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…
Analytical Tools for Affordability Analysis
2015-05-01
Analytical Tools for Affordability Analysis. David Tate, Cost Analysis and Research Division, Institute for Defense Analyses, 4850 Mark Center Drive, Alexandria, VA 22311-1882. The report covers cost models for affordability analysis, including unit cost as a function of learning and production rate (Womer) and learning with forgetting, in which learning depreciates over time (Benkard).
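The report's models are only named in the fragments above; as an illustration of the simplest member of this family, here is a sketch of the classic unit-theory (Crawford/Wright) learning curve with hypothetical parameters.

```python
import math

def unit_cost(n, first_unit_cost, learning_rate):
    """Crawford unit-theory learning curve: cost of the n-th unit.

    learning_rate is the cost fraction retained per doubling of quantity,
    e.g. 0.9 means each doubling cuts unit cost by 10%.
    """
    b = math.log(learning_rate, 2)          # learning exponent (negative)
    return first_unit_cost * n ** b

# Hypothetical program: $10M first unit on a 90% learning curve
for n in (1, 2, 4, 8):
    print(n, round(unit_cost(n, 10.0, 0.9), 3))   # 10, 9, 8.1, 7.29
```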
Operational Analysis of Time-Optimal Maneuvering for Imaging Spacecraft
2013-03-01
This thesis presents an operational analysis of time-optimal maneuvering for the Singapore-developed X-SAT imaging spacecraft. The analysis is facilitated through the use of AGI's Systems Tool Kit (STK) software and an Analytic Hierarchy Process (AHP)-based evaluation of the operational benefit of the maneuvers.
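The thesis's AHP details are not given here; below is a minimal sketch of the standard AHP weighting step, deriving criterion weights from a hypothetical pairwise-comparison matrix via the principal eigenvector.

```python
import numpy as np

# Hypothetical AHP pairwise-comparison matrix for three criteria
# (Saaty scale; A[i, j] = importance of criterion i relative to j):
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                   # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                  # normalized criterion weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)          # consistency index
cr = ci / 0.58                                # Saaty random index, n = 3
print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```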
ERIC Educational Resources Information Center
Kilpatrick, Sue; Field, John; Falk, Ian
The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…
ERIC Educational Resources Information Center
Paranosic, Nikola; Riveros, Augusto
2017-01-01
This paper reports the results of a study that examined the ways a group of department heads in Ontario, Canada, describe their role. Despite their ubiquity and importance, department heads have been seldom investigated in the educational leadership literature. The study uses the metaphor as an analytic tool to examine the ways participants talked…
Analytical Tools for Behavioral Influences Operations
2003-12-01
This study identifies analytical tools to support behavioral influences operations and is envisioned as a foundation for future work by NASIC analysts, who will use the tools identified in the study to extend NASIC's investment in analytical capabilities. Though the study took all three categories into account, most (90%) of the SRA team's effort focused on identifying and analyzing analytical tools.
2013-06-01
Realistically representing the world in a simulation environment is central to this research, which uses simulation and analytical models to evaluate the cost effectiveness and operational effectiveness of ships, to create a ship synthesis model, and, most importantly, to develop decision-making tools. Cost effectiveness and operational effectiveness are both important, and they are extremely hard to achieve simultaneously amid changes in the use of technology (Ryan & Jons, 1992).
Development and Application of Predictive Tools for MHD Stability Limits in Tokamaks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brennan, Dylan; Miller, G. P.
This is a project to develop and apply analytic and computational tools to answer physics questions relevant to the onset of non-ideal magnetohydrodynamic (MHD) instabilities in toroidal magnetic confinement plasmas. The focused goal of the research is to develop predictive tools for these instabilities, including an inner layer solution algorithm, a resistive wall with control coils, and energetic particle effects. The production phase compares studies of instabilities in such systems using analytic techniques, PEST-III, and NIMROD. Two important physics puzzles are targeted as guiding thrusts for the analyses. The first is to form an accurate description of the physics determining whether the resistive wall mode or a tearing mode will appear first as β is increased at low rotation and low error fields in DIII-D. The second is to understand the physical mechanism behind recent NIMROD results indicating strong damping and stabilization from energetic particle effects on linear resistive modes. The work seeks to develop a highly relevant predictive tool for ITER, advance the theoretical description of this physics in general, and analyze these instabilities in experiments such as ASDEX Upgrade, DIII-D, JET, JT-60U, and NSTX. The awardee on this grant is the University of Tulsa. The research efforts are supervised principally by Dr. Brennan. Support is included for two graduate students, and a strong collaboration with Dr. John M. Finn of LANL. The work includes several ongoing collaborations with General Atomics, PPPL, and the NIMROD team, among others.
Tomaselli Muensterman, Elena; Tisdale, James E
2018-06-08
Prolongation of the heart rate-corrected QT (QTc) interval increases the risk for torsades de pointes (TdP), a potentially fatal arrhythmia. The likelihood of TdP is higher in patients with risk factors, which include female sex, older age, heart failure with reduced ejection fraction, hypokalemia, hypomagnesemia, concomitant administration of ≥ 2 QTc interval-prolonging medications, among others. Assessment and quantification of risk factors may facilitate prediction of patients at highest risk for developing QTc interval prolongation and TdP. Investigators have utilized the field of predictive analytics, which generates predictions using techniques including data mining, modeling, machine learning, and others, to develop methods of risk quantification and prediction of QTc interval prolongation. Predictive analytics have also been incorporated into clinical decision support (CDS) tools to alert clinicians regarding patients at increased risk of developing QTc interval prolongation. The objectives of this paper are to assess the effectiveness of predictive analytics for identification of patients at risk of drug-induced QTc interval prolongation, and to discuss the efficacy of incorporation of predictive analytics into CDS tools in clinical practice. A systematic review of English language articles (human subjects only) was performed, yielding 57 articles, with an additional 4 articles identified from other sources; a total of 10 articles were included in this review. Risk scores for QTc interval prolongation have been developed in various patient populations including those in cardiac intensive care units (ICUs) and in broader populations of hospitalized or health system patients. One group developed a risk score that includes information regarding genetic polymorphisms; this score significantly predicted TdP. Development of QTc interval prolongation risk prediction models and incorporation of these models into CDS tools reduces the risk of QTc interval prolongation in cardiac ICUs and identifies health-system patients at increased risk for mortality. The impact of these QTc interval prolongation predictive analytics on overall patient safety outcomes, such as TdP and sudden cardiac death relative to the cost of development and implementation, requires further study. This article is protected by copyright. All rights reserved.
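None of the published risk scores is reproduced in this abstract; the sketch below only illustrates the general predictive-analytics pattern it describes, a logistic model fitted to hypothetical risk-factor data with assumed effect sizes.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical cohort: binary columns = [female_sex, age>=68, K+<=3.5,
# >=2 QTc-prolonging drugs, heart_failure]; outcome = QTc prolongation.
X = rng.integers(0, 2, size=(500, 5)).astype(float)
logit = -3.0 + X @ np.array([0.8, 0.6, 0.9, 1.2, 0.7])  # assumed effects
y = rng.random(500) < 1 / (1 + np.exp(-logit))           # simulated outcomes

model = LogisticRegression().fit(X, y)
new_patient = np.array([[1, 1, 0, 1, 0]])
print("predicted risk:", model.predict_proba(new_patient)[0, 1].round(3))
```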
Suurmond, Robert; van Rhee, Henk; Hak, Tony
2017-12-01
We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
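Meta-Essentials is a spreadsheet tool, so its internals are not shown in the abstract; the sketch below reproduces the pooling arithmetic it automates, the DerSimonian-Laird estimator with the Knapp-Hartung adjustment, on hypothetical study data.

```python
import numpy as np
from scipy import stats

def dl_kh(effects, variances):
    """Random-effects pooling: DerSimonian-Laird tau^2 with the
    Knapp-Hartung adjustment for the 95% confidence interval."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    k = len(y)
    w = 1 / v
    mu_fe = np.sum(w * y) / np.sum(w)                 # fixed-effect mean
    q = np.sum(w * (y - mu_fe) ** 2)                  # Cochran's Q
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    ws = 1 / (v + tau2)                               # random-effects weights
    mu = np.sum(ws * y) / np.sum(ws)
    # Knapp-Hartung variance and t-based confidence interval:
    var_kh = np.sum(ws * (y - mu) ** 2) / ((k - 1) * np.sum(ws))
    half = stats.t.ppf(0.975, k - 1) * np.sqrt(var_kh)
    return mu, (mu - half, mu + half), tau2

# Hypothetical study effect sizes (e.g. Hedges' g) and their variances:
print(dl_kh([0.2, 0.5, 0.35, 0.6, 0.1], [0.02, 0.05, 0.03, 0.04, 0.02]))
```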
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
ERIC Educational Resources Information Center
Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie
2016-01-01
Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…
Rabiei-Dastjerdi, Hamidreza; Matthews, Stephen A
2018-01-01
Recent interest in the social determinants of health (SDOH) and the effects of neighborhood contexts on individual health and well-being has grown exponentially. In this brief communication, we describe recent developments in both analytical perspectives and methods that have opened up new opportunities for researchers interested in exploring neighborhoods and health research within a SDOH framework. We focus specifically on recent advances in geographic information science, statistical methods, and spatial analytical tools. We close with a discussion of how these recent developments have the potential to enhance SDOH research in Iran.
Simulating the consequences of land management.
Jonathan. Thompson
2007-01-01
How do you project the effects of management decisions made today on future conditions of riparian forests, stream habitat, and fish abundance in the streams and rivers of the interior Columbia Basin? Researchers at PNW Research Station have developed some novel analytical tools to help answer this question. Their work is part of the Interior Northwest Landscape...
Technology Rich Biology Labs: Effects of Misconceptions.
ERIC Educational Resources Information Center
Kuech, Robert; Zogg, Gregory; Zeeman, Stephan; Johnson, Mark
This paper describes a study conducted on the lab sections of the general biology course for non-science majors at the University of New England, and reports findings of student misconceptions about photosynthesis and the mass/carbon uptake during plant growth. The current study placed high technology analytic tools in the hands of introductory…
Economic Analysis of Education: A Conceptual Framework. Theoretical Paper No. 68.
ERIC Educational Resources Information Center
Rossmiller, Richard A.; Geske, Terry G.
This paper discusses several concepts and techniques from the areas of systems theory and economic analysis that can be used as tools in an effort to improve the productivity of the educational enterprise. Several studies investigating productivity in education are reviewed, and the analytical problems in conducting cost-effectiveness studies are…
An Integrated Customer Knowledge Management Framework for Academic Libraries
ERIC Educational Resources Information Center
Daneshgar, Farhad; Parirokh, Mehri
2012-01-01
The ability of academic libraries to produce timely and effective responses to various environmental changes constitutes a major challenge for them to enhance their survival rate and maintain growth in competitive environments. This article provides a conceptual model as an analytical tool for both improving current services as well as creating…
The effect of blurred plot coordinates on interpolating forest biomass: a case study
J. W. Coulston
2004-01-01
Interpolated surfaces of forest attributes are important analytical tools and have been used in risk assessments, forest inventories, and forest health assessments. The USDA Forest Service Forest Inventory and Analysis program (FIA) annually collects information on forest attributes in a consistent fashion nation-wide. Users of these data typically perform...
Equivalent-circuit models for electret-based vibration energy harvesters
NASA Astrophysics Data System (ADS)
Phu Le, Cuong; Halvorsen, Einar
2017-08-01
This paper presents a complete analysis to build a tool for modelling electret-based vibration energy harvesters. The calculational approach includes all possible effects of fringing fields that may have significant impact on output power. The transducer configuration consists of two sets of metal strip electrodes on a top substrate that faces electret strips deposited on a bottom movable substrate functioning as a proof mass. Charge distribution on each metal strip is expressed by series expansion using Chebyshev polynomials multiplied by a reciprocal square-root form. The Galerkin method is then applied to extract all charge induction coefficients. The approach is validated by finite element calculations. From the analytic tool, a variety of connection schemes for power extraction in slot-effect and cross-wafer configurations can be lumped to a standard equivalent circuit with inclusion of parasitic capacitance. Fast calculation of the coefficients is also obtained by a proposed closed-form solution based on leading terms of the series expansions. The achieved analytical result is an important step for further optimisation of the transducer geometry and maximising harvester performance.
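The paper's coefficients come from its Galerkin step, which is not reproduced here; purely as an illustration of the stated expansion form, the sketch below evaluates a charge density built from Chebyshev polynomials with the reciprocal square-root edge factor (coefficients hypothetical).

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def strip_charge_density(x, coeffs):
    """Charge density on a strip, with x normalized to (-1, 1):
    sigma(x) = sum_n a_n * T_n(x) / sqrt(1 - x^2).
    The 1/sqrt(1 - x^2) factor reproduces the field singularity
    at the strip edges."""
    return C.chebval(x, coeffs) / np.sqrt(1.0 - x**2)

a = [1.0, 0.0, -0.3]                    # hypothetical expansion coefficients
x = np.linspace(-0.999, 0.999, 5)
print(strip_charge_density(x, a))

# With the Chebyshev weight, only T_0 contributes to the total charge:
# the integral of T_n(x)/sqrt(1-x^2) over (-1, 1) is pi for n=0, else 0,
# so the total line charge is pi * a[0].
print("total charge per unit length:", np.pi * a[0])
```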
Cheng, Wei; Tian, Jing; Burgunder, Jean-Marc; Hunziker, Walter; Eng, How-Lung
2014-01-01
Myotonia congenita is a human muscle disorder caused by mutations in CLCN1, which encodes human chloride channel 1 (CLCN1). Zebrafish is becoming an increasingly useful model for human diseases, including muscle disorders. In this study, we generated transgenic zebrafish expressing, under the control of a muscle specific promoter, human CLCN1 carrying mutations that have been identified in human patients suffering from myotonia congenita. We developed video analytic tools that are able to provide precise quantitative measurements of movement abnormalities in order to analyse the effect of these CLCN1 mutations on adult transgenic zebrafish swimming. Two new parameters for body-wave kinematics of swimming reveal changes in body curvature and tail offset in transgenic zebrafish expressing the disease-associated CLCN1 mutants, presumably due to their effect on muscle function. The capability of the developed video analytic tool to distinguish wild-type from transgenic zebrafish could provide a useful asset to screen for compounds that reverse the disease phenotype, and may be applicable to other movement disorders besides myotonia congenita. PMID:25083883
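The paper's two kinematic parameters are not defined in the abstract beyond body curvature and tail offset; the sketch below shows one plausible ingredient, discrete curvature of a tracked midline, computed on synthetic points.

```python
import numpy as np

def midline_curvature(x, y):
    """Discrete curvature kappa(s) of a tracked body midline:
    kappa = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2)."""
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    return np.abs(dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

# Hypothetical midline points of a swimming fish in one video frame:
s = np.linspace(0, np.pi, 20)
x, y = s, 0.2 * np.sin(s)               # gently curved body
kappa = midline_curvature(x, y)
print("max body curvature:", kappa.max().round(3))
```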
Tools for studying dry-cured ham processing by using computed tomography.
Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena
2012-01-11
Accurate knowledge and optimization of dry-cured ham elaboration processes could help reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), which cover the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w) in terms of both content and distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
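The paper's models are presumably chemometric regressions on CT data; the sketch below only illustrates how an RMSECV figure of that kind can be computed, using PLS regression on synthetic data rather than the authors' dataset or pipeline.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)

# Hypothetical data: CT attenuation features per ham vs. measured salt (%)
X = rng.normal(size=(60, 8))
salt = 2.0 + X @ rng.normal(size=8) * 0.3 + rng.normal(scale=0.2, size=60)

model = PLSRegression(n_components=3)
pred = cross_val_predict(model, X, salt, cv=10).ravel()

rmsecv = np.sqrt(np.mean((salt - pred) ** 2))   # cross-validated RMSE
r2 = 1 - np.sum((salt - pred) ** 2) / np.sum((salt - salt.mean()) ** 2)
print(f"RMSECV = {rmsecv:.3f}, R^2 = {r2:.3f}")
```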
Development of Multi-slice Analytical Tool to Support BIM-based Design Process
NASA Astrophysics Data System (ADS)
Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.
2017-03-01
This paper describes the on-going development of computational tool to analyse architecture and interior space based on multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space is experienced as a dynamic entity, which have the spatial properties that might be variable from one part of space to another, therefore the representation of space through standard architectural drawings is sometimes not sufficient. The representation of space as a series of slices with certain properties in each slice becomes important, so that the different characteristics in each part of space could inform the design process. The analytical tool is developed for use as a stand-alone application that utilises the data exported from generic BIM modelling tool. The tool would be useful to assist design development process that applies BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. The tool allows the identification of how the spatial properties change dynamically throughout the space and allows the prediction of the potential design problems. Integrating the multi-slice analytical tool in BIM-based design process thereby could assist the architects to generate better design and to avoid unnecessary costs that are often caused by failure to identify problems during design development stages.
Sustainability Tools Inventory Initial Gap Analysis
This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...
New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous-energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
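The Perl tools themselves are not shown here; as a flavor of the one-group analytic checks they enable, the sketch below compares Monte Carlo free-flight sampling against the exact uncollided transmission exp(-Σt·x), with all values hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

sigma_t = 0.5                 # one-group total cross section (1/cm)
x = 3.0                       # slab depth (cm)
n = 1_000_000

# Monte Carlo: sample exponential free-flight distances and count
# particles that pass depth x without colliding.
flights = rng.exponential(scale=1.0 / sigma_t, size=n)
mc = np.mean(flights > x)

analytic = np.exp(-sigma_t * x)   # exact uncollided transmission
print(f"MC {mc:.5f} vs analytic {analytic:.5f}")
```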
Marek, Lukáš; Tuček, Pavel; Pászto, Vít
2015-01-28
Visual analytics aims to connect the processing power of information technologies and the user's capacity for logical thinking and reasoning through complex visual interaction. Moreover, most data contain a spatial component, so the need for geovisual tools and methods arises. One can either develop one's own system, in which case the dissemination of findings and usability might be problematic, or utilize a widespread and well-known platform. The aim of this paper is to prove the applicability of Google Earth™ software as a tool for geovisual analytics that helps to understand the spatio-temporal patterns of disease distribution. We combined complex joint spatio-temporal analysis with comprehensive visualisation. We analysed the spatio-temporal distribution of campylobacteriosis in the Czech Republic between 2008 and 2012. We applied three main approaches in the study: (1) geovisual analytics of the surveillance data, visualised in the form of a bubble chart; (2) geovisual analytics of the disease's weekly incidence surfaces computed by spatio-temporal kriging; and (3) spatio-temporal scan statistics, employed in order to identify clusters of affected municipalities with high or low rates. The final data are stored in Keyhole Markup Language files and visualised in Google Earth™ in order to apply geovisual analytics. Using geovisual analytics we were able to display and retrieve information from a complex dataset efficiently. Instead of searching for patterns in a series of static maps or using numerical statistics, we created a set of interactive visualisations in order to explore and communicate the results of analyses to a wider audience. The results of the geovisual analytics identified periodic patterns in the behaviour of the disease as well as fourteen spatio-temporal clusters of increased relative risk. We prove that Google Earth™ software is a usable tool for geovisual analysis of disease distribution. Google Earth™ has many indisputable advantages (it is widespread, freely available, and intuitive, and it offers space-time visualisation capabilities, animations, and easy communication of results); nevertheless, it still needs to be combined with pre-processing tools that prepare the data into a form suitable for the geovisual analytics itself.
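The study's KML files were produced by its own pipeline, which is not shown here; as an illustration of the format step, the sketch below writes a minimal KML layer with the third-party simplekml package (place names and incidence values are invented).

```python
import simplekml   # third-party package: pip install simplekml

# Hypothetical weekly incidence values for two municipalities:
records = [
    ("Olomouc", 17.2506, 49.5938, "2010-W14", 12.4),
    ("Praha",   14.4378, 50.0755, "2010-W14", 8.9),
]

kml = simplekml.Kml()
for name, lon, lat, week, incidence in records:
    # simplekml expects (longitude, latitude) coordinate pairs.
    pnt = kml.newpoint(name=f"{name} ({week})", coords=[(lon, lat)])
    pnt.description = f"campylobacteriosis incidence: {incidence} per 100k"
kml.save("incidence.kml")   # open the resulting file in Google Earth
```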
Visualization Techniques for Computer Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beaver, Justin M; Steed, Chad A; Patton, Robert M
2011-01-01
Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.
ERIC Educational Resources Information Center
Pei-Ling Tan, Jennifer; Koh, Elizabeth; Jonathan, Christin; Yang, Simon
2017-01-01
The affordances of learning analytics (LA) tools and solutions are being increasingly harnessed for enhancing 21st century pedagogical and learning strategies and outcomes. However, use cases and empirical understandings of students' experiences with LA tools and environments aimed at fostering 21st century literacies, especially in the K-12…
Visual Information for the Desktop, version 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2006-03-29
VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.
ERIC Educational Resources Information Center
Teplovs, Chris
2015-01-01
This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…
Targeted Analyte Detection by Standard Addition Improves Detection Limits in MALDI Mass Spectrometry
Eshghi, Shadi Toghi; Li, Xingde; Zhang, Hui
2014-01-01
Matrix-assisted laser desorption/ionization has proven an effective tool for fast and accurate determination of many molecules. However, the detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules from biological and clinical samples. To challenge this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample in a concentration close to optimum concentration. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. Also, we showed that TAD could achieve LOD improvements on an average of 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications. PMID:22877355
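TAD spikes a single known amount near the detection limit; for contrast, the classical multi-level standard-addition method reads the endogenous concentration from the x-intercept of a linear fit, sketched below on hypothetical intensities.

```python
import numpy as np

# Hypothetical spiked concentrations (uM added) and MALDI peak intensities:
added = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
signal = np.array([210.0, 305.0, 402.0, 590.0, 985.0])

# Linear fit; the magnitude of the x-intercept estimates the
# endogenous concentration already present in the sample.
slope, intercept = np.polyfit(added, signal, 1)
c_unknown = intercept / slope
print(f"estimated endogenous concentration: {c_unknown:.2f} uM")
```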
Whiting, S D; Guinea, M L; Fomiatti, K; Flint, M; Limpus, C J
2014-06-14
In recent years, the use of blood chemistry as a diagnostic tool for sea turtles has been demonstrated, but much of its effectiveness relies on reference intervals. The first comprehensive blood chemistry values for healthy wild hawksbill (Eretmochelys imbricata) sea turtles are presented. Nineteen blood chemistry analytes and packed cell volume were analysed for 40 clinically healthy juvenile hawksbill sea turtles captured from a rocky reef habitat in northern Australia. We used four statistical approaches to calculate reference intervals and to investigate their use with non-normal distributions and small sample sizes, and to compare upper and lower limits between methods. Eleven analytes were correlated with curved carapace length indicating that body size should be considered when designing future studies and interpreting analyte values. British Veterinary Association.
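The four statistical approaches are not itemized in the abstract; one common choice for small, non-normal samples is a nonparametric percentile interval with bootstrap confidence limits, sketched below on synthetic values.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical plasma analyte values for 40 healthy juvenile turtles:
values = rng.lognormal(mean=1.0, sigma=0.3, size=40)

# Nonparametric reference interval (central 95% of the sample):
lo, hi = np.percentile(values, [2.5, 97.5])

# Bootstrap 90% confidence intervals around each reference limit:
boots = rng.choice(values, size=(5000, values.size), replace=True)
lo_ci = np.percentile(np.percentile(boots, 2.5, axis=1), [5, 95])
hi_ci = np.percentile(np.percentile(boots, 97.5, axis=1), [5, 95])
print(f"RI: {lo:.2f}-{hi:.2f}; lower-limit CI {lo_ci.round(2)},"
      f" upper-limit CI {hi_ci.round(2)}")
```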
Epilepsy analytic system with cloud computing.
Shen, Chia-Ping; Zhou, Weizhi; Lin, Feng-Seng; Sung, Hsiao-Ya; Lam, Yan-Yu; Chen, Wei; Lin, Jeng-Wei; Pan, Ming-Kai; Chiu, Ming-Jang; Lai, Feipei
2013-01-01
Biomedical data analytic systems have played an important role in clinical diagnosis for several decades. Analyzing these big data to provide decision support for physicians is an emerging research area. This paper presents a parallelized web-based tool with a cloud computing service architecture to analyze epilepsy. The system cascades several modern analytic functions: wavelet transform, genetic algorithm (GA), and support vector machine (SVM). To demonstrate the effectiveness of the system, it has been verified with two kinds of electroencephalography (EEG) data: short-term EEG and long-term EEG. The results reveal that our approach achieves a total classification accuracy higher than 90%. In addition, the entire training time is accelerated about 4.66-fold, and the prediction time also meets real-time requirements.
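The paper's exact feature pipeline is not reproduced here; the sketch below only illustrates the stated wavelet-plus-SVM pattern on synthetic EEG-like epochs, using PyWavelets and scikit-learn (all signals and parameters invented).

```python
import numpy as np
import pywt                                # PyWavelets
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

def wavelet_features(sig):
    """Log energy of each wavelet sub-band, a common EEG feature set."""
    coeffs = pywt.wavedec(sig, "db4", level=4)
    return [np.log(np.sum(c**2)) for c in coeffs]

# Hypothetical 1-second epochs: the second class has extra broadband power.
normal = [np.sin(0.1 * np.arange(256)) + rng.normal(0, 1, 256)
          for _ in range(50)]
seizure = [s + rng.normal(0, 3, 256) for s in normal]

X = np.array([wavelet_features(s) for s in normal + seizure])
y = np.array([0] * 50 + [1] * 50)

print("CV accuracy:", cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean())
```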
Analytical thermal model for end-pumped solid-state lasers
NASA Astrophysics Data System (ADS)
Cini, L.; Mackenzie, J. I.
2017-12-01
Fundamentally power-limited by thermal effects, the design challenge for end-pumped "bulk" solid-state lasers depends upon knowledge of the temperature gradients within the gain medium. We have developed analytical expressions that can be used to model the temperature distribution and thermal-lens power in end-pumped solid-state lasers. Enabled by the inclusion of a temperature-dependent thermal conductivity, applicable from cryogenic to elevated temperatures, typical pumping distributions are explored and the results compared with accepted models. Key insights are gained through these analytical expressions, such as how the peak temperature rise depends on the boundary thermal conductance to the heat sink. Our generalized expressions provide simple and time-efficient tools for parametric optimization of the heat distribution in the gain medium based upon the material and pumping constraints.
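The paper's generalized expressions are not quoted in the abstract; for orientation, here is a sketch of the textbook constant-conductivity result for a top-hat end-pumped rod, the kind of model such expressions extend (parameter values hypothetical).

```python
import numpy as np

def rod_temperature_rise(r, Ph, K, L, w, b):
    """Steady-state temperature rise above the rod edge for an
    end-pumped cylindrical rod (radius b, length L, heat load Ph)
    with a top-hat pump of radius w:
      r <= w: Ph/(4*pi*K*L) * (ln(b^2/w^2) + 1 - r^2/w^2)
      r >  w: Ph/(4*pi*K*L) * ln(b^2/r^2)
    Assumes constant thermal conductivity K; the paper's model
    additionally treats temperature-dependent K."""
    r = np.asarray(r, float)
    pref = Ph / (4 * np.pi * K * L)
    inside = pref * (np.log(b**2 / w**2) + 1 - (r / w) ** 2)
    outside = pref * np.log(b**2 / np.maximum(r, w) ** 2)
    return np.where(r <= w, inside, outside)

# Hypothetical Nd:YAG-like numbers: 10 W heat load, K = 10 W/(m K)
r = np.linspace(0, 2e-3, 5)              # radius (m), rod radius b = 2 mm
print(rod_temperature_rise(r, Ph=10, K=10, L=10e-3, w=0.5e-3, b=2e-3))
```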
Search Analytics: Automated Learning, Analysis, and Search with Open Source
NASA Astrophysics Data System (ADS)
Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.
2016-12-01
The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a user's accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82-84%, which could have tangible, direct downstream implications for crop protection. Automatically assimilating this information expedites and supplements human analysis, and, ultimately, Search Analytics and its foundation of open source tools will result in more efficient scientific investment and research.
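As a flavor of the Tika-based extraction step, here is a sketch using the tika Python bindings, which start a local Tika server on first use; the file name and the naive sentence filter are invented for illustration and are not the project's actual pipeline.

```python
from tika import parser   # Python bindings for Apache Tika

# Extract text and metadata from a publication PDF (hypothetical file);
# Tika transparently handles PDF, Word, and many other formats.
parsed = parser.from_file("hyspiri_abstract.pdf")
text = parsed["content"] or ""
print(parsed["metadata"].get("Content-Type"))

# Naive first-pass mining: flag sentences relating spatial resolution
# to a reported accuracy, the kind of pairing Search Analytics extracts.
for sentence in text.split("."):
    if "resolution" in sentence and "accuracy" in sentence:
        print(sentence.strip())
```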
Analysis of Wave Velocity Patterns in Black Cherry Trees and its Effect on Internal Decay Detection
Guanghui Li; Xiping Wang; Jan Wiedenbeck; Robert J. Ross
2013-01-01
In this study, we examined stress wave velocity patterns in the cross sections of black cherry trees, developed analytical models of stress wave velocity in sound healthy trees, and then tested the effectiveness of the models as a tool for tree decay diagnosis. Acoustic tomography data of the tree cross sections were collected from 12 black cherry trees at a production...
Analysis of wave velocity patterns in black cherry trees and its effect on internal decay detection
Guanghui Li; Xiping Wang; Hailin Feng; Jan Wiedenbeck; Robert J. Ross
2014-01-01
In this study, we examined stress wave velocity patterns in the cross sections of black cherry trees, developed analytical models of stress wave velocity in sound healthy trees, and then tested the effectiveness of the models as a tool for tree decay diagnosis. Acoustic tomography data of the tree cross sections were collected from 12 black cherry trees at a production...
ERIC Educational Resources Information Center
Chen, Bodong
2015-01-01
In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…
Microgenetic Learning Analytics Methods: Workshop Report
ERIC Educational Resources Information Center
Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin
2016-01-01
Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…
Rudzki, Piotr J; Gniazdowska, Elżbieta; Buś-Kwaśnik, Katarzyna
2018-06-05
Liquid chromatography coupled to mass spectrometry (LC-MS) is a powerful tool for studying pharmacokinetics and toxicokinetics. Reliable bioanalysis requires the characterization of the matrix effect, i.e. the influence of endogenous or exogenous compounds on the analyte signal intensity. We have compared two methods for the quantitation of matrix effect. The CVs(%) of internal standard normalized matrix factors recommended by the European Medicines Agency were evaluated against internal standard normalized relative matrix effects derived from Matuszewski et al. (2003). Both methods use post-extraction spiked samples, but matrix factors also require neat solutions. We have tested both approaches using analytes of diverse chemical structures. The study did not reveal relevant differences in the results obtained with the two calculation methods. After normalization with the internal standard, the CV(%) of the matrix factor was on average 0.5% higher than the corresponding relative matrix effect. The method adopted by the European Medicines Agency seems to be slightly more conservative in the analyzed datasets. Nine analytes of different structures enabled a general overview of the problem; still, further studies are encouraged to confirm our observations. Copyright © 2018 Elsevier B.V. All rights reserved.
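To make the two calculations concrete, here is a minimal sketch of both, assuming hypothetical peak areas for a single analyte and its internal standard across six matrix lots (the study itself used nine analytes and validated LC-MS data):

```python
import numpy as np

# Hypothetical peak areas: post-extraction spiked samples from six matrix
# lots, plus mean responses of neat solutions (no matrix). Illustrative only.
matrix_analyte = np.array([9800., 10150., 9630., 10420., 9890., 10010.])
matrix_is      = np.array([50200., 51100., 49800., 52000., 50500., 50900.])
neat_analyte = 10000.0   # mean neat-solution response, analyte
neat_is      = 50000.0   # mean neat-solution response, internal standard

def cv(x):
    """Coefficient of variation in percent."""
    return 100.0 * x.std(ddof=1) / x.mean()

# EMA-style: matrix factor per lot (matrix response / neat response),
# normalized by the internal standard's matrix factor, then CV% across lots.
mf_norm = (matrix_analyte / neat_analyte) / (matrix_is / neat_is)
print(f"CV of IS-normalized matrix factor: {cv(mf_norm):.1f}%")

# Matuszewski-style relative matrix effect: CV% of the IS-normalized
# response in the post-extraction spiked samples alone.
print(f"IS-normalized relative matrix effect: {cv(matrix_analyte / matrix_is):.1f}%")
```

With single mean neat-solution responses the two CVs coincide algebraically; in practice the neat solutions are measured with their own replicates, which is one source of the small (about 0.5%) differences the study reports.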
Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit
2016-03-01
Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface, and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human diseases database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.
Data Intensive Computing on Amazon Web Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Magana-Zook, S. A.
The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as "big data" tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and many more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster ("Cluster A") was set up as a collaboration between GMP and Livermore Computing (LC).
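As a hint of what such a cluster enables, the sketch below shows a distributed aggregation in PySpark over a hypothetical waveform-metadata archive (the path, schema, and column names are invented for illustration, not taken from the GMP archive):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On a cluster this script would be launched with spark-submit.
spark = SparkSession.builder.appName("seismic-summary").getOrCreate()

# Hypothetical archive: one row per trace, with 'station' and
# 'peak_amplitude' columns stored as Parquet on HDFS.
traces = spark.read.parquet("hdfs:///archives/seismic/traces.parquet")

# Distributed aggregation across the whole archive, far beyond
# what a single workstation could hold in memory.
summary = (traces
           .groupBy("station")
           .agg(F.count("*").alias("n_traces"),
                F.avg("peak_amplitude").alias("mean_amp"),
                F.max("peak_amplitude").alias("max_amp")))

summary.write.mode("overwrite").parquet("hdfs:///results/station_summary")
```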
Haze Gray Paint and the U.S. Navy: A Procurement Process Review
2017-12-01
The research encompasses both qualitative and quantitative analytical tools, utilizing historical demand data for Silicone Alkyd and the inventory level of 1K Polysiloxane in support of the fleet. As discussed in the Summary section, this research used a qualitative and a quantitative approach to analyze the Polysiloxane procurement process.
Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools
2014-01-14
Final Technical Report SERC-2014-TR-041-1, 14 January 2014. Sponsored by the U.S. Department of Defense through the Systems Engineering Research Center (SERC) under Contract H98230-08-D-0171 (Task Order 0026, RT 51). SERC is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology.
Visualization and Analytics Software Tools for Peregrine System
Describes visualization and analytics software tools available on the Peregrine system, including R (a language and environment for statistical computing and graphics; see the R web site), FastX (remote visualization for OpenGL-based applications; see the FastX page), and ParaView (an open-source visualization application).
2006-07-27
The goal of this project was to develop analytical and computational tools to make vision a viable sensor for… We have proposed the framework of stereoscopic segmentation, in which multiple images of the same objects are jointly processed to extract geometry
Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel
2011-03-28
Study Team Working Paper 3: Research Methods Discussion for the Study Team. Discusses generating empirical materials in grounded theory research conducted using these methods. Cited references include "…Survey and a Case Study" (Kjeller, Norway: FFI) and Glaser, B. G. & Strauss, A. L. (1967), The Discovery of Grounded Theory.
Demonstrating Success: Web Analytics and Continuous Improvement
ERIC Educational Resources Information Center
Loftus, Wayne
2012-01-01
As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barber, F.H.; Borek, T.T.; Christopher, J.Z.
1997-12-01
Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).
Gamification and Smart Feedback: Experiences with a Primary School Level Math App
ERIC Educational Resources Information Center
Kickmeier-Rust, Michael D.; Hillemann, Eva-C.; Albert, Dietrich
2014-01-01
Gamification is a recent trend in the field of game-based learning that accounts for development effort, costs, and effectiveness concerns of games. Another trend in educational technology is learning analytics and formative feedback. In the context of a European project, the authors developed a lightweight tool for learning and practicing divisions named…
I PASS: an interactive policy analysis simulation system.
Doug Olson; Con Schallau; Wilbur Maki
1984-01-01
This paper describes an interactive policy analysis simulation system (IPASS) that can be used to analyze the long-term economic and demographic effects of alternative forest resource management policies. The IPASS model is a dynamic analytical tool that forecasts growth and development of an economy. It allows the user to introduce changes in selected parameters based...
Distributed Generation Interconnection Collaborative | NREL
Covers efforts to reduce paperwork and improve customer service. Analytical Methods for Interconnection: many utilities and jurisdictions are seeking the right screening and analytical methods and tools to meet their reliability needs.
Tobiszewski, Marek; Orłowski, Aleksander
2015-03-27
The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives for the MCDA analysis. Nine criteria describing the alternatives were chosen from different groups: metrological, economic, and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction- and liquid-phase microextraction-based procedures high, while liquid-liquid extraction-, solid-phase extraction-, and stir bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more different factors than NEMI, the assessment results were only weakly correlated. In contrast, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
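For orientation, PROMETHEE II reduces to a few array operations once the decision matrix and weights are fixed. A minimal sketch with toy numbers (three procedures, three criteria, and a linear preference function; the study's actual matrix had 25 procedures and nine criteria):

```python
import numpy as np

# Decision matrix: rows = candidate procedures, columns = criteria,
# all oriented so that larger is better (negate minimization criteria first).
scores = np.array([
    [0.8, 0.6, 0.9],   # e.g., an SPME-based procedure
    [0.5, 0.9, 0.4],   # e.g., an LLE-based procedure
    [0.7, 0.7, 0.6],
])
weights = np.array([0.5, 0.2, 0.3])   # e.g., from expert questionnaires
p = 0.3                               # linear preference threshold

n = scores.shape[0]

def preference(d):
    """Linear preference function: 0 for d <= 0, rising to 1 at d >= p."""
    return np.clip(d / p, 0.0, 1.0)

# Aggregated preference index pi(a, b) for each ordered pair of alternatives.
pi = np.zeros((n, n))
for a in range(n):
    for b in range(n):
        if a != b:
            pi[a, b] = np.dot(weights, preference(scores[a] - scores[b]))

phi_plus  = pi.sum(axis=1) / (n - 1)   # positive outranking flow
phi_minus = pi.sum(axis=0) / (n - 1)   # negative outranking flow
phi_net   = phi_plus - phi_minus       # PROMETHEE II net flow

print("Ranking, best first:", np.argsort(-phi_net))
```

Different weight scenarios (as elicited from the experts) simply swap the `weights` vector and re-run the ranking.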
Development of a Subjective Evaluation Tool for Assessing Marksmanship Training Effectiveness
2013-01-28
A task analytic approach was used to break down the marksmanship domain, as presented in the USMC Rifle Marksmanship Manual, into sub-tasks that were converted into training-task… This approach offers an alternative to lengthy and resource-demanding training effectiveness evaluations.
Yang, Yong; Liu, Yongzhong; Yu, Bo; Ding, Tian
2016-06-01
Volatile contaminants may migrate with carbon dioxide (CO2) injection or leakage in subsurface formations, which poses risks to the CO2 storage and the ecological environment. This study aims to develop an analytical model that can predict the contaminant migration process induced by CO2 storage. The analytical model with two moving boundaries is obtained through simplification of the fully coupled model for the CO2-aqueous phase-stagnant phase displacement system. The analytical solutions are confirmed and assessed through comparison with numerical simulations of the fully coupled model. Then, some key variables in the analytical solutions, including the critical time, the locations of the dual moving boundaries, and the advance velocity, are discussed to present the characteristics of contaminant migration in the multi-phase displacement system. The results show that these key variables are determined by four dimensionless numbers, Pe, RD, Sh and RF, which represent the effects of convection, dispersion, interphase mass transfer, and the retention factor of the contaminant, respectively. The proposed analytical solutions can be used for tracking the migration of the injected CO2 and the contaminants in subsurface formations, and also provide an analytical tool for other solute transport in multi-phase displacement systems. Copyright © 2016 Elsevier B.V. All rights reserved.
Vibrations and structureborne noise in space station
NASA Technical Reports Server (NTRS)
Vaicaitis, R.
1985-01-01
Theoretical models were developed capable of predicting structural response and noise transmission due to random point mechanical loads. Fiber-reinforced composite and aluminum materials were considered. Cylindrical shells and circular plates were taken as typical representatives of structural components for space station habitability modules. Analytical formulations include double-wall and single-wall constructions. Pressurized and unpressurized models were considered. Parametric studies were conducted to determine the effect on structural response and noise transmission of fiber orientation, point load location, damping in the core and the main load-carrying structure, pressurization, interior acoustic absorption, etc. These analytical models could serve as preliminary tools for assessing noise-related problems in space station applications.
AUVA - Augmented Reality Empowers Visual Analytics to explore Medical Curriculum Data.
Nifakos, Sokratis; Vaitsis, Christos; Zary, Nabil
2015-01-01
Medical curriculum data play a key role in the structure and the organization of medical programs in universities around the world. The effective processing and usage of these data may improve the educational environment of medical students. As a consequence, the new generation of health professionals would have improved skills compared with previous ones. This study introduces the process of enhancing curriculum data through the use of augmented reality technology as a management and presentation tool. The final goal is to enrich the information presented from a visual analytics approach applied to medical curriculum data and to sustain low levels of complexity in understanding these data.
Electrochemical lectin based biosensors as a label-free tool in glycomics
Bertók, Tomáš; Katrlík, Jaroslav; Gemeiner, Peter; Tkac, Jan
2016-01-01
Glycans and other saccharide moieties attached to proteins and lipids, or present on the surface of a cell, are actively involved in numerous physiological or pathological processes. Their structural flexibility (that is based on the formation of various kinds of linkages between saccharides) is making glycans superb “identity cards”. In fact, glycans can form more “words” or “codes” (i.e., unique sequences) from the same number of “letters” (building blocks) than DNA or proteins. Glycans are physicochemically similar and it is not a trivial task to identify their sequence, or - even more challenging - to link a given glycan to a particular physiological or pathological process. Lectins can recognise differences in glycan compositions even in their bound state and therefore are most useful tools in the task to decipher the “glycocode”. Thus, lectin-based biosensors working in a label-free mode can effectively complement the current weaponry of analytical tools in glycomics. This review gives an introduction into the area of glycomics and then focuses on the design, analytical performance, and practical utility of lectin-based electrochemical label-free biosensors for the detection of isolated glycoproteins or intact cells. PMID:27239071
A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery
2014-10-20
Report AFRL-RH-WP-TR-2014-0131. Example from the tool's interface: if the attribute Strain contains three possibilities (AKR, B6, and BALB_B) and MUP Protein contains two (Intact and Denatured), then you can view a plot of the Strain…
Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery
NASA Astrophysics Data System (ADS)
Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.
2017-12-01
Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.
Analytical calculation on the determination of steep side wall angles from far field measurements
NASA Astrophysics Data System (ADS)
Cisotto, Luca; Pereira, Silvania F.; Urbach, H. Paul
2018-06-01
In the semiconductor industry, the performance and capabilities of the lithographic process are evaluated by measuring specific structures. These structures are often gratings of which the shape is described by a few parameters such as period, middle critical dimension, height, and side wall angle (SWA). Upon direct measurement or retrieval of these parameters, the determination of the SWA suffers from considerable inaccuracies. Although the scattering effects that steep SWAs have on the illumination can be obtained with rigorous numerical simulations, analytical models constitute a very useful tool to get insights into the problem we are treating. In this paper, we develop an approach based on analytical calculations to describe the scattering of a cliff and a ridge with steep SWAs. We also propose a detection system to determine the SWAs of the structures.
Effects of viscosity on shock-induced damping of an initial sinusoidal disturbance
NASA Astrophysics Data System (ADS)
Ma, Xiaojuan; Liu, Fusheng; Jing, Fuqian
2010-05-01
The lack of a reliable data treatment method has for several decades been the bottleneck of viscosity measurement by the disturbance-amplitude damping method of shock waves. In this work the finite difference method is first applied to obtain numerical solutions for the disturbance-amplitude damping behavior of a sinusoidal shock front in inviscid and viscous flow. When water shocked to 15 GPa is taken as an example, the main results are as follows: (1) For inviscid and lower-viscosity flows the numerical method gives results in good agreement with the analytic solutions under the condition of small disturbance (a0/λ = 0.02); (2) For flows with viscosity beyond 200 Pa·s (η = κ) the analytic solution is found to obviously overestimate the effects of viscosity. This is attributed to the unrealistic preconditions of the analytic solution by Miller and Ahrens; (3) The present numerical method provides an effective tool with more confidence to overcome the bottleneck of data treatment when the effects of higher viscosity in the experiments of Sakharov and flyer impact are to be analyzed, because it can in principle simulate the development of shock waves in flows with larger disturbance amplitude, higher viscosity, and complicated initial flow.
Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Milan, Luis Aparecido; Stockton, Amanda M; Carrilho, Emanuel
2017-05-02
Paper-based devices are a portable, user-friendly, and affordable technology that is one of the best analytical tools for inexpensive diagnostic devices. Three-dimensional microfluidic paper-based analytical devices (3D-μPADs) are an evolution of single layer devices and they permit effective sample dispersion, individual layer treatment, and multiplex analytical assays. Here, we present the rational design of a wax-printed 3D-μPAD that enables more homogeneous permeation of fluids along the cellulose matrix than other existing designs in the literature. Moreover, we show the importance of the rational design of channels on these devices using glucose oxidase, peroxidase, and 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS) reactions. We present an alternative method for layer stacking using a magnetic apparatus, which facilitates fluidic dispersion and improves the reproducibility of tests performed on 3D-μPADs. We also provide the optimized designs for printing, facilitating further studies using 3D-μPADs.
Bioinspired Methodology for Artificial Olfaction
Raman, Baranidharan; Hertz, Joshua L.; Benkstein, Kurt D.; Semancik, Steve
2008-01-01
Artificial olfaction is a potential tool for noninvasive chemical monitoring. Application of “electronic noses” typically involves recognition of “pretrained” chemicals, while long-term operation and generalization of training to allow chemical classification of “unknown” analytes remain challenges. The latter analytical capability is critically important, as it is unfeasible to pre-expose the sensor to every analyte it might encounter. Here, we demonstrate a biologically inspired approach where the recognition and generalization problems are decoupled and resolved in a hierarchical fashion. Analyte composition is refined in a progression from general (e.g., target is a hydrocarbon) to precise (e.g., target is ethane), using highly optimized response features for each step. We validate this approach using a MEMS-based chemiresistive microsensor array. We show that this approach, a unique departure from existing methodologies in artificial olfaction, allows the recognition module to better mitigate sensor-aging effects and to better classify unknowns, enhancing the utility of chemical sensors for real-world applications. PMID:18855409
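The general-to-precise progression can be mimicked with ordinary classifiers: one coarse model for the chemical family, then a per-family model for the exact analyte. The sketch below uses synthetic data and scikit-learn as stand-ins for the MEMS sensor responses and the authors' optimized response features:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic sensor-array responses: 200 samples x 16 sensor features.
X = rng.normal(size=(200, 16))
family = rng.choice(["hydrocarbon", "alcohol"], size=200)     # coarse labels
species = np.where(family == "hydrocarbon",
                   rng.choice(["ethane", "propane"], size=200),
                   rng.choice(["ethanol", "methanol"], size=200))

# Stage 1: recognize the general chemical family.
coarse = RandomForestClassifier(random_state=0).fit(X, family)

# Stage 2: one refining classifier per family, trained only on that family,
# so each stage can use features optimized for its own decision.
fine = {fam: RandomForestClassifier(random_state=0).fit(X[family == fam],
                                                        species[family == fam])
        for fam in np.unique(family)}

def classify(x):
    """Hierarchical decision: family first, then species within the family."""
    fam = coarse.predict(x.reshape(1, -1))[0]
    return fam, fine[fam].predict(x.reshape(1, -1))[0]

print(classify(X[0]))
```

An unknown analyte that defeats the fine stage can still be reported at the family level, which is the generalization benefit the abstract describes.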
Zhao, Xuemei; Delgado, Liliana; Weiner, Russell; Laterza, Omar F.
2015-01-01
Thymus- and activation-regulated chemokine (TARC) in serum/plasma associates with the disease activity of atopic dermatitis (AD), and is a promising tool for assessing the response to the treatment of the disease. TARC also exists within platelets, with elevated levels detectable in AD patients. We examined the effects of pre-analytical factors on the quantitation of TARC in human EDTA plasma. TARC levels in platelet-free plasma were significantly lower than those in platelet-containing plasma. After freeze-thaw, TARC levels increased in platelet-containing plasma, but remained unchanged in platelet-free plasma, suggesting TARC was released from the platelets during the freeze-thaw process. In contrast, TARC levels were stable in serum independent of freeze-thaw. These findings underscore the importance of pre-analytical factors to TARC quantitation. Plasma TARC levels should be measured in platelet-free plasma for accurate quantitation. Pre-analytical factors influence the quantitation, interpretation, and implementation of circulating TARC as a biomarker for the development of AD therapeutics. PMID:28936246
Modeling of the Global Water Cycle - Analytical Models
Yongqiang Liu; Roni Avissar
2005-01-01
Both numerical and analytical models of coupled atmosphere and its underlying ground components (land, ocean, ice) are useful tools for modeling the global and regional water cycle. Unlike complex three-dimensional climate models, which need very large computing resources and involve a large number of complicated interactions often difficult to interpret, analytical...
NASA Astrophysics Data System (ADS)
Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan
2016-02-01
Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.
Marshall Space Flight Center's Virtual Reality Applications Program 1993
NASA Technical Reports Server (NTRS)
Hale, Joseph P., II
1993-01-01
A Virtual Reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. Other NASA Centers, most notably Ames Research Center (ARC), have contributed to the development of the VR enabling technologies and VR systems. This VR technology development has now reached a level of maturity where specific applications of VR as a tool can be considered. The objectives of the MSFC VR Applications Program are to develop, validate, and utilize VR as a Human Factors design and operations analysis tool and to assess and evaluate VR as a tool in other applications (e.g., training, operations development, mission support, teleoperations planning, etc.). The long-term goal of this technology program is to enable specialized Human Factors analyses earlier in the hardware and operations development process and develop more effective training and mission support systems. The capability to perform specialized Human Factors analyses earlier in the hardware and operations development process is required to better refine and validate requirements during the requirements definition phase. This leads to a more efficient design process where perturbations caused by late-occurring requirements changes are minimized. A validated set of VR analytical tools must be developed to enable a more efficient process for the design and development of space systems and operations. Similarly, training and mission support systems must exploit state-of-the-art computer-based technologies to maximize training effectiveness and enhance mission support. The approach of the VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies. These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems.
Himes, Sarah K; Scheidweiler, Karl B; Tassiopoulos, Katherine; Kacanek, Deborah; Hazra, Rohan; Rich, Kenneth; Huestis, Marilyn A
2013-02-05
A novel method for the simultaneous quantification of 16 antiretroviral (ARV) drugs and 4 metabolites in meconium was developed and validated. Quantification of 6 nucleoside/nucleotide reverse transcriptase inhibitors, 2 non-nucleoside reverse transcriptase inhibitors, 7 protease inhibitors, and 1 integrase inhibitor was achieved in 0.25 g of meconium. Specimen preparation included methanol homogenization and solid-phase extraction. Separate positive and negative polarity multiple reaction monitoring mode injections were required to achieve sufficient sensitivity. Linearity ranged from 10 to 75 ng/g up to 2500 ng/g for most analytes and 100-500 ng/g up to 25,000 ng/g for some; all correlation coefficients were ≥0.99. Extraction efficiencies from meconium were 32.8-119.5% with analytical recovery of 80.3-108.3% and total imprecision of 2.2-11.0% for all quantitative analytes. Two analytes with analytical recovery (70.0-138.5%) falling outside the 80-120% criteria range were considered semiquantitative. Matrix effects were -98.3-47.0% and -98.0-67.2% for analytes and internal standards, respectively. Analytes were stable (>75%) at room temperature for 24 h, 4 °C for 3 days, -20 °C for 3 freeze-thaw cycles over 3 days, and on the autosampler. Method applicability was demonstrated by analyzing meconium from HIV-uninfected infants born to HIV-positive mothers on ARV therapy. This method can be used as a tool to investigate the potential effects of in utero ARV exposure on childhood health and neurodevelopmental outcomes.
Plastid: nucleotide-resolution analysis of next-generation sequencing and genomics data.
Dunn, Joshua G; Weissman, Jonathan S
2016-11-22
Next-generation sequencing (NGS) informs many biological questions with unprecedented depth and nucleotide resolution. These assays have created a need for analytical tools that enable users to manipulate data nucleotide-by-nucleotide robustly and easily. Furthermore, because many NGS assays encode information jointly within multiple properties of read alignments - for example, in ribosome profiling, the locations of ribosomes are jointly encoded in alignment coordinates and length - analytical tools are often required to extract the biological meaning from the alignments before analysis. Many assay-specific pipelines exist for this purpose, but there remains a need for user-friendly, generalized, nucleotide-resolution tools that are not limited to specific experimental regimes or analytical workflows. Plastid is a Python library designed specifically for nucleotide-resolution analysis of genomics and NGS data. As such, Plastid is designed to extract assay-specific information from read alignments while retaining generality and extensibility to novel NGS assays. Plastid represents NGS and other biological data as arrays of values associated with genomic or transcriptomic positions, and contains configurable tools to convert data from a variety of sources to such arrays. Plastid also includes numerous tools to manipulate even discontinuous genomic features, such as spliced transcripts, with nucleotide precision. Plastid automatically handles conversion between genomic and feature-centric coordinates, accounting for splicing and strand, freeing users of burdensome accounting. Finally, Plastid's data models use consistent and familiar biological idioms, enabling even beginners to develop sophisticated analytical workflows with minimal effort. Plastid is a versatile toolkit that has been used to analyze data from multiple NGS assays, including RNA-seq, ribosome profiling, and DMS-seq. It forms the genomic engine of our ORF annotation tool, ORF-RATER, and is readily adapted to novel NGS assays. Examples, tutorials, and extensive documentation can be found at https://plastid.readthedocs.io .
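The central idiom, a vector of counts indexed by transcript position, is easy to illustrate without the library. The sketch below (plain numpy, hypothetical coordinates; it imitates the idea rather than Plastid's actual API) maps read 5' ends onto a spliced, nucleotide-resolution array:

```python
import numpy as np

# Hypothetical spliced transcript on the + strand: two exons,
# given as half-open genomic intervals.
exons = [(1000, 1200), (1500, 1800)]

# Build a genome-to-transcript coordinate map that skips the intron.
genome_to_tx = {}
tx_pos = 0
for start, end in exons:
    for g in range(start, end):
        genome_to_tx[g] = tx_pos
        tx_pos += 1

counts = np.zeros(tx_pos, dtype=int)

# Hypothetical alignments, reduced to the genomic coordinate of each
# read's 5' end (in the spirit of ribosome-profiling P-site mappings).
read_5p_ends = [1005, 1005, 1199, 1503, 1503, 1503, 1799]

for g in read_5p_ends:
    if g in genome_to_tx:          # drop intronic or out-of-bounds positions
        counts[genome_to_tx[g]] += 1

# 'counts' is now a nucleotide-resolution vector over the transcript,
# ready for windowing, normalization, or metagene averaging.
print(counts.sum(), counts[:8])
```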
Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin
2011-03-16
The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.
Google Analytics – Index of Resources
Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.
Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review
NASA Astrophysics Data System (ADS)
Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal
2017-08-01
Machining of steel material having hardness above 45 HRC (Hardness-Rockwell C) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance of hard turning. Various models in hard turning by cubic boron nitride tool have been reviewed, in an attempt to utilize appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces due to oblique cutting theory, the extended Lee and Shaffer's force model, chip formation, and progressive flank wear have been depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force based on the different cutting conditions and tool geometries so that an appropriate model can be used according to user requirements in hard turning.
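For reference, Usui's wear-rate model cited above is commonly written as follows (notation varies between papers; A and B are empirical constants fitted to the tool-workpiece pair):

```latex
\frac{dW}{dt} = A\,\sigma_t\,V_s\,\exp\!\left(-\frac{B}{\theta}\right)
```

where W is the wear volume, \sigma_t the normal stress on the tool face, V_s the sliding velocity at the tool-chip interface, and \theta the absolute interface temperature.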
Selection by consequences, behavioral evolution, and the price equation.
Baum, William M
2017-05-01
Price's equation describes evolution across time in simple mathematical terms. Although it is not a theory, but a derived identity, it is useful as an analytical tool. It affords lucid descriptions of genetic evolution, cultural evolution, and behavioral evolution (often called "selection by consequences") at different levels (e.g., individual vs. group) and at different time scales (local and extended). The importance of the Price equation for behavior analysis lies in its ability to precisely restate selection by consequences, thereby restating, or even replacing, the law of effect. Beyond this, the equation may be useful whenever one regards ontogenetic behavioral change as evolutionary change, because it describes evolutionary change in abstract, general terms. As an analytical tool, the behavioral Price equation is an excellent aid in understanding how behavior changes within organisms' lifetimes. For example, it illuminates evolution of response rate, analyses of choice in concurrent schedules, negative contingencies, and dilemmas of self-control. © 2017 Society for the Experimental Analysis of Behavior.
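For readers meeting it here for the first time, the Price equation is the identity (standard form; w_i is the fitness or success of entity i, z_i its trait value, overbars denote population means, and \Delta the change from one generation or time step to the next):

```latex
\bar{w}\,\Delta\bar{z} \;=\; \operatorname{Cov}(w_i, z_i) \;+\; \operatorname{E}\!\left(w_i\,\Delta z_i\right)
```

The covariance term captures change due to selection, success covarying with the trait, while the expectation term captures change arising during transmission; this split is what lets the identity restate selection by consequences at different levels and time scales.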
A Bayesian deconvolution strategy for immunoprecipitation-based DNA methylome analysis
Down, Thomas A.; Rakyan, Vardhman K.; Turner, Daniel J.; Flicek, Paul; Li, Heng; Kulesha, Eugene; Gräf, Stefan; Johnson, Nathan; Herrero, Javier; Tomazou, Eleni M.; Thorne, Natalie P.; Bäckdahl, Liselotte; Herberth, Marlis; Howe, Kevin L.; Jackson, David K.; Miretti, Marcos M.; Marioni, John C.; Birney, Ewan; Hubbard, Tim J. P.; Durbin, Richard; Tavaré, Simon; Beck, Stephan
2009-01-01
DNA methylation is an indispensible epigenetic modification of mammalian genomes. Consequently there is great interest in strategies for genome-wide/whole-genome DNA methylation analysis, and immunoprecipitation-based methods have proven to be a powerful option. Such methods are rapidly shifting the bottleneck from data generation to data analysis, necessitating the development of better analytical tools. Until now, a major analytical difficulty associated with immunoprecipitation-based DNA methylation profiling has been the inability to estimate absolute methylation levels. Here we report the development of a novel cross-platform algorithm – Bayesian Tool for Methylation Analysis (Batman) – for analyzing Methylated DNA Immunoprecipitation (MeDIP) profiles generated using arrays (MeDIP-chip) or next-generation sequencing (MeDIP-seq). The latter is an approach we have developed to elucidate the first high-resolution whole-genome DNA methylation profile (DNA methylome) of any mammalian genome. MeDIP-seq/MeDIP-chip combined with Batman represent robust, quantitative, and cost-effective functional genomic strategies for elucidating the function of DNA methylation. PMID:18612301
A Tool Supporting Collaborative Data Analytics Workflow Design and Management
NASA Astrophysics Data System (ADS)
Zhang, J.; Bao, Q.; Lee, T. J.
2016-12-01
Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models that have emerged in Earth science. Existing data-oriented workflow tools, however, are not suitable to support collaborative design of such a workflow: to name a few shortcomings, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they cannot capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.
ERIC Educational Resources Information Center
Kishbaugh, Tara L. S.; Cessna, Stephen; Horst, S. Jeanne; Leaman, Lori; Flanagan, Toni; Neufeld, Doug Graber; Siderhurst, Matthew
2012-01-01
Herein we report the development of an analytic rubric bank to assess non-content learning, namely higher order cognitive skills, the understanding of the nature of science, and effective scientific communication skills in student research projects. Preliminary findings indicate that use of this tool enhances our students' learning in these areas,…
David L. Peterson; Daniel L. Schmoldt
1999-01-01
The National Park Service and other public agencies are increasing their emphasis on inventory and monitoring (I&M) programs to obtain the information needed to infer changes in resource conditions and trigger management responses. A few individuals on a planning team can develop I&M programs, although a focused workshop is more effective. Workshops are...
Trajectory Calculator for Finite-Radius Cutter on a Lathe
NASA Technical Reports Server (NTRS)
Savchenkov, Anatoliy; Strekalov, Dmitry; Yu, Nan
2009-01-01
A computer program calculates the two-dimensional trajectory (radial vs. axial position) of a finite-radius-of-curvature cutting tool on a lathe so as to cut a workpiece to a piecewise-continuous, analytically defined surface of revolution. (In the original intended application, the tool is a diamond cutter, and the workpiece is made of a crystalline material and is to be formed into an optical resonator disk.) The program also calculates an optimum cutting speed as F/L, where F is a material-dependent empirical factor and L is the effective instantaneous length of the cutting edge.
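Geometrically, the tool-center trajectory is an offset curve: the center of a nose of radius R rides one radius away from the target surface along the local normal. A minimal sketch of that offset and of the F/L speed rule (toy profile and constants; the engaged-edge-length model here is a crude illustrative assumption, not the program's actual formulation):

```python
import numpy as np

R = 0.5     # tool nose radius (mm), illustrative
F = 120.0   # empirical material-dependent speed factor, illustrative

# Target surface of revolution: radius r as a function of axial position z.
z = np.linspace(0.0, 10.0, 500)
r = 5.0 + 0.02 * z**2           # analytically defined toy profile

# Unit outward normal to the profile in the (z, r) plane.
drdz = np.gradient(r, z)
norm = np.hypot(1.0, drdz)
nz, nr = -drdz / norm, 1.0 / norm

# Tool-center trajectory: offset the surface by the nose radius.
z_tool = z + R * nz
r_tool = r + R * nr

# Crude model of the effective engaged edge length L: proportional to the
# contact arc, which grows with local slope. Cutting speed follows F / L.
L_eff = R * (np.arctan(np.abs(drdz)) + 0.05)
speed = F / L_eff
```

A piecewise-continuous profile would be handled segment by segment, with the same offset construction applied to each analytic piece.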
New EVSE Analytical Tools/Models: Electric Vehicle Infrastructure Projection Tool (EVI-Pro)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Eric W; Rames, Clement L; Muratori, Matteo
This presentation addresses the fundamental question of how much charging infrastructure is needed in the United States to support PEVs. It complements ongoing EVSE initiatives by providing a comprehensive analysis of national PEV charging infrastructure requirements. The result is a quantitative estimate for a U.S. network of non-residential (public and workplace) EVSE that would be needed to support broader PEV adoption. The analysis provides guidance to public and private stakeholders who are seeking to provide nationwide charging coverage, improve the EVSE business case by maximizing station utilization, and promote effective use of private/public infrastructure investments.
Model and Analytic Processes for Export License Assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.
2011-09-29
This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision-framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessment. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Net, System Dynamics and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.
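As a flavor of what a micro-level Bayes net might look like, here is a toy sketch with the pgmpy library (structure, variables, and probabilities are invented for illustration; they are not the project's model):

```python
from pgmpy.models import BayesianNetwork   # 'BayesianModel' in older pgmpy
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Toy structure: dual-use status and end-user credibility drive diversion risk.
model = BayesianNetwork([("DualUse", "Risk"), ("EndUser", "Risk")])

cpd_dual = TabularCPD("DualUse", 2, [[0.7], [0.3]])   # 0 = no, 1 = yes
cpd_user = TabularCPD("EndUser", 2, [[0.8], [0.2]])   # 0 = credible, 1 = suspect
cpd_risk = TabularCPD(
    "Risk", 2,
    # P(Risk | DualUse, EndUser), columns over evidence-state combinations.
    [[0.95, 0.70, 0.60, 0.20],   # Risk = low
     [0.05, 0.30, 0.40, 0.80]],  # Risk = high
    evidence=["DualUse", "EndUser"], evidence_card=[2, 2])

model.add_cpds(cpd_dual, cpd_user, cpd_risk)
assert model.check_model()

# Posterior risk for a dual-use item going to a suspect end user.
posterior = VariableElimination(model).query(
    ["Risk"], evidence={"DualUse": 1, "EndUser": 1})
print(posterior)
```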
Genomics Portals: integrative web-platform for mining genomics data.
Shinde, Kaustubh; Phatak, Mukta; Johannes, Freudenberg M; Chen, Jing; Li, Qian; Vineet, Joshi K; Hu, Zhen; Ghosh, Krishnendu; Meller, Jaroslaw; Medvedovic, Mario
2010-01-13
A large amount of experimental data generated by modern high-throughput technologies is available through various public repositories. Our knowledge about molecular interaction networks, functional biological pathways and transcriptional regulatory modules is rapidly expanding, and is being organized in lists of functionally related genes. Jointly, these two sources of information hold a tremendous potential for gaining new insights into functioning of living systems. Genomics Portals platform integrates access to an extensive knowledge base and a large database of human, mouse, and rat genomics data with basic analytical visualization tools. It provides the context for analyzing and interpreting new experimental data and the tool for effective mining of a large number of publicly available genomics datasets stored in the back-end databases. The uniqueness of this platform lies in the volume and the diversity of genomics data that can be accessed and analyzed (gene expression, ChIP-chip, ChIP-seq, epigenomics, computationally predicted binding sites, etc), and the integration with an extensive knowledge base that can be used in such analysis. The integrated access to primary genomics data, functional knowledge and analytical tools makes Genomics Portals platform a unique tool for interpreting results of new genomics experiments and for mining the vast amount of data stored in the Genomics Portals backend databases. Genomics Portals can be accessed and used freely at http://GenomicsPortals.org.
Review: visual analytics of climate networks
NASA Astrophysics Data System (ADS)
Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.
2015-09-01
Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.
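The "standard procedure" referred to above, building a network from the statistical interrelationships in a climate field and extracting measures from it, fits in a few lines. A sketch with synthetic data (the 0.3 correlation threshold and field dimensions are arbitrary choices for illustration):

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(42)

# Synthetic climate field: 50 grid points x 500 time steps of anomalies.
field = rng.normal(size=(50, 500))

# Climate network: link grid points whose anomaly series correlate strongly.
corr = np.corrcoef(field)
np.fill_diagonal(corr, 0.0)
adjacency = (np.abs(corr) > 0.3).astype(int)   # illustrative threshold

G = nx.from_numpy_array(adjacency)

# Typical node-level measures whose spatial patterns are then visualised,
# e.g. as maps in tools like CGV or GTX.
degree = dict(G.degree())
betweenness = nx.betweenness_centrality(G)
print("mean degree:", sum(degree.values()) / G.number_of_nodes())
```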
A predictive pilot model for STOL aircraft landing
NASA Technical Reports Server (NTRS)
Kleinman, D. L.; Killingsworth, W. R.
1974-01-01
An optimal control approach has been used to model pilot performance during STOL flare and landing. The model is used to predict pilot landing performance for three STOL configurations, each having a different level of automatic control augmentation. Model predictions are compared with flight simulator data. It is concluded that the model can be an effective design tool for analytically studying the effects of display modifications, different stability augmentation systems, and proposed changes in the landing area geometry.
Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs
ERIC Educational Resources Information Center
Veregin, Howard
2015-01-01
Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…
This presentation will provide a conceptual preview of an Area of Review (AoR) tool being developed by EPA’s Office of Research and Development that applies analytic and semi-analytical mathematical solutions to elucidate potential risks associated with geologic sequestration of ...
The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate
ERIC Educational Resources Information Center
Cárdenas-Navia, Isabel; Fitzgerald, Brian K.
2015-01-01
New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…
SensePath: Understanding the Sensemaking Process Through Analytic Provenance.
Nguyen, Phong H; Xu, Kai; Wheat, Ashley; Wong, B L William; Attfield, Simon; Fields, Bob
2016-01-01
Sensemaking is described as the process of comprehension, finding meaning and gaining insight from information, producing new knowledge and informing further action. Understanding the sensemaking process allows building effective visual analytics tools to make sense of large and complex datasets. Currently, understanding this process is often a manual and time-consuming undertaking: researchers collect observation data, transcribe screen capture videos and think-aloud recordings, identify recurring patterns, and eventually abstract the sensemaking process into a general model. In this paper, we propose a general approach to facilitate such a qualitative analysis process, and introduce a prototype, SensePath, to demonstrate the application of this approach with a focus on browser-based online sensemaking. The approach is based on a study of a number of qualitative research sessions, including observations of users performing sensemaking tasks and post hoc analyses to uncover their sensemaking processes. Based on the study results and a follow-up participatory design session with HCI researchers, we decided to focus on the transcription and coding stages of thematic analysis. SensePath automatically captures a user's sensemaking actions, i.e., analytic provenance, and provides multi-linked views to support their further analysis. A number of other requirements elicited from the design session are also implemented in SensePath, such as easy integration with existing qualitative analysis workflows and unobtrusiveness for participants. The tool was used by an experienced HCI researcher to analyze two sensemaking sessions. The researcher found the tool intuitive; it considerably reduced analysis time and allowed a better understanding of the sensemaking process.
Kumar, B Vinodh; Mohan, Thuthi
2018-01-01
Six Sigma is one of the most popular quality management system tools employed for process improvement. Six Sigma methods are usually applied when the outcome of a process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metrics for individual parameters, and to follow the Westgard guidelines on the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and the data required for it were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are the IQC coefficient of variation percentage and the External Quality Assurance Scheme (EQAS) bias percentage for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma; for the level 2 IQC, the same four analytes as in level 1 showed a performance of ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For all analytes below the 6 sigma level, the quality goal index (QGI) was <0.8, indicating that the area requiring improvement was imprecision, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. This study shows that sigma metrics are a good quality tool for assessing the analytical performance of a clinical chemistry laboratory. Sigma metric analysis thus provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.
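The sigma-metric arithmetic described above is simple enough to sketch. The formulas below, sigma = (TEa − |bias|)/CV and QGI = bias/(1.5 × CV), are the conventional ones from the laboratory-QC literature; the total allowable error (TEa) values would come from a source such as CLIA and the numbers shown are placeholders, not data from this study.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (TEa - |bias|) / CV, all inputs in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    """QGI = bias / (1.5 * CV); <0.8 suggests imprecision,
    >1.2 suggests inaccuracy, values in between suggest both."""
    return bias_pct / (1.5 * cv_pct)

# Illustrative values only (not taken from the study):
for analyte, tea, bias, cv in [("ALP", 30.0, 4.0, 3.5),
                               ("Cholesterol", 10.0, 6.0, 3.0)]:
    s = sigma_metric(tea, bias, cv)
    q = quality_goal_index(bias, cv)
    print(f"{analyte}: sigma = {s:.1f}, QGI = {q:.2f}")
```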
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.
Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigations are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine the data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.
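The data-to-threat-model mapping described here can be pictured as a coverage check: each ATT&CK technique needs certain data sources, and a gap appears wherever the analyst's available data cannot support a technique. The technique-to-data mapping and the data-source names below are illustrative stand-ins, not the paper's actual mapping.

```python
# Hypothetical mapping of ATT&CK technique IDs to required data sources.
required = {
    "T1078 Valid Accounts":    {"authentication logs", "cloud audit logs"},
    "T1110 Brute Force":       {"authentication logs"},
    "T1571 Non-Standard Port": {"netflow", "firewall logs"},
}

available = {"authentication logs", "netflow"}  # what this analyst has

for technique, needs in required.items():
    missing = needs - available
    status = "covered" if not missing else f"GAP: missing {sorted(missing)}"
    print(f"{technique}: {status}")
```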
A workflow learning model to improve geovisual analytics utility
Roth, Robert E; MacEachren, Alan M; McCabe, Craig A
2011-01-01
Introduction This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. Objectives The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important as, if not more important than, iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. Methodology The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. Results/Conclusions In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009. PMID:21983545
A workflow learning model to improve geovisual analytics utility.
Roth, Robert E; MacEachren, Alan M; McCabe, Craig A
2009-01-01
INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important as, if not more important than, iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. RESULTS/CONCLUSIONS: In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009.
17 CFR 49.17 - Access to SDR data.
Code of Federal Regulations, 2013 CFR
2013-04-01
... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...
17 CFR 49.17 - Access to SDR data.
Code of Federal Regulations, 2014 CFR
2014-04-01
... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...
Assessment Tools' Indicators for Sustainability in Universities: An Analytical Overview
ERIC Educational Resources Information Center
Alghamdi, Naif; den Heijer, Alexandra; de Jonge, Hans
2017-01-01
Purpose: The purpose of this paper is to analyse 12 assessment tools of sustainability in universities and develop the structure and the contents of these tools to be more intelligible. The configuration of the tools reviewed highlights indicators that clearly communicate only the essential information. This paper explores how the theoretical…
MRMPlus: an open source quality control and assessment tool for SRM/MRM assay development.
Aiyetan, Paul; Thomas, Stefani N; Zhang, Zhen; Zhang, Hui
2015-12-12
Selected and multiple reaction monitoring involves monitoring a multiplexed assay of proteotypic peptides and associated transitions in mass spectrometry runs. To establish peptides and their associated transitions as stable, quantifiable, and reproducible representatives of proteins of interest, experimental and analytical validation is required. However, inadequate and disparate analytical tools and validation methods predispose assay performance measures to errors and inconsistencies. Implemented as a freely available, open-source tool in the platform-independent Java programming language, MRMPlus computes analytical measures as recommended recently by the Clinical Proteomics Tumor Analysis Consortium Assay Development Working Group for "Tier 2" assays - that is, non-clinical assays sufficient to measure changes due to both biological and experimental perturbations. Computed measures include limit of detection, lower limit of quantification, linearity, carry-over, partial validation of specificity, and upper limit of quantification. MRMPlus streamlines the assay development analytical workflow and therefore minimizes error predisposition. MRMPlus may also be used for performance estimation for targeted assays not described by the Assay Development Working Group. MRMPlus' source code and compiled binaries can be freely downloaded from https://bitbucket.org/paiyetan/mrmplusgui and https://bitbucket.org/paiyetan/mrmplusgui/downloads, respectively.
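For readers unfamiliar with these figures of merit, a common way to estimate detection limits from a calibration experiment is the ICH-style rule LOD ≈ 3.3·σ/S and LLOQ ≈ 10·σ/S, where S is the calibration slope and σ the residual standard deviation. The sketch below implements that generic rule; it is not a reproduction of MRMPlus's actual implementation, and the calibration numbers are made up.

```python
import numpy as np

def detection_limits(conc, response):
    """Estimate LOD/LLOQ from a linear calibration series using the
    ICH-style rules LOD = 3.3*sigma/S and LLOQ = 10*sigma/S."""
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)          # 2 fitted parameters
    return 3.3 * sigma / slope, 10.0 * sigma / slope

conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])    # e.g. fmol on column
resp = np.array([2.1, 4.3, 10.4, 20.8, 41.9])   # e.g. peak-area ratio
lod, lloq = detection_limits(conc, resp)
```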
Analytical challenges for conducting rapid metabolism characterization for QIVIVE.
Tolonen, Ari; Pelkonen, Olavi
2015-06-05
For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is the tool of choice for the majority of relatively lipophilic organic molecules, linked with an LC separation tool and simultaneous UV detection. However, additional techniques, such as gas chromatography, radiometric measurements and NMR, are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are some partially opposing needs: detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and the highest possible throughput capacity of the in vitro set-up and the analytical tool. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, while attempting to pinpoint factors associated with sample preparation, testing conditions, and the strengths and weaknesses of a particular technique available for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Big data analytics in immunology: a knowledge-based approach.
Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir
2014-01-01
With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and to bridge the knowledge gap and the application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into an analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we streamlined the normally time-consuming process of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.
Update on SLD Engineering Tools Development
NASA Technical Reports Server (NTRS)
Miller, Dean R.; Potapczuk, Mark G.; Bond, Thomas H.
2004-01-01
The airworthiness authorities (FAA, JAA, Transport Canada) will be releasing a draft rule in the 2006 timeframe concerning the operation of aircraft in a Supercooled Large Droplet (SLD) environment aloft. The draft rule will require aircraft manufacturers to demonstrate that their aircraft can operate safely in an SLD environment for a period of time to facilitate a safe exit from the condition. It is anticipated that aircraft manufacturers will require a capability to demonstrate compliance with this rule via experimental means (icing tunnels or tankers) and by analytical means (ice prediction codes). Since existing icing research facilities and analytical codes were not developed to account for SLD conditions, current engineering tools are not adequate to support compliance activities in SLD conditions. Therefore, existing capabilities need to be augmented to include SLD conditions. In response to this need, NASA and its partners conceived a strategy or Roadmap for developing experimental and analytical SLD simulation tools. Following review and refinement by the airworthiness authorities and other international research partners, this technical strategy has been crystallized into a project plan to guide the SLD Engineering Tool Development effort. This paper will provide a brief overview of the latest version of the project plan and technical rationale, and provide a status of selected SLD Engineering Tool Development research tasks which are currently underway.
76 FR 70517 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-14
... requested. These systems generally also provide analytics, spreadsheets, and other tools designed to enable funds to analyze the data presented, as well as communication tools to process fund instructions...
Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.
Scott, Bradley; Wilcock, Anne
2006-01-01
Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex, and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.
NASA Astrophysics Data System (ADS)
Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.
2016-11-01
The modelling of a family of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, has as its goal the determination of the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the geometrical generating error, as a component of the total error. Modelling the generation process allows highlighting the potential errors of the generating tool, in order to correct its profile before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of “relative generating trajectories”. The analytical foundations are presented, along with some applications to known models of rack-gear type tools used on Maag teething machines.
Yang, X I A; Meneveau, C
2017-04-13
In recent years, there has been growing interest in large-eddy simulation (LES) modelling of atmospheric boundary layers interacting with arrays of wind turbines on complex terrain. However, such terrain typically contains geometric features and roughness elements reaching down to small scales that typically cannot be resolved numerically. Thus subgrid-scale models for the unresolved features of the bottom roughness are needed for LES. Such knowledge is also required to model the effects of the ground surface 'underneath' a wind farm. Here we adapt a dynamic approach to determine subgrid-scale roughness parametrizations and apply it for the case of rough surfaces composed of cuboidal elements with broad size distributions, containing many scales. We first investigate the flow response to ground roughness of a few scales. LES with the dynamic roughness model which accounts for the drag of unresolved roughness is shown to provide resolution-independent results for the mean velocity distribution. Moreover, we develop an analytical roughness model that accounts for the sheltering effects of large-scale on small-scale roughness elements. Taking into account the shading effect, constraints from fundamental conservation laws, and assumptions of geometric self-similarity, the analytical roughness model is shown to provide analytical predictions that agree well with roughness parameters determined from LES.This article is part of the themed issue 'Wind energy in complex terrains'. © 2017 The Author(s).
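As a point of comparison for the kind of parametrization derived in that record, the widely used Macdonald-type morphometric model estimates displacement height and roughness length from plan and frontal area densities of the obstacles. The sheltering physics described above goes beyond this, so the sketch below is a simpler stand-in with the usual empirical constants, shown only for orientation.

```python
import numpy as np

def macdonald_roughness(h, lam_p, lam_f, cd=1.2, kappa=0.4,
                        alpha=4.43, beta=1.0):
    """Macdonald et al. (1998)-style estimates of displacement height d
    and roughness length z0 for an array of obstacles of height h.
    lam_p: plan area density; lam_f: frontal area density.
    A simpler morphometric model than the sheltering-aware analytical
    model of the paper."""
    d = h * (1.0 + alpha**(-lam_p) * (lam_p - 1.0))
    z0 = h * (1.0 - d / h) * np.exp(
        -(0.5 * beta * cd / kappa**2 * (1.0 - d / h) * lam_f) ** -0.5)
    return d, z0

d, z0 = macdonald_roughness(h=10.0, lam_p=0.25, lam_f=0.15)
```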
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kroniger, K; Herzog, M; Landry, G
2015-06-15
Purpose: We describe and demonstrate a fast analytical tool for prompt-gamma emission prediction based on filter functions applied to the depth dose profile. We present the implementation in a treatment planning system (TPS) of the same algorithm for positron emitter distributions. Methods: The prediction of the desired observable is based on the convolution of filter functions with the depth dose profile. For both prompt-gammas and positron emitters, the results of Monte Carlo simulations (MC) are compared with those of the analytical tool. For prompt-gamma emission from inelastic proton-induced reactions, homogeneous and inhomogeneous phantoms alongside patient data are used as irradiation targets of mono-energetic proton pencil beams. The accuracy of the tool is assessed in terms of the shape of the analytically calculated depth profiles and their absolute yields, compared to MC. For the positron emitters, the method is implemented in a research RayStation TPS and compared to MC predictions. Digital phantoms and patient data are used, and positron emitter spatial density distributions are analyzed. Results: Calculated prompt-gamma profiles agree with MC within 3% in terms of absolute yield and reproduce the correct shape. Based on an arbitrary reference material and by means of 6 filter functions (one per chemical element), profiles in any other material composed of those elements can be predicted. The TPS-implemented algorithm is accurate enough to enable, via the analytically calculated positron emitter profiles, detection of range differences between the TPS and MC with errors of the order of 1–2 mm. Conclusion: The proposed analytical method predicts prompt-gamma and positron emitter profiles which generally agree with the distributions obtained by a full MC. The implementation of the tool in a TPS shows that reliable profiles can be obtained directly from the dose calculated by the TPS, without the need of a full MC simulation.
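The core of the filter-function idea is a per-element convolution of the depth-dose curve with a precomputed kernel, summed with elemental weights. The sketch below shows that structure only; the Gaussian kernel and the toy dose curve are placeholders, whereas the actual filter functions would be fitted against Monte Carlo reference data as the abstract describes.

```python
import numpy as np

def prompt_gamma_profile(depth_dose, weights, kernels):
    """Predict a prompt-gamma depth profile as a weighted sum of
    per-element convolutions of the depth-dose profile:
        PG(z) = sum_e w_e * (f_e conv D)(z)
    weights: {element: mass fraction}; kernels: {element: 1-D filter}."""
    pg = np.zeros_like(depth_dose)
    for element, w in weights.items():
        pg += w * np.convolve(depth_dose, kernels[element], mode="same")
    return pg

z = np.linspace(0.0, 200.0, 401)                  # depth in mm
dose = np.exp(-((z - 150.0) ** 2) / 200.0) + 0.3  # toy Bragg-like curve
gauss = np.exp(-np.linspace(-3, 3, 31) ** 2)      # placeholder kernel
pg = prompt_gamma_profile(dose, {"O": 0.65, "C": 0.18},
                          {"O": gauss, "C": gauss})
```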
A Lean Six Sigma approach to the improvement of the selenium analysis method.
Cloete, Bronwyn C; Bester, André
2012-11-02
Reliable results represent the pinnacle assessment of the quality of an analytical laboratory, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of scientific method that is empirical, inductive, deductive, and systematic, and that relies on data and facts. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed by using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and represents both a management discipline and a standardised approach to problem solving and process optimisation.
Mechanistic characterization of chloride interferences in electrothermal atomization systems
Shekiro, J.M.; Skogerboe, R.K.; Taylor, Howard E.
1988-01-01
A computer-controlled spectrometer with a photodiode array detector has been used for wavelength- and temperature-resolved characterization of the vapor produced by an electrothermal atomizer. The system has been used to study the chloride matrix interference on the atomic absorption spectrometric determination of manganese and copper. The suppression of manganese and copper atom populations by matrix chlorides such as those of calcium and magnesium is due to the gas-phase formation of an analyte chloride species followed by the diffusion of significant fractions of these species from the atom cell prior to completion of the atomization process. The analyte chloride species cannot be formed when matrix chlorides with metal-chloride bond dissociation energies above those of the analyte chlorides are the principal entities present. The results indicate that multiple-wavelength spectrometry used to obtain temperature-resolved spectra is a viable tool in the mechanistic characterization of interference effects observed with electrothermal atomization systems. © 1988 American Chemical Society.
Optimization techniques applied to passive measures for in-orbit spacecraft survivability
NASA Technical Reports Server (NTRS)
Mog, Robert A.; Price, D. Marvin
1991-01-01
Spacecraft designers have always been concerned about the effects of meteoroid impacts on mission safety. The engineering solution to this problem has generally been to erect a bumper or shield placed outboard from the spacecraft wall to disrupt/deflect the incoming projectiles. Spacecraft designers have a number of tools at their disposal to aid in the design process. These include hypervelocity impact testing, analytic impact predictors, and hydrodynamic codes. Analytic impact predictors generally provide the best quick-look estimate of design tradeoffs. The most complete way to determine the characteristics of an analytic impact predictor is through optimization of the protective structures design problem formulated with the predictor of interest. Space Station Freedom protective structures design insight is provided through the coupling of design/material requirements, hypervelocity impact phenomenology, meteoroid and space debris environment sensitivities, optimization techniques and operations research strategies, and mission scenarios. Major results are presented.
In Vitro and In Vivo SERS Biosensing for Disease Diagnosis.
Moore, T Joshua; Moody, Amber S; Payne, Taylor D; Sarabia, Grace M; Daniel, Alyssa R; Sharma, Bhavya
2018-05-11
For many disease states, positive outcomes are directly linked to early diagnosis, where therapeutic intervention would be most effective. Recently, trends in disease diagnosis have focused on the development of label-free sensing techniques that are sensitive to low analyte concentrations found in the physiological environment. Surface-enhanced Raman spectroscopy (SERS) is a powerful vibrational spectroscopy that allows for label-free, highly sensitive, and selective detection of analytes through the amplification of localized electric fields on the surface of a plasmonic material when excited with monochromatic light. This results in enhancement of the Raman scattering signal, which allows for the detection of low concentration analytes, giving rise to the use of SERS as a diagnostic tool for disease. Here, we present a review of recent developments in the field of in vivo and in vitro SERS biosensing for a range of disease states including neurological disease, diabetes, cardiovascular disease, cancer, and viral disease.
Synthesis of Feedback Controller for Chaotic Systems by Means of Evolutionary Techniques
NASA Astrophysics Data System (ADS)
Senkerik, Roman; Oplatkova, Zuzana; Zelinka, Ivan; Davendra, Donald; Jasek, Roman
2011-06-01
This research deals with the synthesis of control laws for three selected discrete chaotic systems by means of analytic programming. The novelty of the approach is that a tool for symbolic regression—analytic programming—is used for this kind of difficult problem. The paper contains descriptions of analytic programming as well as of the chaotic systems and the cost function used. For experimentation, the Self-Organizing Migrating Algorithm (SOMA) with analytic programming was used.
Modeling and Predicting the Stress Relaxation of Composites with Short and Randomly Oriented Fibers
Obaid, Numaira; Sain, Mohini
2017-01-01
The addition of short fibers has been experimentally observed to slow the stress relaxation of viscoelastic polymers, producing a change in the relaxation time constant. Our recent study attributed this effect of fibers on stress relaxation behavior to the interfacial shear stress transfer at the fiber-matrix interface. This model explained the effect of fiber addition on stress relaxation without the need to postulate structural changes at the interface. In our previous study, we developed an analytical model for the effect of fully aligned short fibers, and the model predictions were successfully compared to finite element simulations. However, in most industrial applications of short-fiber composites, fibers are not aligned, and hence it is necessary to examine the time dependence of viscoelastic polymers containing randomly oriented short fibers. In this study, we propose an analytical model to predict the stress relaxation behavior of short-fiber composites where the fibers are randomly oriented. The model predictions were compared to results obtained from Monte Carlo finite element simulations, and good agreement between the two was observed. The analytical model provides an excellent tool to accurately predict the stress relaxation behavior of randomly oriented short-fiber composites. PMID:29053601
SMARTe is being developed to give stakeholders information resources, analytical tools, communication strategies, and a decision analysis approach to be able to make better decisions regarding future uses of property. The development of the communication tools and decision analys...
Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali
2016-03-01
The most important objectives frequently found in bio-analytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anti-cancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques. Therefore, keeping in view the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques involved in the selective recognition and quantification of uracil and 5-fluorouracil in biological and pharmaceutical samples. The study revealed that implementation of a molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluorouracil has proven effective, as it obviates problems related to tedious separation techniques, owing to protein binding and drastic interferences, from the complex matrices in real samples such as blood plasma and serum.
NASA Technical Reports Server (NTRS)
Barber, Peter W.; Demerdash, Nabeel A. O.; Wang, R.; Hurysz, B.; Luo, Z.
1991-01-01
The goal is to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) develop analytical tools (models and computer programs); (2) conduct parameterization studies; (3) predict the global space station EMI environment; and (4) provide a basis for modification of EMI standards.
A review of recent advances in risk analysis for wildfire management
Carol Miller; Alan A. Ager
2012-01-01
Risk analysis evolved out of the need to make decisions concerning highly stochastic events, and is well suited to analyze the timing, location and potential effects of wildfires. Over the past 10 years, the application of risk analysis to wildland fire management has seen steady growth with new risk-based analytical tools that support a wide range of fire and fuels...
NASA Astrophysics Data System (ADS)
Noda, Isao
2014-07-01
Noteworthy experimental practices, which are advancing the frontiers of the field of two-dimensional (2D) correlation spectroscopy, are reviewed with a focus on the various perturbation methods currently practiced to induce spectral changes, pertinent examples of applications in various fields, and the types of analytical probes employed. The types of perturbation methods found in the published literature are very diverse, encompassing both dynamic and static effects. Although a sizable portion of publications report the use of dynamic perturbations, a much greater number of studies employ static effects, especially that of temperature. The fields of application covered by the literature are also very broad, ranging from fundamental research to practical applications in a number of physical, chemical and biological systems, such as synthetic polymers, composites and biomolecules. Aside from IR spectroscopy, which is the most commonly used tool, many other analytical probes are used in 2D correlation analysis. The ever-expanding trend in the depth, breadth and versatility of 2D correlation spectroscopy techniques and their broad applications all point to the robust and healthy state of the field.
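At the computational heart of generalized 2D correlation spectroscopy are Noda's synchronous and asynchronous spectra, formed from mean-centred dynamic spectra and the Hilbert-Noda transformation. A minimal sketch of that standard computation:

```python
import numpy as np

def two_d_correlation(spectra):
    """Noda's generalized 2D correlation spectra.
    spectra: (n_perturbation, n_wavenumber), rows ordered along the
    perturbation variable (time, temperature, ...)."""
    y = spectra - spectra.mean(axis=0)          # dynamic spectra
    n = y.shape[0]
    sync = y.T @ y / (n - 1)                    # synchronous spectrum
    idx = np.arange(n)
    diff = idx[None, :] - idx[:, None]
    # Hilbert-Noda transformation matrix (zero on the diagonal)
    noda = np.where(diff == 0, 0.0,
                    1.0 / (np.pi * np.where(diff == 0, 1, diff)))
    asyn = y.T @ (noda @ y) / (n - 1)           # asynchronous spectrum
    return sync, asyn
```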
Sánchez, C; Ortega, B; Wei, J L; Tang, J; Capmany, J
2013-03-25
We provide an analytical study of the propagation effects of a directly modulated OOFDM signal through a dispersive fiber and subsequent photo-detection. The analysis includes the effects of the laser operation point and the interplay between chromatic dispersion and laser chirp. The final expression allows one to understand the physics behind the transmission of a multi-carrier signal in the presence of residual frequency modulation, and the description of the induced intermodulation distortion gives a detailed insight into the different intermodulation products which impair the recovered signal at the receiver end. Numerical comparisons between transmission simulation results and those provided by evaluating the expression obtained are carried out for different laser operation points. Results obtained by changing the fiber length and laser parameters and by using single-mode fiber with negative and positive dispersion are calculated in order to demonstrate the validity and versatility of the theory provided in this paper. Therefore, a novel analytical formulation is presented as a versatile tool for the description and study of IM/DD OOFDM systems with variable design parameters.
Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays
Hsieh, Helen V.; Dantzler, Jeffrey L.; Weigl, Bernhard H.
2017-01-01
Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor's office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we review conventional optimization and then focus on the latter, outlining analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness. PMID:28555034
DOE Office of Scientific and Technical Information (OSTI.GOV)
Femec, D.A.
This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents some of the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices.
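To make the report's subject concrete, here is a minimal sketch of what a sample-tracking schema of this kind might look like. The table and column names are hypothetical illustrations, not the RML's actual design or the output of CREATE-SCHEMA.

```python
import sqlite3

# Hypothetical minimal schema for a sample tracking database.
schema = """
CREATE TABLE samples (
    sample_id    TEXT PRIMARY KEY,
    matrix       TEXT,                -- e.g. soil, water, filter
    received_on  TEXT                 -- ISO date
);
CREATE TABLE analyses (
    analysis_id  INTEGER PRIMARY KEY,
    sample_id    TEXT REFERENCES samples(sample_id),
    analyte      TEXT,
    result       REAL,
    units        TEXT,
    analyst      TEXT,
    completed_on TEXT
);
"""
con = sqlite3.connect(":memory:")
con.executescript(schema)
con.execute("INSERT INTO samples VALUES ('S-001', 'water', '1995-03-01')")
```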
Exhaled breath condensate – from an analytical point of view
Dodig, Slavica; Čepelak, Ivana
2013-01-01
Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process of EBC analysis was investigated across the pre-analytical (formation, collection, storage of EBC), analytical (sensitivity of applied methods, standardization) and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. The limitations relating to the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: concentrations of EBC constituents are low, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected, and EBC analysis can hopefully be used in clinical practice, in both diagnosis and the longitudinal follow-up of patients, resulting in a better outcome of disease. PMID:24266297
MODELING MICROBUBBLE DYNAMICS IN BIOMEDICAL APPLICATIONS*
CHAHINE, Georges L.; HSIAO, Chao-Tsung
2012-01-01
Controlling microbubble dynamics to produce desirable biomedical outcomes when and where necessary, and to avoid deleterious effects, requires advanced knowledge, which can be achieved only through a combination of experimental and numerical/analytical techniques. The present communication presents a multi-physics approach to study the dynamics, combining viscous-inviscid effects, liquid and structure dynamics, and multi-bubble interaction. While complex numerical tools are developed and used, the study aims at identifying the key parameters influencing the dynamics, which need to be included in simpler models. PMID:22833696
Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti
2013-01-01
Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can also be integrated with other tools, such as the fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, to propose how to conduct the necessary steps, and to provide data templates necessary for documentation and useful for following current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. This article shows how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.
Software Tools on the Peregrine System | High-Performance Computing | NREL
NREL provides a variety of software tools on the Peregrine system, including debuggers and performance-analysis tools for understanding the behavior of MPI applications (e.g., Intel VTune), an environment for statistical computing and graphics, and visualization and analytics tools such as VirtualGL/TurboVNC for remote use.
Using predictive analytics and big data to optimize pharmaceutical outcomes.
Hernandez, Inmaculada; Zhang, Yuting
2017-09-15
The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
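As a concrete illustration of the kind of predictive model this record discusses, the sketch below fits a logistic regression to synthetic claims-style features to score medication-noncompliance risk. The data, feature names, and coefficients are fabricated for illustration; a real application would draw on EHR/claims data and richer methods.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 1000
# Synthetic features: age, number of medications, prior refill gaps
X = np.column_stack([rng.normal(65, 10, n),
                     rng.poisson(5, n),
                     rng.poisson(1, n)])
# Synthetic label: noncompliance made more likely by refill gaps
p = 1.0 / (1.0 + np.exp(-(0.8 * X[:, 2] - 1.5)))
y = rng.random(n) < p

model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]   # per-patient risk score
```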
Mass spectrometry as a quantitative tool in plant metabolomics
Jorge, Tiago F.; Mata, Ana T.
2016-01-01
Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967
NASTRAN as an analytical research tool for composite mechanics and composite structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.
1976-01-01
Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.
Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin
2014-03-01
A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed by a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R² and Q². The accuracy and diagnostic capability of the batch model were then validated with the remaining batches. Assisted by high performance liquid chromatography (HPLC) determination, process faults were explained by the corresponding variable contributions. Furthermore, a batch-level model was developed to compare and assess the model performance. The present study has demonstrated that DART-MS is very promising for process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers a particular account of effective compositions and can potentially be used to improve batch quality and process consistency for samples in complex matrices. Copyright © 2014 Elsevier B.V. All rights reserved.
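The batch-monitoring logic can be approximated at batch level with an unfolded PLS model: each batch's spectra are flattened to one row, a PLS model relates them to a quality indicator, and new batches are flagged when their scores leave a control region. This simplified batch-level sketch on synthetic data stands in for the multi-way PLS of the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# X: (n_batches, n_time * n_mz) -- each batch's spectra unfolded to a
# row; y: (n_batches,) quality indicator (e.g., an HPLC assay value).
rng = np.random.default_rng(1)
X = rng.standard_normal((10, 300))
y = X[:, :5].sum(axis=1) + rng.normal(0, 0.1, 10)

pls = PLSRegression(n_components=2).fit(X, y)
scores = pls.transform(X)                 # batches in score space
center, spread = scores.mean(axis=0), scores.std(axis=0)

def flag_batch(x_new):
    """Flag a batch whose scores fall outside +/-3 SD control limits."""
    t = pls.transform(x_new.reshape(1, -1))[0]
    return bool(np.any(np.abs(t - center) > 3 * spread))
```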
Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Y; Glascoe, L
The computational modeling of the biodegradation of contaminated groundwater systems, accounting for biochemical reactions coupled to contaminant transport, is a valuable tool both for the field engineer/planner with limited computational resources and for the expert computational researcher less constrained by time and computer power. There exist several analytical and numerical computer models that have been and are being developed to cover the practical needs put forth by users to fulfill this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information, with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.
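As an example of the rapid-screening end of this spectrum, the steady-state solution of 1-D advection-dispersion transport with first-order biodecay can be written in closed form. The sketch below implements that textbook solution; the parameter values are illustrative, and this is a generic screening formula rather than any specific model from the chapter.

```python
import numpy as np

def steady_concentration(x, c0, v, d_disp, lam):
    """Steady-state C(x) for D*C'' - v*C' - lam*C = 0 with C(0) = c0
    and C bounded as x -> infinity (continuous source, 1-D aquifer).
    v: seepage velocity, d_disp: dispersion coefficient,
    lam: first-order decay rate."""
    r = (v - np.sqrt(v**2 + 4.0 * lam * d_disp)) / (2.0 * d_disp)
    return c0 * np.exp(r * x)

x = np.linspace(0.0, 200.0, 101)   # distance downgradient, m
c = steady_concentration(x, c0=1.0, v=0.1, d_disp=0.5, lam=0.01)
```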
Fan Du; Shneiderman, Ben; Plaisant, Catherine; Malik, Sana; Perer, Adam
2017-06-01
The growing volume and variety of data presents both opportunities and challenges for visual analytics. Addressing these challenges is needed for big data to provide valuable insights and novel solutions for business, security, social media, and healthcare. In the case of temporal event sequence analytics, it is the number of events in the data and the variety of temporal sequence patterns that challenge users of visual analytic tools. This paper describes 15 strategies for sharpening analytic focus that analysts can use to reduce the data volume and pattern variety. Four groups of strategies are proposed: (1) extraction strategies, (2) temporal folding, (3) pattern simplification strategies, and (4) iterative strategies. For each strategy, we provide examples of its use and its impact on volume and/or variety. Examples are selected from 20 case studies gathered from our own work, the literature, or email interviews with individuals who conducted the analyses and developers who observed analysts using the tools. Finally, we discuss how these strategies might be combined and report on feedback from 10 senior event sequence analysts.
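Two of the strategy groups named in that abstract are easy to make concrete in a few lines: an extraction strategy that keeps only the event types under study, and temporal folding that collapses timestamps onto a repeating cycle. The event-record format used here is a hypothetical illustration, not the paper's data model.

```python
from datetime import datetime

events = [  # hypothetical (timestamp, event_type) records
    (datetime(2017, 3, 1, 9, 15), "admit"),
    (datetime(2017, 3, 1, 22, 40), "med_order"),
    (datetime(2017, 3, 2, 9, 5), "discharge"),
]

# Extraction strategy: keep only the event types of interest,
# reducing data volume before visualization.
of_interest = {"admit", "discharge"}
extracted = [(t, e) for t, e in events if e in of_interest]

# Temporal folding: map timestamps onto hour-of-day to expose daily
# rhythms regardless of calendar date, reducing pattern variety.
folded = [(t.hour, e) for t, e in events]
```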
ANALYTICAL TOOLS FOR GROUNDWATER POLLUTION ASSESSMENT
This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of groundwater buffer strips. The indices describe the leaching of solutes below the root zone (mass f...
Structuring modeling and simulation analysis for evacuation planning and operations.
DOT National Transportation Integrated Search
2009-06-01
This document is intended to provide guidance to decision-makers at agencies and jurisdictions considering the role of analytical tools in evacuation planning and operations. It is often unclear what kind of analytical approach may be of most value, ...
Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.
Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà
2017-10-01
Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.
Analytical Tools Interface for Landscape Assessments
Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to implemen...
SolarPILOT | Concentrating Solar Power | NREL
SolarPILOT combines analytical and ray-tracing simulation tools. Unlike exclusively ray-tracing tools, SolarPILOT runs an analytical simulation engine alongside a ray-tracing core for more detailed simulations; the ray-tracing core is based on the SolTrace simulation engine.
Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I
2016-10-18
There is increasing interest in using single particle-inductively coupled plasma mass spectrometry (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved signal. While sample dilution is often necessary to achieve the low analyte concentrations necessary for SP-ICPMS analysis, and to reduce the occurrence of matrix effects on the analyte signal, it is used here also to reduce the dissolved signal relative to the particulate one, while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third is used as an analytical quality control. Using simple suspensions of well-characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, the quantification of particulate mass, and the determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and to improve the quality and consistency of data generated from this relatively new analytical tool.
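The downstream arithmetic that the optimized particle signal feeds is worth sketching: each particle pulse is converted to mass through a calibration sensitivity, then to an equivalent spherical diameter via d = (6m/(πρ))^(1/3). The calibration constant below is a placeholder folding together the dissolved calibration slope, transport efficiency, and dwell time; in practice these are measured, as in standard SP-ICPMS workflows.

```python
import numpy as np

def particle_diameter_nm(pulse_counts, counts_per_fg,
                         mass_fraction, density_g_cm3):
    """Convert SP-ICPMS pulse intensities to equivalent spherical
    diameters: mass = counts / sensitivity; d = (6*m/(pi*rho))^(1/3).
    counts_per_fg is a placeholder calibration constant combining the
    dissolved calibration slope, transport efficiency and dwell time."""
    mass_fg = np.asarray(pulse_counts) / counts_per_fg / mass_fraction
    mass_g = mass_fg * 1e-15
    d_cm = (6.0 * mass_g / (np.pi * density_g_cm3)) ** (1.0 / 3.0)
    return d_cm * 1e7                       # cm -> nm

d = particle_diameter_nm([120, 450, 900], counts_per_fg=10.0,
                         mass_fraction=1.0, density_g_cm3=10.5)  # Ag
```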
NASA Astrophysics Data System (ADS)
Tan, Yimin; Lin, Kejian; Zu, Jean W.
2018-05-01
The Halbach permanent magnet (PM) array has attracted tremendous research attention in the development of electromagnetic generators for its unique properties. This paper proposes a generalized analytical model for linear generators in which slotted stator pole-shifting and a Halbach array are combined for the first time. Initially, the magnetization components of the Halbach array are determined using Fourier decomposition. Then, based on the magnetic scalar potential method, the magnetic field distribution is derived employing specially treated boundary conditions. FEM analysis has been conducted to verify the analytical model. A slotted linear PM generator with a Halbach PM array has been constructed to validate the model and further improved using piece-wise springs to trigger full-range reciprocating motion. A dynamic model has been developed to characterize the dynamic behavior of the slider. This analytical method provides an effective tool for the development and optimization of Halbach PM generators. The experimental results indicate that piece-wise springs can be employed to improve generator performance under low excitation frequency.
Llorente-Mirandes, Toni; Rubio, Roser; López-Sánchez, José Fermín
2017-01-01
Here we review recent developments in analytical proposals for the assessment of inorganic arsenic (iAs) content in food products. Interest in the determination of iAs in products for human consumption such as food commodities, wine, and seaweed, among others, is fueled by the wide recognition of its toxic effects on humans, even at low concentrations. Currently, the need for robust and reliable analytical methods is recognized by various international safety and health agencies, and by organizations in charge of establishing acceptable tolerance levels of iAs in food. This review summarizes the state of the art of analytical methods while highlighting tools for the quality assessment of the results, such as the production and evaluation of certified reference materials (CRMs) and the availability of specific proficiency testing (PT) programmes. Because the number of studies dedicated to the subject of this review has increased considerably over recent years, the sources consulted and cited here are limited to those from 2010 to the end of 2015.
Esmonde-White, Karen A; Cuellar, Maryann; Uerpmann, Carsten; Lenain, Bruno; Lewis, Ian R
2017-01-01
Adoption of Quality by Design (QbD) principles, regulatory support of QbD, process analytical technology (PAT), and continuous manufacturing are major factors effecting new approaches to pharmaceutical manufacturing and bioprocessing. In this review, we highlight new technology developments, data analysis models, and applications of Raman spectroscopy, which have expanded the scope of Raman spectroscopy as a process analytical technology. Emerging technologies such as transmission and enhanced reflection Raman, and new approaches to using available technologies, expand the scope of Raman spectroscopy in pharmaceutical manufacturing, and now Raman spectroscopy is successfully integrated into real-time release testing, continuous manufacturing, and statistical process control. Since the last major review of Raman as a pharmaceutical PAT in 2010, many new Raman applications in bioprocessing have emerged. Exciting reports of in situ Raman spectroscopy in bioprocesses complement a growing scientific field of biological and biomedical Raman spectroscopy. Raman spectroscopy has made a positive impact as a process analytical and control tool for pharmaceutical manufacturing and bioprocessing, with demonstrated scientific and financial benefits throughout a product's lifecycle.
Electronic laboratory notebooks progress and challenges in implementation.
Machina, Hari K; Wild, David J
2013-08-01
Electronic laboratory notebooks (ELNs) are increasingly replacing paper notebooks in life science laboratories, including those in industry, academic settings, and hospitals. ELNs offer significant advantages over paper notebooks, but adopting them in a predominantly paper-based environment is usually disruptive. The benefits of ELNs increase when they are integrated with other laboratory informatics tools such as laboratory information management systems, chromatography data systems, analytical instrumentation, and scientific data management systems, but there is no well-established path for effective integration of these tools. In this article, we review and evaluate some of the approaches that have been taken thus far and also some radical new methods of integration that are emerging.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Feng; Liu, Yijin; Yu, Xiqian
2017-08-30
Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large-scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium-ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as some of the most effective methods, allowing for nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with a discussion of various rechargeable batteries and associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools, and are also discussed towards the end. We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.
Cost-effectiveness analysis and HIV screening: the emergency medicine perspective.
Hsu, Heather; Walensky, Rochelle P
2011-07-01
Cost-effectiveness analysis is a useful tool for decisionmakers charged with prioritizing the myriad medical interventions in the emergency department (ED). This analytic approach may be especially helpful for ranking programs that are competing for scarce resources while attempting to maximize net health benefits. In this article, we review the health economics literature on HIV screening in EDs and introduce the methods of cost-effectiveness analysis for medical interventions. We specifically describe the incremental cost-effectiveness ratio--its calculation, the derivation of ratio components, and the interpretation of these ratios. Copyright © 2011. Published by Mosby, Inc.
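The ratio the authors describe has a standard textbook definition, reproduced here for quick reference (our summary, not quoted from the article):

```latex
\mathrm{ICER} = \frac{C_1 - C_0}{E_1 - E_0}
```

where \(C_1, E_1\) are the cost and effectiveness of the new program (e.g., ED-based HIV screening) and \(C_0, E_0\) those of the comparator; effectiveness is commonly measured in quality-adjusted life-years.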
Awotwe Otoo, David; Agarabi, Cyrus; Khan, Mansoor A
2014-07-01
The aim of the present study was to apply an integrated process analytical technology (PAT) approach to control and monitor the effect of the degree of supercooling on critical process and product parameters of a lyophilization cycle. Two concentrations of a mAb formulation were used as models for lyophilization. ControLyo™ technology was applied to control the onset of ice nucleation, whereas tunable diode laser absorption spectroscopy (TDLAS) was utilized as a noninvasive tool for the inline monitoring of the water vapor concentration and vapor flow velocity in the spool during primary drying. The instantaneous measurements were then used to determine the effect of the degree of supercooling on critical process and product parameters. Controlled nucleation resulted in uniform nucleation at lower degrees of supercooling for both formulations, higher sublimation rates, lower mass transfer resistance, lower product temperatures at the sublimation interface, and shorter primary drying times compared with the conventional shelf-ramped freezing. Controlled nucleation also resulted in lyophilized cakes with more elegant and porous structure with no visible collapse or shrinkage, lower specific surface area, and shorter reconstitution times compared with the uncontrolled nucleation. Uncontrolled nucleation however resulted in lyophilized cakes with relatively lower residual moisture contents compared with controlled nucleation. TDLAS proved to be an efficient tool to determine the endpoint of primary drying. There was good agreement between data obtained from TDLAS-based measurements and SMART™ technology. ControLyo™ technology and TDLAS showed great potential as PAT tools to achieve enhanced process monitoring and control during lyophilization cycles. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
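As a rough illustration of how inline vapor measurements can translate into a sublimation rate, a general mass-flux relation (our illustration, not necessarily the instrument's exact algorithm) is

```latex
\dot{m} = \rho_w \, v \, A
```

where \(\rho_w\) is the measured water-vapor mass concentration, \(v\) the vapor flow velocity, and \(A\) the spool cross-section; primary drying is essentially complete when \(\dot{m}\) decays to the baseline.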
Customisation of the exome data analysis pipeline using a combinatorial approach.
Pattnaik, Swetansu; Vaidyanathan, Srividya; Pooja, Durgad G; Deepak, Sa; Panda, Binay
2012-01-01
The advent of next generation sequencing (NGS) technologies has revolutionised the way biologists produce, analyse and interpret data. Although NGS platforms provide a cost-effective way to discover genome-wide variants from a single experiment, variants discovered by NGS need follow-up validation due to the high error rates associated with various sequencing chemistries. Recently, whole exome sequencing has been proposed as an affordable option compared to whole genome runs, but it still requires follow-up validation of all the novel exomic variants. Customarily, a consensus approach is used to overcome the systematic errors inherent to the sequencing technology, alignment and post-alignment variant detection algorithms. However, the aforementioned approach warrants the use of multiple sequencing chemistries, multiple alignment tools and multiple variant callers, which may not be viable in terms of time and money for individual investigators with limited informatics know-how. Biologists often lack the requisite training to deal with the huge amount of data produced by NGS runs and face difficulty in choosing from the list of freely available analytical tools for NGS data analysis. Hence, there is a need to customise the NGS data analysis pipeline to preferentially retain true variants by minimising the incidence of false positives, and to make the choice of the right analytical tools easier. To this end, we have sampled different freely available tools used at the alignment and post-alignment stages, suggesting the most suitable combination determined by a simple framework of pre-existing metrics to create significant datasets.
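As an illustration of the consensus idea behind such combinatorial pipelines (a sketch under assumed data structures, not the authors' pipeline), variants can be retained when reported by at least k of n callers:

```python
# Minimal consensus filter: keep variants reported by >= min_support callers.
# Caller outputs are modeled as sets of (chrom, pos, ref, alt) tuples; a real
# pipeline would parse VCF files instead.
from collections import Counter

def consensus(callsets, min_support=2):
    counts = Counter(v for calls in callsets for v in set(calls))
    return {v for v, n in counts.items() if n >= min_support}

caller_a = {("chr1", 12345, "A", "G"), ("chr2", 999, "C", "T")}
caller_b = {("chr1", 12345, "A", "G")}
caller_c = {("chr1", 12345, "A", "G"), ("chr3", 42, "G", "A")}

print(consensus([caller_a, caller_b, caller_c]))
# {('chr1', 12345, 'A', 'G')}
```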
Translating statistical species-habitat models to interactive decision support tools
Wszola, Lyndsie S.; Simonsen, Victoria L.; Stuber, Erica F.; Gillespie, Caitlyn R.; Messinger, Lindsey N.; Decker, Karie L.; Lusk, Jeffrey J.; Jorgensen, Christopher F.; Bishop, Andrew A.; Fontaine, Joseph J.
2017-01-01
Understanding species-habitat relationships is vital to successful conservation, but the tools used to communicate species-habitat relationships are often poorly suited to the information needs of conservation practitioners. Here we present a novel method for translating a statistical species-habitat model, a regression analysis relating ring-necked pheasant abundance to landcover, into an interactive online tool. The Pheasant Habitat Simulator combines the analytical power of the R programming environment with the user-friendly Shiny web interface to create an online platform in which wildlife professionals can explore the effects of variation in local landcover on relative pheasant habitat suitability within spatial scales relevant to individual wildlife managers. Our tool allows users to virtually manipulate the landcover composition of a simulated space to explore how changes in landcover may affect pheasant relative habitat suitability, and guides users through the economic tradeoffs of landscape changes. We offer suggestions for development of similar interactive applications and demonstrate their potential as innovative science delivery tools for diverse professional and public audiences.
Bio-TDS: bioscience query tool discovery system.
Gnimpieba, Etienne Z; VanDiermen, Menno S; Gustafson, Shayla M; Conn, Bill; Lushbough, Carol M
2017-01-04
Bioinformatics and computational biology play a critical role in bioscience and biomedical research. As researchers design their experimental projects, one major challenge is to find the most relevant bioinformatics toolkits that will lead to new knowledge discovery from their data. The Bio-TDS (Bioscience Query Tool Discovery Systems, http://biotds.org/) has been developed to assist researchers in retrieving the most applicable analytic tools by allowing them to formulate their questions as free text. The Bio-TDS is a flexible retrieval system that affords users from multiple bioscience domains (e.g. genomic, proteomic, bio-imaging) the ability to query over 12 000 analytic tool descriptions integrated from well-established, community repositories. One of the primary components of the Bio-TDS is the ontology and natural language processing workflow for annotation, curation, query processing, and evaluation. The Bio-TDS's scientific impact was evaluated using sample questions posed by researchers retrieved from Biostars, a site focusing on biological data analysis. The Bio-TDS was compared to five similar bioscience analytic tool retrieval systems, with the Bio-TDS outperforming the others in terms of relevance and completeness. The Bio-TDS offers researchers the capacity to associate their bioscience question with the most relevant computational toolsets required for the data analysis in their knowledge discovery process. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
Burt, Tal; John, Christy S; Ruckle, Jon L; Vuong, Le T
2017-05-01
Phase-0 studies, including microdosing, also called Exploratory Investigational New Drug (eIND) or exploratory clinical trials, are a regulatory framework for first-in-human (FIH) trials. Common to these approaches is the use and implied safety of limited exposures to test articles. Use of sub-pharmacological doses in phase-0/microdose studies requires sensitive analytic tools such as accelerator mass spectrometer (AMS), Positron Emission Tomography (PET), and Liquid Chromatography Tandem Mass Spectrometry (LC-MS/MS) to determine drug disposition. Areas covered: Here we present a practical guide to the range of methodologies, design options, and conduct strategies that can be used to increase the efficiency of drug development. We provide detailed examples of relevant developmental scenarios. Expert opinion: Validation studies over the past decade demonstrated the reliability of extrapolation of sub-pharmacological to therapeutic-level exposures in more than 80% of cases, an improvement over traditional allometric approaches. Applications of phase-0/microdosing approaches include study of pharmacokinetic and pharmacodynamic properties, target tissue localization, drug-drug interactions, effects in vulnerable populations (e.g. pediatric), and intra-target microdosing (ITM). Study design should take into account the advantages and disadvantages of each analytic tool. Utilization of combinations of these analytic techniques increases the versatility of study designs and the power of data obtained.
NASA Technical Reports Server (NTRS)
Woronowicz, Michael; Abel, Joshua; Autrey, David; Blackmon, Rebecca; Bond, Tim; Brown, Martin; Buffington, Jesse; Cheng, Edward; DeLatte, Danielle; Garcia, Kelvin;
2014-01-01
The International Space Station program is developing a robotically-operated leak locator tool to be used externally. The tool would consist of a Residual Gas Analyzer for partial pressure measurements and a full range pressure gauge for total pressure measurements. The primary application is to detect NH3 coolant leaks in the ISS thermal control system. An analytical model of leak plume physics is presented that can account for effusive flow as well as plumes produced by sonic orifices and thruster operations. This model is used along with knowledge of typical RGA and full range gauge performance to analyze the expected instrument sensitivity to ISS leaks of various sizes and relative locations ("directionality"). The paper also presents experimental results of leak simulation testing in a large thermal vacuum chamber at NASA Goddard Space Flight Center. This test characterized instrument sensitivity as a function of leak rates ranging from 1 lb-mass/yr. to about 1 lb-mass/day. This data may represent the first measurements collected by an RGA or ion gauge system monitoring off-axis point sources as a function of location and orientation. Test results are compared to the analytical model and used to propose strategies for on-orbit leak location and environment characterization using the proposed instrument while taking into account local ISS conditions and the effects of ram/wake flows and structural shadowing within low Earth orbit.
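For context, a common free-molecular point-source approximation (textbook kinetic theory, not necessarily the paper's exact plume model) gives the molecular flux at distance \(r\) and angle \(\theta\) from the leak axis as

```latex
\Phi(r,\theta) = \frac{\dot{N}\cos\theta}{\pi r^2}
```

where \(\dot{N}\) is the total emission rate; the \(1/r^2\) falloff and \(\cos\theta\) directionality are what let a scanned RGA or total-pressure gauge localize an off-axis point source.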
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thakur, Gautam S; Bhaduri, Budhendra L; Piburn, Jesse O
Geospatial intelligence has traditionally relied on the use of archived and unvarying data for planning and exploration purposes. In consequence, the tools and methods that are architected to provide insight and generate projections rely only on such datasets. Although this approach has proven effective in several cases, such as land use identification and route mapping, it has severely restricted the ability of researchers to incorporate current information in their work. This approach is inadequate in scenarios requiring real-time information to act and to adjust in ever-changing dynamic environments, such as evacuation and rescue missions. In this work, we propose PlanetSense, a platform for geospatial intelligence that is built to harness the existing power of archived data and add to that the dynamics of real-time streams, seamlessly integrated with sophisticated data mining algorithms and analytics tools for generating operational intelligence on the fly. The platform has four main components: i) GeoData Cloud, a data architecture for storing and managing disparate datasets; ii) a mechanism to harvest real-time streaming data; iii) a data analytics framework; iv) presentation and visualization through a web interface and RESTful services. Using two case studies, we underpin the necessity of our platform in modeling ambient population and building occupancy at scale.
Manipulability, force, and compliance analysis for planar continuum manipulators
NASA Technical Reports Server (NTRS)
Gravagne, Ian A.; Walker, Ian D.
2002-01-01
Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.
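For readers new to the formalism, the classical rigid-link definitions these continuum versions generalize (background material, not the paper's derivation) are

```latex
\mathcal{E}_v = \{\, \dot{x} = J\dot{q} \;:\; \|\dot{q}\| \le 1 \,\},
\qquad
w = \sqrt{\det\!\left(J J^{\mathsf{T}}\right)},
```

the velocity manipulability ellipsoid and Yoshikawa's manipulability measure for a manipulator Jacobian \(J\); the force ellipsoid is the dual, with principal axes reciprocal to those of the velocity ellipsoid.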
Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.
2016-01-01
A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
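A toy demonstration of why analytic derivatives matter for gradient-based optimization, using SciPy on a stand-in objective (a concept sketch only, not Pycycle or OpenMDAO code):

```python
# Compare BFGS driven by an analytic gradient against the same optimizer
# relying on finite-difference approximations of the gradient.
import numpy as np
from scipy.optimize import minimize

def f(x):
    return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2

def grad_f(x):
    # Analytic derivative: exact, and one call replaces several f-evaluations.
    return np.array([2.0 * (x[0] - 3.0), 2.0 * (x[1] + 1.0)])

x0 = np.zeros(2)
res_analytic = minimize(f, x0, jac=grad_f, method="BFGS")
res_fd = minimize(f, x0, method="BFGS")  # falls back to finite differences

print(res_analytic.x, res_analytic.nfev)  # fewer objective evaluations
print(res_fd.x, res_fd.nfev)              # extra evaluations spent on FD steps
```

With exact gradients the optimizer avoids both the cost and the numerical noise of finite-difference steps, which is the stability and efficiency gain the abstract refers to.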
Exploring the Potential of Predictive Analytics and Big Data in Emergency Care.
Janke, Alexander T; Overbeek, Daniel L; Kocher, Keith E; Levy, Phillip D
2016-02-01
Clinical research often focuses on resource-intensive causal inference, whereas the potential of predictive analytics with constantly increasing big data sources remains largely unexplored. Basic prediction, divorced from causal inference, is much easier with big data. Emergency care may benefit from this simpler application of big data. Historically, predictive analytics have played an important role in emergency care as simple heuristics for risk stratification. These tools generally follow a standard approach: parsimonious criteria, easy computability, and independent validation with distinct populations. Simplicity in a prediction tool is valuable, but technological advances make it no longer a necessity. Emergency care could benefit from clinical predictions built using data science tools with abundant potential input variables available in electronic medical records. Patients' risks could be stratified more precisely with large pools of data and lower resource requirements for comparing each clinical encounter to those that came before it, benefiting clinical decisionmaking and health systems operations. The largest value of predictive analytics comes early in the clinical encounter, in which diagnostic and prognostic uncertainty are high and resource-committing decisions need to be made. We propose an agenda for widening the application of predictive analytics in emergency care. Throughout, we express cautious optimism because there are myriad challenges related to database infrastructure, practitioner uptake, and patient acceptance. The quality of routinely compiled clinical data will remain an important limitation. Complementing big data sources with prospective data may be necessary if predictive analytics are to achieve their full potential to improve care quality in the emergency department. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
Verification of out-of-control situations detected by "average of normal" approach.
Liu, Jiakai; Tan, Chin Hon; Loh, Tze Ping; Badrick, Tony
2016-11-01
"Average of normal" (AoN) or "moving average" is increasingly used as an adjunct quality control tool in laboratory practice. Little guidance exists on how to verify if an out-of-control situation in the AoN chart is due to a shift in analytical performance, or underlying patient characteristics. Through simulation based on clinical data, we examined 1) the location of the last apparently stable period in the AoN control chart after an analytical shift, and 2) an approach to verify if the observed shift is related to an analytical shift by repeat testing of archived patient samples from the stable period for 21 common analytes. The number of blocks of results to look back for the stable period increased with the duration of the analytical shift, and was larger when smaller AoN block sizes were used. To verify an analytical shift, 3 archived samples from the analytically stable period should be retested. In particular, the process is deemed to have shifted if a difference of >2 analytical standard deviations (i.e. 1:2s rejection rule) between the original and retested results are observed in any of the 3 samples produced. The probability of Type-1 error (i.e., false rejection) and power (i.e., detecting true analytical shift) of this rule are <0.1 and >0.9, respectively. The use of appropriately archived patient samples to verify an apparent analytical shift is preferred to quality control materials. Nonetheless, the above findings may also apply to quality control materials, barring matrix effects. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Data and Tools | Concentrating Solar Power | NREL
Solar Power tower Integrated Layout and Optimization Tool (SolarPILOT™), available for download: the SolarPILOT code combines the rapid layout and optimization capability of the analytical DELSOL3 program with the accuracy of ray-tracing simulation.
ERIC Educational Resources Information Center
Zupanc, Darko; Urank, Matjaz; Bren, Matevz
2009-01-01
From 1995, data on students' achievement in schools (i.e., teacher's grades) and all data on achievement in the 5-subject group certificate--the "Matura" exam--have been systematically gathered for the entire yearly cohort of students in upper secondary education in Slovenia. This paper describes an on-line data selection system and data…
Using Streaming Analytics for Effective Real Time Network Visibility -
Catching a problem before it became a big issue is just one use case for this sort of streaming analytics. Having the ability to see all of your network traffic in one place, a single pane of glass, will allow you to combine input from malware detection devices and other network visibility tools. I was never able to see everything in one spot before.
In-Line Detection and Measurement of Molecular Contamination in Semiconductor Process Solutions
NASA Astrophysics Data System (ADS)
Wang, Jason; West, Michael; Han, Ye; McDonald, Robert C.; Yang, Wenjing; Ormond, Bob; Saini, Harmesh
2005-09-01
This paper discusses a fully automated metrology tool for detection and quantitative measurement of contamination, including cationic, anionic, metallic, organic, and molecular species present in semiconductor process solutions. The instrument is based on an electrospray ionization time-of-flight mass spectrometer (ESI-TOF/MS) platform. The tool can be used in diagnostic or analytical modes to understand process problems in addition to enabling routine metrology functions. Metrology functions include in-line contamination measurement with near real-time trend analysis. This paper discusses representative organic and molecular contamination measurement results in production process problem solving efforts. The examples include the analysis and identification of organic compounds in SC-1 pre-gate clean solution; urea, NMP (N-Methyl-2-pyrrolidone) and phosphoric acid contamination in UPW; and plasticizer and an organic sulfur-containing compound found in isopropyl alcohol (IPA). It is expected that these unique analytical and metrology capabilities will improve the understanding of the effect of organic and molecular contamination on device performance and yield. This will permit the development of quantitative correlations between contamination levels and process degradation. It is also expected that the ability to perform routine process chemistry metrology will lead to corresponding improvements in manufacturing process control and yield, the ability to avoid excursions and will improve the overall cost effectiveness of the semiconductor manufacturing process.
Cumulative biological impacts framework for solar energy projects in the California Desert
Davis, Frank W.; Kreitler, Jason R.; Soong, Oliver; Stoms, David M.; Dashiell, Stephanie; Hannah, Lee; Wilkinson, Whitney; Dingman, John
2013-01-01
This project developed analytical approaches, tools and geospatial data to support conservation planning for renewable energy development in the California deserts. Research focused on geographical analysis to avoid, minimize and mitigate the cumulative biological effects of utility-scale solar energy development. A hierarchical logic model was created to map the compatibility of new solar energy projects with current biological conservation values. The research indicated that the extent of compatible areas is much greater than the estimated land area required to achieve 2040 greenhouse gas reduction goals. Species distribution models were produced for 65 animal and plant species that were of potential conservation significance to the Desert Renewable Energy Conservation Plan process. These models mapped historical and projected future habitat suitability using 270 meter resolution climate grids. The results were integrated into analytical frameworks to locate potential sites for offsetting project impacts and evaluating the cumulative effects of multiple solar energy projects. Examples applying these frameworks in the Western Mojave Desert ecoregion show the potential of these publicly-available tools to assist regional planning efforts. Results also highlight the necessity to explicitly consider projected land use change and climate change when prioritizing areas for conservation and mitigation offsets. Project data, software and model results are all available online.
Time-resolved fluorescence microscopy (FLIM) as an analytical tool in skin nanomedicine.
Alexiev, Ulrike; Volz, Pierre; Boreham, Alexander; Brodwolf, Robert
2017-07-01
The emerging field of nanomedicine provides new approaches for the diagnosis and treatment of diseases, for symptom relief, and for monitoring of disease progression. Topical application of drug-loaded nanoparticles for the treatment of skin disorders is a promising strategy to overcome the stratum corneum, the upper layer of the skin, which represents an effective physical and biochemical barrier. Understanding drug penetration into skin, and the enhanced penetration facilitated by nanocarriers, requires analytical tools that ideally allow visualization of the skin, its morphology, the drug carriers, the drugs, their transport across the skin and possible interactions, as well as the effects of the nanocarriers within the different skin layers. Here, we review some recent developments in the field of fluorescence microscopy, namely Fluorescence Lifetime Imaging Microscopy (FLIM), for improved characterization of nanocarriers, their interactions and penetration into skin. In particular, FLIM allows for the discrimination of target molecules, e.g. fluorescently tagged nanocarriers, against the autofluorescent tissue background and, due to the environmental sensitivity of the fluorescence lifetime, also offers insights into the local environment of the nanoparticle and its interactions with other biomolecules. Thus, FLIM shows the potential to overcome several limits of intensity-based microscopy. Copyright © 2017 Elsevier B.V. All rights reserved.
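The contrast mechanism FLIM exploits can be stated in one line of standard fluorescence decay kinetics (general background, not specific to this review):

```latex
I(t) = \sum_i a_i \, e^{-t/\tau_i}
```

each emitting species contributes its own lifetime \(\tau_i\), so fluorophores with overlapping emission spectra, such as a labeled nanocarrier and autofluorescent tissue, can still be separated by fitting the time-resolved decay.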
Financial Forecasting and Stochastic Modeling: Predicting the Impact of Business Decisions.
Rubin, Geoffrey D; Patel, Bhavik N
2017-05-01
In health care organizations, effective investment of precious resources is critical to assure that the organization delivers high-quality and sustainable patient care within a supportive environment for patients, their families, and the health care providers. This holds true for organizations independent of size, from small practices to large health systems. For radiologists whose role is to oversee the delivery of imaging services and the interpretation, communication, and curation of imaging-informed information, business decisions influence where and how they practice, the tools available for image acquisition and interpretation, and ultimately their professional satisfaction. With so much at stake, physicians must understand and embrace the methods necessary to develop and interpret robust financial analyses so they effectively participate in and better understand decision making. This review discusses the financial drivers upon which health care organizations base investment decisions and the central role that stochastic financial modeling should play in support of strategically aligned capital investments. Given a health care industry that has been slow to embrace advanced financial analytics, a fundamental message of this review is that the skills and analytical tools are readily attainable and well worth the effort to implement in the interest of informed decision making. © RSNA, 2017 Online supplemental material is available for this article.
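A minimal stochastic-modeling sketch of the kind the review advocates: a Monte Carlo simulation of the net present value of a capital investment under uncertain annual cash flows (all figures and distributions are illustrative assumptions):

```python
# Monte Carlo NPV: discount uncertain yearly cash flows and subtract the
# initial outlay, then summarize the resulting NPV distribution.
import numpy as np

rng = np.random.default_rng(seed=0)
n_trials, n_years = 10_000, 5
initial_outlay = 1_000_000.0      # assumed equipment cost
discount_rate = 0.08              # assumed cost of capital

# Uncertain cash flows: mean 300k/yr, sd 75k (assumed normal distribution).
cash_flows = rng.normal(300_000, 75_000, size=(n_trials, n_years))
years = np.arange(1, n_years + 1)
npv = (cash_flows / (1 + discount_rate) ** years).sum(axis=1) - initial_outlay

print(f"mean NPV: {npv.mean():,.0f}")
print(f"P(NPV < 0): {(npv < 0).mean():.2%}")  # downside risk, not just a point estimate
```

The point of the stochastic approach is the full NPV distribution: a decision-maker sees the probability of losing money, not only the expected return.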
New tools for investigating student learning in upper-division electrostatics
NASA Astrophysics Data System (ADS)
Wilcox, Bethany R.
Student learning in upper-division physics courses is a growing area of research in the field of Physics Education. Developing effective new curricular materials and pedagogical techniques to improve student learning in upper-division courses requires knowledge of both what material students struggle with and what curricular approaches help to overcome these struggles. To facilitate the course transformation process for one specific content area --- upper-division electrostatics --- this thesis presents two new methodological tools: (1) an analytical framework designed to investigate students' struggles with the advanced physics content and mathematically sophisticated tools/techniques required at the junior and senior level, and (2) a new multiple-response conceptual assessment designed to measure student learning and assess the effectiveness of different curricular approaches. We first describe the development and theoretical grounding of a new analytical framework designed to characterize how students use mathematical tools and techniques during physics problem solving. We apply this framework to investigate student difficulties with three specific mathematical tools used in upper-division electrostatics: multivariable integration in the context of Coulomb's law, the Dirac delta function in the context of expressing volume charge densities, and separation of variables as a technique to solve Laplace's equation. We find a number of common themes in students' difficulties around these mathematical tools including: recognizing when a particular mathematical tool is appropriate for a given physics problem, mapping between the specific physical context and the formal mathematical structures, and reflecting spontaneously on the solution to a physics problem to gain physical insight or ensure consistency with expected results. We then describe the development of a novel, multiple-response version of an existing conceptual assessment in upper-division electrostatics courses. The goal of this new version is to provide an easily-graded electrostatics assessment that can potentially be implemented to investigate student learning on a large scale. We show that student performance on the new multiple-response version exhibits a significant degree of consistency with performance on the free-response version, and that it continues to provide significant insight into student reasoning and student difficulties. Moreover, we demonstrate that the new assessment is both valid and reliable using data from upper-division physics students at multiple institutions. Overall, the work described in this thesis represents a significant contribution to the methodological tools available to researchers and instructors interested in improving student learning at the upper-division level.
Mattle, Eveline; Weiger, Markus; Schmidig, Daniel; Boesiger, Peter; Fey, Michael
2009-06-01
Hair care for humans is a major world industry with specialised tools, chemicals and techniques. Studying the effect of hair care products has become a considerable field of research, and besides mechanical and optical testing, numerous advanced analytical techniques have been employed in this area. In the present work, another means of studying the properties of hair is added by demonstrating the feasibility of magnetic resonance imaging (MRI) of the human hair. Established dedicated nuclear magnetic resonance microscopy hardware (solenoidal radiofrequency microcoils and planar field gradients) and methods (constant time imaging) were adapted to the specific needs of hair MRI. Images were produced at a spatial resolution high enough to resolve the inner structure of the hair, showing contrast between cortex and medulla. Quantitative evaluation of a scan series with different echo times provided a T2* value of 2.6 ms for the cortex and a water content of about 90% for hairs saturated with water. The demonstration of the feasibility of hair MRI potentially adds a new tool to the large variety of analytical methods used nowadays in the development of hair care products.
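A sketch of how a T2* value is typically extracted from a scan series at several echo times, assuming the usual mono-exponential decay model; the data points below are synthetic, not the paper's measurements:

```python
# Fit S(TE) = S0 * exp(-TE / T2*) to signal intensities at several echo times.
import numpy as np
from scipy.optimize import curve_fit

def signal(te_ms, s0, t2star_ms):
    return s0 * np.exp(-te_ms / t2star_ms)

te = np.array([0.5, 1.0, 2.0, 4.0, 8.0])  # echo times in ms
noise = 1 + 0.02 * np.random.default_rng(1).standard_normal(te.size)
s = signal(te, 100.0, 2.6) * noise        # synthetic data around T2* = 2.6 ms

popt, _ = curve_fit(signal, te, s, p0=(90.0, 2.0))
print(f"fitted T2* = {popt[1]:.2f} ms")   # recovers ~2.6 ms
```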
Cyberhubs: Virtual Research Environments for Astronomy
NASA Astrophysics Data System (ADS)
Herwig, Falk; Andrassy, Robert; Annau, Nic; Clarkson, Ondrea; Côté, Benoit; D’Sa, Aaron; Jones, Sam; Moa, Belaid; O’Connell, Jericho; Porter, David; Ritter, Christian; Woodward, Paul
2018-05-01
Collaborations in astronomy and astrophysics are faced with numerous cyber-infrastructure challenges, such as large data sets, the need to combine heterogeneous data sets, and the challenge to effectively collaborate on those large, heterogeneous data sets with significant processing requirements and complex science software tools. The cyberhubs system is an easy-to-deploy package for small- to medium-sized collaborations based on the Jupyter and Docker technology, which allows web-browser-enabled, remote, interactive analytic access to shared data. It offers an initial step to address these challenges. The features and deployment steps of the system are described, as well as the requirements collection through an account of the different approaches to data structuring, handling, and available analytic tools for the NuGrid and PPMstar collaborations. NuGrid is an international collaboration that creates stellar evolution and explosion physics and nucleosynthesis simulation data. The PPMstar collaboration performs large-scale 3D stellar hydrodynamics simulations of interior convection in the late phases of stellar evolution. Examples of science that is currently performed on cyberhubs, in the areas of 3D stellar hydrodynamic simulations, stellar evolution and nucleosynthesis, and Galactic chemical evolution, are presented.
Supporting Communication and Coordination in Collaborative Sensemaking.
Mahyar, Narges; Tory, Melanie
2014-12-01
When people work together to analyze a data set, they need to organize their findings, hypotheses, and evidence, share that information with their collaborators, and coordinate activities amongst team members. Sharing externalizations (recorded information such as notes) could increase awareness and assist with team communication and coordination. However, we currently know little about how to provide tool support for this sort of sharing. We explore how linked common work (LCW) can be employed within a 'collaborative thinking space' to facilitate synchronous collaborative sensemaking activities in Visual Analytics (VA). Collaborative thinking spaces provide an environment for analysts to record, organize, share and connect externalizations. Our tool, CLIP, extends earlier thinking spaces by integrating LCW features that reveal relationships between collaborators' findings. We conducted a user study comparing CLIP to a baseline version without LCW. Results demonstrated that LCW significantly improved analytic outcomes at a collaborative intelligence task. Groups using CLIP were also able to more effectively coordinate their work, and held more discussion of their findings and hypotheses. LCW enabled them to maintain awareness of each other's activities and findings and link those findings to their own work, preventing disruptive oral awareness notifications.
Mixed Initiative Visual Analytics Using Task-Driven Recommendations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Kristin A.; Cramer, Nicholas O.; Israel, David
2015-12-07
Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.
ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package
NASA Astrophysics Data System (ADS)
Jaggi, S.
1993-02-01
The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs and analyses, and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD), etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.
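For orientation, the system-level figures named above are linked by standard radiometric relations (our summary, not necessarily ATTIRE's exact formulation):

```latex
\mathrm{SNR} = \frac{L}{\mathrm{NER}},
\qquad
\mathrm{NETD} = \frac{\mathrm{NER}}{\partial L / \partial T},
```

where \(L\) is the in-band scene radiance and \(\partial L/\partial T\) its sensitivity to target temperature, so the sensor's temperature resolution follows directly from its radiance-domain noise floor.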
NASA Technical Reports Server (NTRS)
Kempler, Steve; Mathews, Tiffany
2016-01-01
The continuum of ever-evolving data management systems affords great opportunities for the enhancement of knowledge and the facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Given the lack of published material on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available and still needed to support ESDA.
Advancements in nano-enabled therapeutics for neuroHIV management.
Kaushik, Ajeet; Jayant, Rahul Dev; Nair, Madhavan
This viewpoint is a global call to promote fundamental and applied research aiming toward designing smart nanocarriers of desired properties, novel noninvasive strategies to open the blood-brain barrier (BBB), delivery/release of single/multiple therapeutic agents across the BBB to eradicate neuro-human immunodeficiency virus (neuroHIV) infection, strategies for on-demand site-specific release of antiretroviral therapy, developing novel nanoformulations capable of recognizing and eradicating latently infected HIV reservoirs, and developing novel smart analytical diagnostic tools to detect and monitor HIV infection. Thus, investigation of novel nanoformulations, methodologies for site-specific delivery/release, analytical methods, and diagnostic tools would be of high significance for eradicating and monitoring neuro-acquired immunodeficiency syndrome (neuroAIDS). Overall, these developments will certainly help to develop personalized nanomedicines to cure HIV and to develop smart HIV-monitoring analytical systems for disease management.
Physics of cosmological cascades and observable properties
NASA Astrophysics Data System (ADS)
Fitoussi, T.; Belmont, R.; Malzac, J.; Marcowith, A.; Cohen-Tanugi, J.; Jean, P.
2017-04-01
TeV photons from extragalactic sources are absorbed in the intergalactic medium and initiate electromagnetic cascades. These cascades offer a unique tool to probe the properties of the universe at cosmological scales. We present a new Monte Carlo code dedicated to the physics of such cascades. This code has been tested against both published results and analytical approximations, and is made publicly available. Using this numerical tool, we investigate the main cascade properties (spectrum, halo extension and time delays), and study in detail their dependence on the physical parameters (extragalactic magnetic field, extragalactic background light, source redshift, source spectrum and beaming emission). The limitations of analytical solutions are emphasized. In particular, analytical approximations account only for the first generation of photons and higher branches of the cascade tree are neglected.
We describe the development and implementation of a Physiological and Anatomical Visual Analytics tool (PAVA), a web browser-based application, used to visualize experimental/simulated chemical time-course data (dosimetry), epidemiological data and Physiologically-Annotated Data ...
Analytical Model-Based Design Optimization of a Transverse Flux Machine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz
This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
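A minimal particle swarm optimization sketch over the three named design variables; the bounds are assumptions and the objective is a stand-in quadratic surrogate, where the paper would instead evaluate each candidate with the MEC model:

```python
# Minimal PSO over (stator pole length, magnet length, rotor thickness) in mm.
import numpy as np

rng = np.random.default_rng(0)
lo = np.array([5.0, 2.0, 3.0])     # assumed lower bounds
hi = np.array([30.0, 10.0, 15.0])  # assumed upper bounds

def objective(x):
    # Placeholder surrogate (minimize); a real run would call the MEC model
    # and return, e.g., negative torque density.
    return ((x - np.array([18.0, 6.5, 9.0])) ** 2).sum()

n, iters, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5
pos = rng.uniform(lo, hi, size=(n, 3))
vel = np.zeros((n, 3))
pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, 3)), rng.random((n, 3))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)   # respect geometric bounds
    vals = np.array([objective(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()].copy()

print(gbest)  # converges toward the surrogate optimum [18.0, 6.5, 9.0]
```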
Werber, D; Bernard, H
2014-02-27
Thousands of infectious food-borne disease outbreaks (FBDO) are reported annually to the European Food Safety Authority within the framework of the zoonoses Directive (2003/99/EC). Most recognised FBDO occur locally following point source exposure, but only a few are investigated using analytical epidemiological studies. In Germany, and probably also in other countries of the European Union, this seems to be particularly true for those investigated by local health authorities. Analytical studies, usually cohort studies or case–control studies, are a powerful tool to identify suspect food vehicles. Therefore, from a public health and food safety perspective, their more frequent usage is highly desirable. We have developed a small toolbox consisting of a strategic concept and a simple software tool for data entry and analysis, with the objective of increasing the use of analytical studies in the investigation of local point source FBDO in Germany.
Electrochemical Enzyme Biosensors Revisited: Old Solutions for New Problems.
Monteiro, Tiago; Almeida, Maria Gabriela
2018-05-14
Worldwide legislation is driving the development of novel and highly efficient analytical tools for assessing the composition of every material that interacts with Consumers or Nature. The biosensor technology is one of the most active R&D domains of Analytical Sciences focused on the challenge of taking analytical chemistry to the field. Electrochemical biosensors based on redox enzymes, in particular, are highly appealing due to their usually quick response, high selectivity and sensitivity, low cost and portable dimensions. This review paper aims to provide an overview of the most important advances made in the field since the proposal of the first biosensor, the well-known hand-held glucose meter. The first section addresses the current needs and challenges for novel analytical tools, followed by a brief description of the different components and configurations of biosensing devices, and the fundamentals of enzyme kinetics and amperometry. The following sections focus on enzyme-based amperometric biosensors and the different stages of their development.
Multivariable Hermite polynomials and phase-space dynamics
NASA Technical Reports Server (NTRS)
Dattoli, G.; Torre, Amalia; Lorenzutta, S.; Maino, G.; Chiccoli, C.
1994-01-01
The phase-space approach to classical and quantum systems demands advanced analytical tools. Such an approach characterizes the evolution of a physical system through a set of variables, reducing to the canonically conjugate variables in the classical limit. It often happens that phase-space distributions can be written in terms of quadratic forms involving the above-quoted variables. A significant analytical tool for treating these problems may come from the generalized many-variable Hermite polynomials, defined on quadratic forms in R^n. They form an orthonormal system in many dimensions and seem the natural tool for treating harmonic oscillator dynamics in phase space. In this contribution we discuss the properties of these polynomials and present some applications to physical problems.
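For concreteness, the simplest member of this family is the two-variable (Kampé de Fériet) Hermite polynomial, a standard form in the literature on generalized Hermite polynomials; the many-variable case extends the quadratic form appearing in the generating function. The normalization below is one common convention, not necessarily the exact one used in this contribution.

```latex
% Two-variable Hermite polynomials and their generating function,
% the simplest special case of the many-variable family:
H_n(x,y) \;=\; n! \sum_{r=0}^{\lfloor n/2 \rfloor}
              \frac{x^{\,n-2r}\, y^{\,r}}{(n-2r)!\; r!},
\qquad
\sum_{n=0}^{\infty} H_n(x,y)\,\frac{t^{n}}{n!} \;=\; e^{\,x t + y t^{2}} .
```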
Bhatt, Vijay Deep; Joshi, Saumya; Becherer, Markus; Lugli, Paolo
2017-01-01
A flexible enzymatic acetylcholinesterase biosensor based on an electrolyte-gated carbon nanotube field effect transistor is demonstrated. The enzyme immobilization is done on a planar gold gate electrode using 3-mercaptopropionic acid as the linker molecule. The sensor showed good sensing capability for the neurotransmitter acetylcholine, with a sensitivity of 5.7 μA/decade, and demonstrated excellent specificity when tested against interfering analytes present in the body. Since the flexible sensor is expected to undergo mechanical deformation in use, its endurance was assessed by putting it under extensive mechanical stress. The enzymatic activity was inhibited by more than 70% when the phosphate-buffered saline (PBS) buffer was spiked with 5 mg/mL malathion (an organophosphate) solution. The biosensor was successfully challenged with tap water and strawberry juice, demonstrating its usefulness as an analytical tool for organophosphate detection. PMID:28524071
Analytical electron microscopy in the study of biological systems.
Johnson, D E
1986-01-01
The AEM is a powerful tool in biological research, capable of providing information simply not available by other means. The use of a field emission STEM for this application can lead to a significant improvement in spatial resolution, in most cases to the limit now allowed by the quality of the specimen preparation, though perhaps ultimately bounded by the effects of radiation damage. Increased elemental sensitivity is at least possible in selected cases with electron energy-loss spectrometry, but fundamental aspects of ELS will probably confine its role to that of a limited complement to EDS. The considerable margin for improvement in the sensitivity of the basic analytical technique means that the search for technological improvement will continue. Fortunately, however, current technology can also continue to answer important biological questions.
Physics-based and human-derived information fusion for analysts
NASA Astrophysics Data System (ADS)
Blasch, Erik; Nagy, James; Scott, Steve; Okoth, Joshua; Hinman, Michael
2017-05-01
Recent trends in physics-based and human-derived information fusion (PHIF) have amplified the capabilities of analysts; however, with the big data opportunities there is a need for open architecture designs, methods of distributed team collaboration, and visualizations. In this paper, we explore recent trends in information fusion to support user interaction and machine analytics. Challenging scenarios requiring PHIF include combining physics-based video data with human-derived text data for enhanced simultaneous tracking and identification. A driving effort would be to provide analysts with applications, tools, and interfaces that afford effective and affordable solutions for timely decision making. Fusion at scale should be developed to allow analysts to access data, call analytics routines, enter solutions, update models, and store results for distributed decision making.
Statistical process control in nursing research.
Polit, Denise F; Chaboyer, Wendy
2012-02-01
In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
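As a minimal, runnable illustration of the control-chart idea (not taken from the article), the sketch below builds an individuals chart: the centre line is the mean, and the control limits are set at ±2.66 times the mean moving range, the standard three-sigma estimate for individuals data. The observations are invented.

```python
# Shewhart individuals (X) chart sketch: points within the control limits
# reflect common-cause variation; points outside flag possible special-cause
# variation such as an intervention effect. Data are illustrative only.
import numpy as np

x = np.array([12.1, 11.8, 12.4, 12.0, 11.9, 12.2, 13.9, 12.1, 11.7, 12.3])
center = x.mean()
mr = np.abs(np.diff(x)).mean()             # mean moving range
ucl, lcl = center + 2.66 * mr, center - 2.66 * mr  # approx. 3-sigma limits

for i, v in enumerate(x):
    flag = "<- special cause?" if (v > ucl or v < lcl) else ""
    print(f"obs {i}: {v:5.1f} {flag}")
print(f"CL = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
```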
Distributed Parameter Analysis of Pressure and Flow Disturbances in Rocket Propellant Feed Systems
NASA Technical Reports Server (NTRS)
Dorsch, Robert G.; Wood, Don J.; Lightner, Charlene
1966-01-01
A digital distributed parameter model for computing the dynamic response of propellant feed systems is formulated. The analytical approach used is an application of the wave-plan method of analyzing unsteady flow. Nonlinear effects are included. The model takes into account locally high compliances at the pump inlet and at the injector dome region. Examples of the calculated transient and steady-state periodic responses of a simple hypothetical propellant feed system to several types of disturbances are presented. Included are flow disturbances originating from longitudinal structural motion, gimbaling, throttling, and combustion-chamber coupling. The analytical method can be employed for analyzing developmental hardware and offers a flexible tool for the calculation of unsteady flow in these systems.
Kumar, B. Vinodh; Mohan, Thuthi
2018-01-01
OBJECTIVE: Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study, and the data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are IQC coefficient of variation percentage and External Quality Assurance Scheme (EQAS) Bias% for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level; for the level 2 IQC, the same four analytes as in level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes below the 6 sigma level, the quality goal index (QGI) was <0.8, indicating imprecision as the area requiring improvement, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. CONCLUSION: This study shows that the sigma metric is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587
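The two quantities the study relies on have simple closed forms that are standard in clinical chemistry QC. The sketch below computes both; the total allowable error (TEa), bias, and CV values are illustrative placeholders, not the study's data.

```python
# Sigma metric and quality goal index (QGI) as commonly defined in
# laboratory QC:
#   sigma = (TEa% - Bias%) / CV%          QGI = Bias% / (1.5 * CV%)
# QGI < 0.8 points to imprecision, QGI > 1.2 to inaccuracy, and values in
# between to both. All numbers below are invented for illustration.
def sigma_metric(tea_pct, bias_pct, cv_pct):
    return (tea_pct - bias_pct) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    return bias_pct / (1.5 * cv_pct)

for name, tea, bias, cv in [("ALP", 30.0, 4.0, 3.5), ("Urea", 9.0, 3.0, 4.0)]:
    s = sigma_metric(tea, bias, cv)
    q = quality_goal_index(bias, cv)
    problem = "imprecision" if q < 0.8 else ("inaccuracy" if q > 1.2 else "both")
    print(f"{name}: sigma = {s:.1f}, QGI = {q:.2f} -> improve {problem}")
```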
Tool to Prioritize Energy Efficiency Investments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farese, P.; Gelman, R.; Hendron, R.
2012-08-01
To provide analytic support to the U.S. Department of Energy's Office of the Building Technology Program (BTP), NREL developed a Microsoft Excel-based tool to provide an open and objective comparison of the hundreds of investment opportunities available to BTP. This tool uses established methodologies to evaluate the energy savings and the cost of those savings.
Tools for Educational Data Mining: A Review
ERIC Educational Resources Information Center
Slater, Stefan; Joksimovic, Srecko; Kovanovic, Vitomir; Baker, Ryan S.; Gasevic, Dragan
2017-01-01
In recent years, a wide array of tools have emerged for the purposes of conducting educational data mining (EDM) and/or learning analytics (LA) research. In this article, we hope to highlight some of the most widely used, most accessible, and most powerful tools available for the researcher interested in conducting EDM/LA research. We will…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Tengfang; Flapper, Joris; Ke, Jing
The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.
Predictive Data Tools Find Uses in Schools
ERIC Educational Resources Information Center
Sparks, Sarah D.
2011-01-01
The use of analytic tools to predict student performance is exploding in higher education, and experts say the tools show even more promise for K-12 schools, in everything from teacher placement to dropout prevention. Use of such statistical techniques is hindered in precollegiate schools, however, by a lack of researchers trained to help…
Qing, Taiping; He, Dinggeng; He, Xiaoxiao; Wang, Kemin; Xu, Fengzhou; Wen, Li; Shangguan, Jingfang; Mao, Zhengui; Lei, Yanli
2016-04-01
Owing to their highly efficient catalytic effects and substrate specificity, the nucleic acid tool enzymes are applied as 'nano-tools' for manipulating different nucleic acid substrates both in the test-tube and in living organisms. In addition to the function as molecular scissors and molecular glue in genetic engineering, the application of nucleic acid tool enzymes in biochemical analysis has also been extensively developed in the past few decades. Used as amplifying labels for biorecognition events, the nucleic acid tool enzymes are mainly applied in nucleic acids amplification sensing, as well as the amplification sensing of biorelated variations of nucleic acids. With the introduction of aptamers, which can bind different target molecules, the nucleic acid tool enzymes-aided signal amplification strategies can also be used to sense non-nucleic targets (e.g., ions, small molecules, proteins, and cells). This review describes and discusses the amplification strategies of nucleic acid tool enzymes-aided biosensors for biochemical analysis applications. Various analytes, including nucleic acids, ions, small molecules, proteins, and cells, are reviewed briefly. This work also addresses the future trends and outlooks for signal amplification in nucleic acid tool enzymes-aided biosensors.
A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.
Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy
2016-12-01
Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.
Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J
2011-11-01
This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One advantage of planar chromatography over its column counterpart is that each TLC run can be performed on a previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples such as fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by freeze-drying. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from the surface waters of Middle Pomerania in the northern part of Poland can easily be observed on the resulting micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that a micro-TLC-based analytical approach can be applied as an effective method for internal standard (IS) substance searches. Generally, the described methodology can be applied for fast fractionation or screening of a whole range of target substances, as well as for chemotaxonomic studies and fingerprinting of complex mixtures present in biological or environmental samples. Due to the low consumption of eluent (usually 0.3-1 mL/run), mainly composed of water-alcohol binary mixtures, this method can be considered an environmentally friendly, green-chemistry-focused analytical tool, supplementary to analytical protocols involving column chromatography or planar micro-fluidic devices. Copyright © 2011 Elsevier Ltd. All rights reserved.
Fabrication of shape-memory thin films and the effects of ion irradiation
NASA Astrophysics Data System (ADS)
Goldberg, Florent
1998-09-01
Nickel and titanium, when combined in the right stoichiometric proportion (1:1), can form alloys showing the shape memory effect. Within the scope of this thesis, thin films of such alloys have been successfully produced by sputtering. Precise control of composition is crucial in order to obtain the shape memory effect. A combination of analytical tools which can accurately determine the behavior of such materials is also required (calorimetric analysis, crystallography, composition analysis, etc.). Rutherford backscattering spectrometry has been used for quantitative composition analysis. Thereafter, irradiation of the films with light ions (He+) of a few MeV was shown to allow lowering of the characteristic premartensitic transformation temperatures while preserving the shape memory effect. Those results open the door to a new field of research, particularly for ion irradiation and its potential use as a tool to modify the thermomechanical behavior of shape memory thin film actuators.
Problem Solving in a Middle School Robotics Design Classroom
NASA Astrophysics Data System (ADS)
Norton, Stephen J.; McRobbie, Campbell J.; Ginns, Ian S.
2007-07-01
Little research has been conducted on how students work when they are required to plan, build and evaluate artefacts in technology rich learning environments such as those supported by tools including flow charts, Labview programming and Lego construction. In this study, activity theory was used as an analytic tool to examine the social construction of meaning. There was a focus on the effect of teachers’ goals and the rules they enacted upon student use of the flow chart planning tool, and the tools of the programming language Labview and Lego construction. It was found that the articulation of a teacher’s goals via rules and divisions of labour helped to form distinct communities of learning and influenced the development of different problem solving strategies. The use of the planning tool flow charting was associated with continuity of approach, integration of problem solutions including appreciation of the nexus between construction and programming, and greater educational transformation. Students who flow charted defined problems in a more holistic way and demonstrated more methodical, insightful and integrated approaches to their use of tools. The findings have implications for teaching in design dominated learning environments.
Tague, Lauren; Wiggs, Justin; Li, Qianxi; McCarter, Robert; Sherwin, Elizabeth; Weinberg, Jacqueline; Sable, Craig
2018-05-17
Left ventricular hypertrophy (LVH) is a common finding on pediatric electrocardiography (ECG) leading to many referrals for echocardiography (echo). This study utilizes a novel analytics tool that combines ECG and echo databases to evaluate ECG as a screening tool for LVH. SQL Server 2012 data warehouse incorporated ECG and echo databases for all patients from a single institution from 2006 to 2016. Customized queries identified patients 0-18 years old with LVH on ECG and an echo performed within 24 h. Using data visualization (Tableau) and analytic (Stata 14) software, ECG and echo findings were compared. Of 437,699 encounters, 4637 met inclusion criteria. ECG had high sensitivity (≥ 90%) but poor specificity (43%), and low positive predictive value (< 20%) for echo abnormalities. ECG performed only 11-22% better than chance (AROC = 0.50). 83% of subjects with LVH on ECG had normal left ventricle (LV) structure and size on echo. African-Americans with LVH were least likely to have an abnormal echo. There was a low correlation between V6R on ECG and the echo-derived Z score of left ventricle diastolic diameter (r = 0.14) and LV mass index (r = 0.24). The data analytics client was able to mine a database of ECG and echo reports, comparing LVH by ECG and LV measurements and qualitative findings by echo, identifying an abnormal LV by echo in only 17% of cases with LVH on ECG. This novel tool is useful for rapid data mining for both clinical and research endeavors.
Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen
2014-07-01
Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware will be introduced that enables the applicability of this methodology for selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
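The core of the method, mapping spectra to individual protein concentrations with partial least squares (PLS) regression, can be sketched compactly. The code below uses synthetic Gaussian spectra as stand-ins for real diode-array data; all band positions and concentrations are invented for illustration, and the actual study's calibration procedure is not reproduced.

```python
# PLS-based spectral peak deconvolution sketch: train a PLS model on
# spectra of known mixtures, then predict concentrations of co-eluting
# proteins from a new spectrum, as in inline quantification.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
wavelengths = np.linspace(240, 320, 80)

def band(center, width):
    # Hypothetical pure-component spectrum (Gaussian band).
    return np.exp(-((wavelengths - center) ** 2) / (2 * width ** 2))

pure = np.vstack([band(255, 12), band(275, 10), band(290, 14)])  # 3 proteins

# Training set: random mixtures plus noise (Beer-Lambert additivity).
C_train = rng.uniform(0, 1, (60, 3))
A_train = C_train @ pure + rng.normal(0, 0.01, (60, len(wavelengths)))

pls = PLSRegression(n_components=3).fit(A_train, C_train)

# "Inline" prediction for a new co-eluting mixture.
c_true = np.array([[0.4, 0.3, 0.2]])
a_new = c_true @ pure
print("predicted concentrations:", pls.predict(a_new).round(2))
```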
Recent Methodology in Ginseng Analysis
Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill
2012-01-01
As much as the popularity of ginseng in herbal prescriptions or remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to as ginseng analysis hereafter, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112
Comprehensive characterizations of nanoparticle biodistribution following systemic injection in mice
NASA Astrophysics Data System (ADS)
Liao, Wei-Yin; Li, Hui-Jing; Chang, Ming-Yao; Tang, Alan C. L.; Hoffman, Allan S.; Hsieh, Patrick C. H.
2013-10-01
Various nanoparticle (NP) properties such as shape and surface charge have been studied in an attempt to enhance the efficacy of NPs in biomedical applications. When trying to determine the precise biodistribution of NPs within the target organs, the analytical method becomes the determining factor in measuring the precise quantity of distributed NPs. High performance liquid chromatography (HPLC) represents a more powerful tool in quantifying NP biodistribution compared to conventional analytical methods such as an in vivo imaging system (IVIS). This, in part, is due to better curve linearity offered by HPLC than IVIS. Furthermore, HPLC enables us to fully analyze each gram of NPs present in the organs without compromising the signals and the depth-related sensitivity as is the case in IVIS measurements. In addition, we found that changing physiological conditions improved large NP (200-500 nm) distribution in brain tissue. These results reveal the importance of selecting analytic tools and the physiological environment when characterizing NP biodistribution for future nanoscale toxicology, therapeutics and diagnostics. Electronic supplementary information (ESI) available. See DOI: 10.1039/c3nr03954d
NASA Astrophysics Data System (ADS)
Chen, Jui-Sheng; Li, Loretta Y.; Lai, Keng-Hsin; Liang, Ching-Ping
2017-11-01
A novel solution method is presented which leads to an analytical model for advective-dispersive transport in a semi-infinite domain involving a wide spectrum of boundary inputs, initial distributions, and zero-order productions. The method applies the Laplace transform in combination with the generalized integral transform technique (GITT) to obtain the generalized analytical solution. Based on this generalized analytical expression, we derive a comprehensive set of special-case solutions for some time-dependent boundary distributions and zero-order productions, described by the Dirac delta, constant, Heaviside, exponentially-decaying, or periodically sinusoidal functions, as well as some position-dependent initial conditions and zero-order productions specified by the Dirac delta, constant, Heaviside, or exponentially-decaying functions. The developed solutions are tested against an analytical solution from the literature. The excellent agreement between the analytical solutions confirms that the new model can serve as an effective tool for investigating transport behaviors under different scenarios. Several application examples are given to explore transport behaviors which are rarely noted in the literature. The results show that the concentration waves resulting from the periodically sinusoidal input are sensitive to the dispersion coefficient. The implication of this new finding is that a tracer test with a periodic input may provide additional information for identifying the dispersion coefficient. Moreover, the solution strategy presented in this study can be extended to derive analytical models for handling more complicated problems of solute transport in multi-dimensional media subjected to sequential decay chain reactions, for which analytical solutions are not currently available.
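The governing equation is not reproduced in the abstract; for orientation, a minimal statement of the one-dimensional advective-dispersive transport problem that models of this type typically solve is given below, with the notation assumed here (C concentration, v seepage velocity, D dispersion coefficient, γ zero-order production), together with the boundary and initial data the paper generalizes.

```latex
% One-dimensional advection-dispersion equation with zero-order production
% in a semi-infinite domain (notation assumed, not taken from the paper):
\frac{\partial C}{\partial t}
  \;=\; D\,\frac{\partial^{2} C}{\partial x^{2}}
  \;-\; v\,\frac{\partial C}{\partial x}
  \;+\; \gamma(x,t),
\qquad 0 \le x < \infty,\;\; t > 0,
% subject to a time-dependent boundary input C(0,t) = f(t) and a
% position-dependent initial distribution C(x,0) = g(x).
```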
Constraint-Referenced Analytics of Algebra Learning
ERIC Educational Resources Information Center
Sutherland, Scot M.; White, Tobin F.
2016-01-01
The development of the constraint-referenced analytics tool for monitoring algebra learning activities presented here came from the desire to firstly, take a more quantitative look at student responses in collaborative algebra activities, and secondly, to situate those activities in a more traditional introductory algebra setting focusing on…
Towards an Analytic Foundation for Network Architecture
2010-12-31
In this project, we develop the analytic tools of stochastic optimization for wireless network design and apply them ...
Related publication: ... and Mung Chiang, "DaVinci: Dynamically Adaptive Virtual Networks for a Customized Internet," in Proc. ACM SIGCOMM CoNext Conference, December 2008.
University Macro Analytic Simulation Model.
ERIC Educational Resources Information Center
Baron, Robert; Gulko, Warren
The University Macro Analytic Simulation System (UMASS) has been designed as a forecasting tool to help university administrators make budgeting decisions. Alternative budgeting strategies can be tested on a computer model and then an operational alternative can be selected on the basis of the most desirable projected outcome. UMASS uses readily…
Johner, S A; Boeing, H; Thamm, M; Remer, T
2015-12-01
The assessment of urinary excretion of specific nutrients (e.g. iodine, sodium) is frequently used to monitor a population's nutrient status. However, when only spot urines are available, there is always a risk of hydration-status-dependent dilution effects and related misinterpretations. The aim of the present study was to establish mean values of 24-h creatinine excretion widely applicable for an appropriate estimation of 24-h excretion rates of analytes from spot urines in adults. Twenty-four-hour creatinine excretion from the formerly representative cross-sectional German VERA Study (n=1463, 20-79 years old) was analysed. Linear regression analysis was performed to identify the most important influencing factors of creatinine excretion. In a subsample of the German DONALD Study (n=176, 20-29 years old), the applicability of the 24-h creatinine excretion values of VERA for the estimation of 24-h sodium and iodine excretion from urinary concentration measurements was tested. In the VERA Study, mean 24-h creatinine excretion was 15.4 mmol per day in men and 11.1 mmol per day in women, significantly dependent on sex, age, body weight and body mass index. Based on the established 24-h creatinine excretion values, mean 24-h iodine and sodium excretions could be estimated from the respective analyte/creatinine concentrations, with average deviations <10% compared with the actual 24-h means. The present mean values of 24-h creatinine excretion are suggested as a useful tool to derive realistic, hydration-status-independent average 24-h excretion rates from urinary analyte/creatinine ratios. We propose applying these creatinine reference means routinely in biomarker-based studies aiming to characterize the nutrient or metabolite status of adult populations by simply measuring metabolite/creatinine ratios in spot urines.
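The estimation step itself is a one-line calculation: scale the spot-urine analyte/creatinine concentration ratio by the sex-specific reference mean of 24-h creatinine excretion. The sketch below uses the VERA means quoted in the abstract; the spot-urine concentrations are invented for illustration.

```python
# Estimate a 24-h analyte excretion from a spot urine:
#   24-h excretion ≈ (analyte conc / creatinine conc) * mean 24-h creatinine
# Reference means (15.4 / 11.1 mmol/day) are those reported for the VERA
# Study above; example inputs are illustrative.
CREATININE_24H_MMOL = {"male": 15.4, "female": 11.1}

def estimate_24h_excretion(analyte_conc, creatinine_conc_mmol_l, sex):
    """Both concentrations from the same spot urine; returns the estimated
    24-h excretion in the analyte's units per day."""
    ratio = analyte_conc / creatinine_conc_mmol_l   # per mmol creatinine
    return ratio * CREATININE_24H_MMOL[sex]

# Example: sodium 120 mmol/L, creatinine 10 mmol/L in a spot urine.
print(estimate_24h_excretion(120.0, 10.0, "female"), "mmol sodium/day (est.)")
```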
Initiating an Online Reputation Monitoring System with Open Source Analytics Tools
NASA Astrophysics Data System (ADS)
Shuhud, Mohd Ilias M.; Alwi, Najwa Hayaati Md; Halim, Azni Haslizan Abd
2018-05-01
Online reputation is an invaluable asset for modern organizations, as it can contribute to business performance, especially in sales and profit. However, a reputation we are not aware of is difficult to maintain. Social media analytics is a new class of tool that can provide online reputation monitoring in various ways, such as sentiment analysis. As a result, numerous large-scale organizations have implemented Online Reputation Monitoring (ORM) systems. However, this solution should not be exclusive to high-income organizations, as many organizations, regardless of size and type, are now online. This research proposes an affordable and reliable ORM system built from a combination of open-source analytics tools, for both novice practitioners and academicians. We also evaluate its prediction accuracy and find that the system provides acceptable predictions (sixty percent accuracy), with its majority-polarity predictions tallying with human annotation. The proposed system can help support business decisions with flexible monitoring strategies, especially for organizations that want to initiate and administer ORM themselves at low cost.
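The abstract does not name the specific open-source components used. As one illustration of the sentiment-scoring step in such an ORM pipeline, the sketch below classifies mention polarity with NLTK's VADER analyzer, a widely used open-source option; the mention texts are invented.

```python
# Polarity scoring of online mentions with NLTK's VADER sentiment analyzer,
# a stand-in for whatever open-source component the paper's pipeline used.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)   # one-time lexicon fetch
sia = SentimentIntensityAnalyzer()

mentions = [
    "Great service, the staff resolved my issue quickly.",
    "Worst experience ever, I am never coming back.",
]
for text in mentions:
    score = sia.polarity_scores(text)["compound"]
    polarity = ("positive" if score > 0.05
                else "negative" if score < -0.05 else "neutral")
    print(f"{polarity:8s} ({score:+.2f}): {text}")
```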
Mechelke, Matthias; Herlet, Jonathan; Benz, J Philipp; Schwarz, Wolfgang H; Zverlov, Vladimir V; Liebl, Wolfgang; Kornberger, Petra
2017-12-01
The rising importance of accurately detecting oligosaccharides in biomass hydrolyzates, or as ingredients in food such as beverages and infant milk products, demands tools to sensitively analyze the broad range of available oligosaccharides. Over the last decades, HPAEC-PAD has been developed into one of the major technologies for this task and represents a popular alternative to state-of-the-art LC-MS oligosaccharide analysis. This work presents the first comprehensive study giving an overview of the separation of 38 analytes as well as enzymatic hydrolyzates of six different polysaccharides, focusing on oligosaccharides. The high sensitivity of the PAD comes at the cost of its stability, owing to recession of the gold electrode. Through an in-depth analysis of the sensitivity drop over time for 35 analytes, including xylo- (XOS), arabinoxylo- (AXOS), laminari- (LOS), manno- (MOS), glucomanno- (GMOS), and cellooligosaccharides (COS), we developed an analyte-specific one-phase decay model for this effect. Using this model resulted in significantly improved data normalization when using an internal standard. Our results thereby allow a quantification approach that takes the inevitable, analyte-specific PAD response drop into account. Graphical abstract: HPAEC-PAD analysis of oligosaccharides and determination of the PAD response drop, leading to improved data normalization.
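A one-phase decay model of the kind described can be fitted and used for drift correction in a few lines. The sketch below assumes the standard form y(t) = plateau + (y0 − plateau)·exp(−k·t); the response values are invented, not the paper's measurements.

```python
# Fit a one-phase exponential decay to PAD response across repeated runs,
# then normalize later measurements back to initial sensitivity.
import numpy as np
from scipy.optimize import curve_fit

def one_phase_decay(t, y0, plateau, k):
    """y(t) = plateau + (y0 - plateau) * exp(-k * t)"""
    return plateau + (y0 - plateau) * np.exp(-k * t)

runs = np.arange(0, 50, 5, dtype=float)      # injection index (illustrative)
resp = np.array([1.00, 0.93, 0.88, 0.84, 0.81,
                 0.79, 0.77, 0.76, 0.75, 0.75])  # relative peak area

(p_y0, p_plateau, p_k), _ = curve_fit(one_phase_decay, runs, resp,
                                      p0=(1.0, 0.7, 0.05))
# Drift-corrected responses, normalized to run-0 sensitivity:
corrected = resp / one_phase_decay(runs, p_y0, p_plateau, p_k)
print(f"fitted k = {p_k:.3f}, plateau = {p_plateau:.2f}")
print("corrected responses:", corrected.round(3))
```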
Updates in metabolomics tools and resources: 2014-2015.
Misra, Biswapriya B; van der Hooft, Justin J J
2016-01-01
Data processing and interpretation represent the most challenging and time-consuming steps in high-throughput metabolomic experiments, regardless of the analytical platforms (MS or NMR spectroscopy based) used for data acquisition. Improved machinery in metabolomics generates increasingly complex datasets that create the need for more and better processing and analysis software and in silico approaches to understand the resulting data. However, a comprehensive source of information describing the utility of the most recently developed and released metabolomics resources, in the form of tools, software, and databases, is currently lacking. Thus, here we provide an overview of freely available, open-source tools, algorithms, and frameworks to make both upcoming and established metabolomics researchers aware of the recent developments, in an attempt to advance and facilitate data processing workflows in their metabolomics research. The major topics include tools and resources for data processing, data annotation, and data visualization in MS- and NMR-based metabolomics. Most of the tools described in this review are dedicated to untargeted metabolomics workflows; however, some more specialist tools are described as well. All tools and resources described, including their analytical and computational platform dependencies, are summarized in an overview table. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Mössbauer spectroscopy in biomedical physics research
NASA Astrophysics Data System (ADS)
Zhang, Xiufang
1994-12-01
Several applications of Mössbauer spectroscopy (MS) as an analytical tool in research on biomedical physics are reviewed: (1) The evaluation of treatments for some diseases such as thalassemia, iron-overload disease, high altitude polycythemia. (2) Medical research on the effects of environmental factors on the human body, for example, the effects of electromagnetic radiation on human red blood cells (RBCs). Some advantages and weaknesses of MS, a new application of the Mössbauer effect, cancer therapy, and some possible applications such as monitoring the RBCs of the patients before, during, and after surgical operation, are discussed.
NASA Astrophysics Data System (ADS)
Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.
We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximations (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using it to predict the sensor activities for test analytes not considered in the training set for the model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict response of an existing sensing film to new target analytes.
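The QSAR step, fitting a model from molecular descriptors to measured sensor responses and predicting responses for analytes outside the training set, can be illustrated compactly. The study used Genetic Function Approximation for descriptor selection; plain least squares stands in here, and all descriptor values and responses below are synthetic.

```python
# QSAR-style sensor response model sketch: descriptors -> response.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)

# Rows: training analytes; columns: hypothetical descriptors (e.g., molar
# volume, polarizability, a film-analyte interaction term).
X_train = rng.normal(size=(12, 3))
true_w = np.array([0.8, -0.3, 1.2])              # synthetic ground truth
y_train = X_train @ true_w + rng.normal(0, 0.05, 12)  # dR/R responses

model = LinearRegression().fit(X_train, y_train)

# Predict the film's response to analytes left out of training, mirroring
# the external validation described above.
X_test = rng.normal(size=(2, 3))
print("predicted responses:", model.predict(X_test).round(3))
```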
Della Pelle, Flavio; Compagnone, Dario
2018-02-04
Polyphenolic compounds (PCs) have received exceptional attention at the end of the past millennium and as much at the beginning of the new one. Undoubtedly, these compounds in foodstuffs provide added value for their well-known health benefits, for their technological role, and also for marketing. Many efforts have been made to provide simple, effective and user-friendly analytical methods for the determination and antioxidant capacity (AOC) evaluation of food polyphenols. In a parallel track, over the last twenty years, nanomaterials (NMs) have made their entry into the analytical chemistry domain; NMs have, in fact, opened new paths for the development of analytical methods with the common aim of improving analytical performance and sustainability, becoming new tools in the quality assurance of food and beverages. The aim of this review is to provide information on the most recent developments of new NM-based tools and strategies for total polyphenol (TP) determination and AOC evaluation in food. Optical, electrochemical and bioelectrochemical approaches are reviewed. The use of nanoparticles, quantum dots, carbon nanomaterials and hybrid materials for the detection of polyphenols is the main subject of the works reported. Particular attention has been paid to the success of the application in real samples, in addition to the NMs themselves. In particular, the discussion focuses on methods and devices presenting, in the opinion of the authors, clear advances in the field in terms of simplicity, rapidity and usability. This review aims to demonstrate how NM-based approaches represent valid alternatives to classical methods for polyphenol analysis, and are mature enough to be integrated into rapid assessment of food quality in the lab or directly in the field. PMID:29401719
3D FEM Simulation of Flank Wear in Turning
NASA Astrophysics Data System (ADS)
Attanasio, Aldo; Ceretti, Elisabetta; Giardini, Claudio
2011-05-01
This work deals with tool wear simulation. Studying the influence of tool wear on tool life, tool substitution policy, final part quality, surface integrity, cutting forces and power consumption is important for reducing the global process costs. Adhesion, abrasion, erosion, diffusion, corrosion and fracture are some of the phenomena responsible for tool wear, depending on the selected cutting parameters: cutting velocity, feed rate, depth of cut, and so on. In some cases these wear mechanisms are described by analytical models as a function of process variables (temperature, pressure and sliding velocity along the cutting surface). These analytical models are suitable for implementation in FEM codes and can be utilized to simulate tool wear. In the present paper a commercial 3D FEM software has been customized to simulate tool wear during turning operations when cutting AISI 1045 carbon steel with an uncoated tungsten carbide tip. The FEM software was improved by means of a subroutine able to modify the tool geometry on the basis of the estimated tool wear as the simulation goes on. Since, for the considered tool-workpiece material pair, the main wear-generating phenomena are abrasive and diffusive, the tool wear model implemented in the subroutine was obtained as a combination of Usui's model and Takeyama and Murata's model. A comparison between experimental and simulated flank tool wear curves is reported, demonstrating that it is possible to simulate tool wear development.
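For orientation, the Usui component of such a combined model gives a local wear rate of the form dW/dt = A·p·v_s·exp(−B/T), driven by contact pressure p, sliding velocity v_s, and absolute temperature T. The sketch below shows how a wear-update subroutine might apply it per node and per time increment; the constants and nodal values are placeholders, not calibrated data, and the diffusive term of the paper's combined model is omitted.

```python
# Nodal tool-wear update sketch based on Usui's wear-rate equation.
import math

A, B = 1.0e-8, 2.5e3        # material-pair constants (illustrative only)

def usui_wear_rate(p_mpa, v_s_m_per_min, T_kelvin):
    """dW/dt = A * p * v_s * exp(-B / T); units follow the inputs."""
    return A * p_mpa * v_s_m_per_min * math.exp(-B / T_kelvin)

dt = 0.05                   # cutting time per geometry-update increment, s
# Hypothetical per-node (pressure MPa, sliding velocity m/min, temperature K)
# extracted from the FEM solution at this increment:
nodes = {"n1": (900.0, 150.0, 1100.0), "n2": (650.0, 140.0, 950.0)}

for node, (p, v, T) in nodes.items():
    dW = usui_wear_rate(p, v, T) * dt    # wear-depth increment at this node
    print(f"{node}: dW = {dW:.3e} (tool surface moved inward by dW)")
```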
Failure Modes and Effects Analysis (FMEA): A Bibliography
NASA Technical Reports Server (NTRS)
2000-01-01
Failure modes and effects analysis (FMEA) is a bottom-up analytical process that identifies process hazards, which helps managers understand vulnerabilities of systems, as well as assess and mitigate risk. It is one of several engineering tools and techniques available to program and project managers aimed at increasing the likelihood of safe and successful NASA programs and missions. This bibliography references 465 documents in the NASA STI Database that contain the major concepts, failure modes or failure analysis, in either the basic index of the major subject terms.
Stream Lifetimes Against Planetary Encounters
NASA Technical Reports Server (NTRS)
Valsecchi, G. B.; Lega, E.; Froeschle, Cl.
2011-01-01
We study, both analytically and numerically, the perturbation induced by an encounter with a planet on a meteoroid stream. Our analytical tool is the extension of Öpik's theory of close encounters, which we apply to streams described by geocentric variables. The resulting formulae are used to compute the rate at which a stream is dispersed by planetary encounters into the sporadic background. We have verified the accuracy of the analytical model using a numerical test.
Comparative analytics of infusion pump data across multiple hospital systems.
Catlin, Ann Christine; Malloy, William X; Arthur, Karen J; Gaston, Cindy; Young, James; Fernando, Sudheera; Fernando, Ruchith
2015-02-15
A Web-based analytics system for conducting in-house evaluations and cross-facility comparisons of alert data generated by smart infusion pumps is described. The Infusion Pump Informatics (IPI) project, a collaborative effort led by research scientists at Purdue University, was launched in 2009 to provide advanced analytics and tools for workflow analyses to assist hospitals in determining the significance of smart-pump alerts and reducing nuisance alerts. The IPI system allows facility-specific analyses of alert patterns and trends, as well as cross-facility comparisons of alert data uploaded by more than 55 participating institutions using different types of smart pumps. Tools accessible through the IPI portal include (1) charts displaying aggregated or breakout data on the top drugs associated with alerts, numbers of alerts per device or care area, and override-to-alert ratios, (2) investigative reports that can be used to characterize and analyze pump-programming errors in a variety of ways (e.g., by drug, by infusion type, by time of day), and (3) "drill-down" workflow analytics enabling users to evaluate alert patterns, both internally and in relation to patterns at other hospitals, in a quick and efficient stepwise fashion. The formation of the IPI analytics system to support a community of hospitals has been successful in providing sophisticated tools for member facilities to review, investigate, and efficiently analyze smart-pump alert data, not only within a member facility but also across other member facilities, to further enhance smart pump drug library design. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Freak oscillation in a dusty plasma.
Zhang, Heng; Yang, Yang; Hong, Xue-Ren; Qi, Xin; Duan, Wen-Shan; Yang, Lei
2017-05-01
The freak oscillation in a one-dimensional dusty plasma is studied numerically by the particle-in-cell method. Using a perturbation method, the basic set of fluid equations is reduced to a nonlinear Schrödinger equation (NLSE). The rational solution of the NLSE is presented, which is proposed as an effective tool for studying rogue waves in dusty plasma. Additionally, the range of applicability of the analytical rogue-wave solution described by the NLSE is given.
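The rational solution is not written out in the abstract; with a standard normalization of the NLSE it is the Peregrine breather, a textbook form given below for orientation. The normalization is an assumption about the paper's conventions, not taken from it.

```latex
% With the NLSE normalized as
%   i\,\psi_t + \tfrac{1}{2}\,\psi_{xx} + |\psi|^{2}\psi = 0,
% the rational (Peregrine) solution reads
\psi(x,t) \;=\; \left[\, 1 \;-\; \frac{4\,(1 + 2\,i t)}{1 + 4x^{2} + 4t^{2}} \,\right] e^{\,i t},
% a wave localized in both x and t that peaks at three times the background
% amplitude, the standard analytical prototype of a rogue wave.
```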
2012-03-01
Simulation is a flexible tool for modeling airport operations, which has made the method a staple for airport systems analysts. Animation ... can "be derived to define the characteristics of the airport terminal and describe the nature of the systems [sic] operation", which makes discrete ... This system decomposition method, however, disregards the effects of network structure on performance measures. Real-life processes do not operate ...
NASA Technical Reports Server (NTRS)
Barber, Peter W.; Demerdash, Nabeel A. O.; Hurysz, B.; Luo, Z.; Denny, Hugh W.; Millard, David P.; Herkert, R.; Wang, R.
1992-01-01
The goal of this research project was to analyze the potential effects of electromagnetic interference (EMI) originating from power system processing and transmission components for Space Station Freedom. The approach consists of four steps: (1) developing analytical tools (models and computer programs); (2) conducting parameterization (what if?) studies; (3) predicting the global space station EMI environment; and (4) providing a basis for modification of EMI standards.
Planning effectiveness may grow on fault trees.
Chow, C W; Haddad, K; Mannino, B
1991-10-01
The first step of a strategic planning process--identifying and analyzing threats and opportunities--requires subjective judgments. By using an analytical tool known as a fault tree, healthcare administrators can reduce the unreliability of subjective decision making by creating a logical structure for problem solving and decision making. A case study of 11 healthcare administrators showed that an analysis technique called prospective hindsight can add to a fault tree's ability to improve a strategic planning process.
Quantitative characterization of edge enhancement in phase contrast x-ray imaging.
Monnin, P; Bulling, S; Hoszowska, J; Valley, J F; Meuli, R; Verdun, F R
2004-06-01
The aim of this study was to model the edge enhancement effect in in-line holography phase contrast imaging. A simple analytical approach was used to quantify refraction and interference contrasts in terms of beam energy and imaging geometry. The model was applied to predict the peak intensity and frequency of the edge enhancement for images of cylindrical fibers. The calculations were compared with measurements, and the relationship between the spatial resolution of the detector and the amplitude of the phase contrast signal was investigated. Calculations using the analytical model were in good agreement with experimental results for nylon, aluminum and copper wires of 50 to 240 μm diameter, and with numerical simulations based on Fresnel-Kirchhoff theory. A relationship between the defocusing distance and the pixel size of the image detector was established. This analytical model is a useful tool for optimizing imaging parameters in phase contrast in-line holography, including defocusing distance, detector resolution and beam energy.
NASA Astrophysics Data System (ADS)
Qin, Ting; Liao, Congwei; Huang, Shengxiang; Yu, Tianbao; Deng, Lianwen
2018-01-01
An analytical drain current model based on the surface potential is proposed for amorphous indium gallium zinc oxide (a-InGaZnO) thin-film transistors (TFTs) with a synchronized symmetric dual-gate (DG) structure. Solving the electric field, surface potential (φS), and central potential (φ0) of the InGaZnO film using the Poisson equation with the Gaussian method and Lambert function is demonstrated in detail. The compact analytical model of current-voltage behavior, which consists of drift and diffusion components, is investigated by regional integration, and voltage-dependent effective mobility is taken into account. Comparison results demonstrate that the calculation results obtained using the derived models match well with the simulation results obtained using a technology computer-aided design (TCAD) tool. Furthermore, the proposed model is incorporated into SPICE simulations using Verilog-A to verify the feasibility of using DG InGaZnO TFTs for high-performance circuit designs.
NASA Astrophysics Data System (ADS)
Pinho, Ludmila A. G.; Sá-Barreto, Lívia C. L.; Infante, Carlos M. C.; Cunha-Filho, Marcílio S. S.
2016-04-01
The aim of this work was the development of an analytical procedure using spectrophotometry for the simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. To achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law to the absorbances of BNZ and ITZ at the wavelengths of 259 and 321 nm, respectively. Diverse tests were carried out for the development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity to quantify small amounts of the analytes, enabling its application for various analytical purposes, such as dissolution testing and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures was shown to be an efficient and cost-effective alternative for the determination of these drugs in a pharmaceutical dosage form.
Pinho, Ludmila A G; Sá-Barreto, Lívia C L; Infante, Carlos M C; Cunha-Filho, Marcílio S S
2016-04-15
The aim of this work was the development of an analytical procedure using spectrophotometry for the simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. To achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law to the absorbances of BNZ and ITZ at the wavelengths of 259 and 321 nm, respectively. Diverse tests were carried out for the development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity to quantify small amounts of the analytes, enabling its application for various analytical purposes, such as dissolution testing and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures was shown to be an efficient and cost-effective alternative for the determination of these drugs in a pharmaceutical dosage form. Copyright © 2016. Published by Elsevier B.V.
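The two-analyte, two-wavelength scheme reduces to solving a 2×2 linear system: each measured absorbance is the sum of the two analytes' contributions at that wavelength. The sketch below shows the arithmetic; the absorptivity matrix and measured absorbances are illustrative placeholders, not the paper's validated calibration.

```python
# Two-component Lambert-Beer analysis of mixtures: absorbances at 259 nm
# and 321 nm are linear combinations of the BNZ and ITZ contributions, so
# measurements at two wavelengths determine both concentrations.
import numpy as np

# E[i, j]: absorptivity of analyte j (BNZ, ITZ) at wavelength i (259, 321 nm),
# in L/(mg*cm); 1 cm path length assumed. Values are illustrative.
E = np.array([[0.060, 0.012],
              [0.008, 0.045]])

A = np.array([0.52, 0.31])        # measured absorbances at 259 and 321 nm
c = np.linalg.solve(E, A)         # concentrations in mg/L
print(f"BNZ = {c[0]:.1f} mg/L, ITZ = {c[1]:.1f} mg/L")
```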
NASA Technical Reports Server (NTRS)
Mukhopadhyay, A. K.
1975-01-01
Linear frequency domain methods are inadequate in analyzing the 1975 Viking Orbiter (VO75) digital tape recorder servo due to dominant nonlinear effects such as servo signal limiting, unidirectional servo control, and static/dynamic Coulomb friction. The frequency loop (speed control) servo of the VO75 tape recorder is used to illustrate the analytical tools and methodology of system redundancy elimination and high order transfer function verification. The paper compares time-domain performance parameters derived from a series of nonlinear time responses with the available experimental data in order to select the best possible analytical transfer function representation of the tape transport (mechanical segment of the tape recorder) from several possible candidates. The study also shows how an analytical time-response simulation taking into account most system nonlinearities can pinpoint system redundancy and overdesign stemming from a strictly empirical design approach. System order reduction is achieved through truncation of individual transfer functions and elimination of redundant blocks.
Factor analytic tools such as principal component analysis (PCA) and positive matrix factorization (PMF), suffer from rotational ambiguity in the results: different solutions (factors) provide equally good fits to the measured data. The PMF model imposes non-negativity of both...
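As a runnable illustration of the non-negativity constraint mentioned above, the sketch below uses scikit-learn's NMF as a stand-in for PMF; true PMF additionally weights residuals by measurement uncertainties, which NMF does not. The synthetic data mimic a (samples × chemical species) concentration matrix generated by two sources.

```python
# Non-negative matrix factorization sketch related to the PMF idea:
# decompose X (samples x species) into non-negative source contributions
# and source profiles.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)
true_profiles = rng.random((2, 6))              # 2 sources, 6 species
true_contrib = rng.random((40, 2))              # 40 samples
X = true_contrib @ true_profiles + 0.01 * rng.random((40, 6))

model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
contributions = model.fit_transform(X)          # non-negative scores
profiles = model.components_                    # non-negative factors
print("recovered source profiles:\n", profiles.round(2))
```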
ERIC Educational Resources Information Center
Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew
2015-01-01
Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…
Microprocessor Based Temperature Control of Liquid Delivery with Flow Disturbances.
ERIC Educational Resources Information Center
Kaya, Azmi
1982-01-01
Discusses analytical design and experimental verification of a PID control value for a temperature controlled liquid delivery system, demonstrating that the analytical design techniques can be experimentally verified by using digital controls as a tool. Digital control instrumentation and implementation are also demonstrated and documented for…
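For readers who want the mechanics of such a loop, the sketch below simulates a discrete PID controller holding a first-order thermal plant at setpoint through a step flow disturbance. The gains, plant model, and disturbance are illustrative assumptions, not the design verified in the article.

```python
# Minimal discrete PID temperature-control loop with a flow disturbance.
kp, ki, kd = 2.0, 0.8, 0.1          # illustrative gains
dt, setpoint = 0.1, 50.0            # time step (s) and target temp (C)
temp, integral, prev_err = 20.0, 0.0, setpoint - 20.0

for step in range(200):
    err = setpoint - temp
    integral += err * dt
    deriv = (err - prev_err) / dt
    u = kp * err + ki * integral + kd * deriv   # valve/heater command
    prev_err = err

    # First-order plant: dT/dt = (-T + 20 + 0.5*u + load) / tau, tau = 2 s.
    # A sustained flow disturbance (extra cooling load) starts at t = 10 s.
    load = -10.0 if step >= 100 else 0.0
    temp += dt * ((-temp + 20.0 + 0.5 * u + load) / 2.0)
    if step % 40 == 0:
        print(f"t = {step*dt:4.1f} s  T = {temp:5.1f} C  u = {u:6.1f}")
```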
A thermal biosensor based on enzyme reaction.
Zheng, Yi-Hua; Hua, Tse-Chao; Xu, Fei
2005-01-01
Application of the thermal biosensor as an analytical tool is promising due to advantages such as universality, simplicity and quick response. A novel thermal biosensor based on an enzyme reaction has been developed. This biosensor is a flow injection analysis system and consists of two channels with an enzyme reaction column and a reference column. The reference column, which serves to eliminate unspecific heat, is inactive towards the specific enzyme reaction of the ingredient to be detected. The specific enzyme reaction takes place in the enzyme reaction column at a constant temperature maintained by a thermoelectric thermostat. A thermal sensor based on a thermoelectric module containing 127 serial BiTe thermocouples is used to monitor the temperature difference between the two streams from the enzyme reaction column and the reference column. The analytical example for dichlorvos shows that this biosensor can be used as an analytical tool in medicine and biology.
Optical Drug Monitoring: Photoacoustic Imaging of Nanosensors to Monitor Therapeutic Lithium In Vivo
Cash, Kevin J.; Li, Chiye; Xia, Jun; Wang, Lihong V.; Clark, Heather A.
2015-01-01
Personalized medicine could revolutionize how primary care physicians treat chronic disease and how researchers study fundamental biological questions. To realize this goal we need to develop more robust, modular tools and imaging approaches for in vivo monitoring of analytes. In this report, we demonstrate that synthetic nanosensors can measure physiologic parameters with photoacoustic contrast, and we apply that platform to continuously track lithium levels in vivo. Photoacoustic imaging achieves imaging depths that are unattainable with fluorescence or multiphoton microscopy. We validated the photoacoustic results that illustrate the superior imaging depth and quality of photoacoustic imaging with optical measurements. This powerful combination of techniques will unlock the ability to measure analyte changes in deep tissue and will open up photoacoustic imaging as a diagnostic tool for continuous physiological tracking of a wide range of analytes. PMID:25588028
Cash, Kevin J; Li, Chiye; Xia, Jun; Wang, Lihong V; Clark, Heather A
2015-02-24
Personalized medicine could revolutionize how primary care physicians treat chronic disease and how researchers study fundamental biological questions. To realize this goal, we need to develop more robust, modular tools and imaging approaches for in vivo monitoring of analytes. In this report, we demonstrate that synthetic nanosensors can measure physiologic parameters with photoacoustic contrast, and we apply that platform to continuously track lithium levels in vivo. Photoacoustic imaging achieves imaging depths that are unattainable with fluorescence or multiphoton microscopy. We validated the photoacoustic results that illustrate the superior imaging depth and quality of photoacoustic imaging with optical measurements. This powerful combination of techniques will unlock the ability to measure analyte changes in deep tissue and will open up photoacoustic imaging as a diagnostic tool for continuous physiological tracking of a wide range of analytes.
Using Google Analytics to evaluate the impact of the CyberTraining project.
McGuckin, Conor; Crowley, Niall
2012-11-01
A focus on results and impact should be at the heart of every project's approach to research and dissemination. This article discusses the potential of Google Analytics (GA: http://google.com/analytics) as an effective resource for measuring the impact of academic research output and understanding the geodemographics of users of specific Web 2.0 content (e.g., intervention and prevention materials, health promotion and advice). This article presents the results of GA analyses as a resource used in measuring the impact of the EU-funded CyberTraining project, which provided a well-grounded, research-based training manual on cyberbullying for trainers through the medium of a Web-based eBook (www.cybertraining-project.org). The training manual includes review information on cyberbullying and its nature and extent across Europe, analyses of current projects, and resources for trainers working with the target groups of pupils, parents, teachers, and other professionals. Results illustrate the promise of GA as an effective tool for measuring the impact of academic research and project output, with real potential for tracking and understanding intra- and intercountry regional variations in the uptake of prevention and intervention materials, thus enabling precision focusing of attention on those regions.
NASA Astrophysics Data System (ADS)
Kaur, Parneet; Singh, Sukhwinder; Garg, Sushil; Harmanpreet
2010-11-01
In this paper we study classification algorithms for farm decision support systems (DSS). Results are obtained by applying classification algorithms (Limited Search, ID3, CHAID, C4.5, Improved C4.5, and One-vs-All Decision Tree) to a common crop data set with a specified class. The tool used to derive the results is SPINA. The graphical results obtained from the tool are compared to suggest the best technique for developing a farm decision support system. This analysis should help researchers design an effective and fast DSS that farmers can use to make decisions for enhancing their yield.
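A comparison of this kind can be approximated with scikit-learn's CART learner by switching the split criterion between Gini and entropy (the latter being the measure used by the ID3/C4.5 family); the crop data below are a synthetic stand-in, and the SPINA tool itself is not reproduced.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a crop dataset: features such as rainfall, pH, N, K
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # hypothetical "crop class"

for criterion in ("gini", "entropy"):           # CART vs. ID3/C4.5-style splits
    tree = DecisionTreeClassifier(criterion=criterion, max_depth=4, random_state=0)
    scores = cross_val_score(tree, X, y, cv=5)  # 5-fold cross-validated accuracy
    print(f"{criterion}: mean accuracy {scores.mean():.3f}")
```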
Structural considerations for fabrication and mounting of the AXAF HRMA optics
NASA Technical Reports Server (NTRS)
Cohen, Lester M.; Cernoch, Larry; Mathews, Gary; Stallcup, Michael
1990-01-01
A methodology is described which minimizes optics distortion in the fabrication, metrology, and launch configuration phases. The significance of finite element modeling and breadboard testing is described with respect to performance analyses of support structures and material effects in NASA's AXAF X-ray optics. The paper outlines the requirements for AXAF performance, optical fabrication, metrology, and glass support fixtures, as well as the specifications for mirror sensitivity and the high-resolution mirror assembly. Analytical modeling of the tools is shown to coincide with grinding and polishing experiments, and is useful for designing large-area polishing and grinding tools. Metrological subcomponents that have undergone initial testing show evidence of meeting force requirements.
Self-actuating and self-diagnosing plastically deforming piezo-composite flapping wing MAV
NASA Astrophysics Data System (ADS)
Harish, Ajay B.; Harursampath, Dineshkumar; Mahapatra, D. Roy
2011-04-01
In this work, we propose a constitutive model to describe the behavior of a Piezoelectric Fiber Reinforced Composite (PFRC) material consisting of an elasto-plastic matrix reinforced by strong elastic piezoelectric fibers. Computational efficiency is achieved using analytical solutions for the elastic stiffness matrix derived from Variational Asymptotic Methods (VAM). This is extended to provide Structural Health Monitoring (SHM) based on plasticity-induced degradation of the flapping frequency of the PFRC. Overall, this work provides an effective mathematical tool that can be used for structural self-health monitoring of plasticity-induced flapping degradation of PFRC flapping wing MAVs. The developed tool can be re-calibrated to also provide SHM for other forms of failure such as fatigue and matrix cracking.
Emergent FDA biodefense issues for microarray technology: process analytical technology.
Weinberg, Sandy
2004-11-01
A successful biodefense strategy relies upon some combination of four approaches. A nation can protect its troops and citizenry first by advanced mass vaccination, second by responsive ring vaccination, and third by post-exposure therapeutic treatment (including vaccine therapies). Finally, protection can be achieved by rapid detection followed by exposure limitation (suits and air filters) or immediate treatment (e.g., antibiotics, rapid vaccines, and iodine pills). All of these strategies rely upon or are enhanced by microarray technologies. Microarrays can be used to screen, engineer, and test vaccines. They are also used to construct early detection tools. While effective biodefense utilizes a variety of tactical tools, microarray technology is a valuable arrow in that quiver.
Current and future bioanalytical approaches for stroke assessment.
Pullagurla, Swathi R; Baird, Alison E; Adamski, Mateusz G; Soper, Steven A
2015-01-01
Efforts are underway to develop novel platforms for stroke diagnosis to meet the criteria for effective treatment within the narrow time window mandated by the FDA-approved therapeutic (<3 h). Blood-based biomarkers could be used for rapid stroke diagnosis and, coupled with new analytical tools, could serve as an attractive platform for managing stroke-related diseases. In this review, we will discuss the physiological processes associated with stroke and current diagnostic tools, as well as their associated shortcomings. We will then review information on blood-based biomarkers and various detection technologies. In particular, point-of-care testing that permits the small blood volumes required for the analysis and rapid turnaround-time measurements of multiple markers will be presented.
Hunt, R.J.; Anderson, M.P.; Kelson, V.A.
1998-01-01
This paper demonstrates that analytic element models have potential as powerful screening tools that can facilitate or improve calibration of more complicated finite-difference and finite-element models. We demonstrate how a two-dimensional analytic element model was used to identify errors in a complex three-dimensional finite-difference model caused by incorrect specification of boundary conditions. An improved finite-difference model was developed using boundary conditions developed from a far-field analytic element model. Calibration of a revised finite-difference model was achieved using fewer zones of hydraulic conductivity and lake bed conductance than the original finite-difference model. Calibration statistics were also improved in that simulated base-flows were much closer to measured values. The improved calibration is due mainly to improved specification of the boundary conditions made possible by first solving the far-field problem with an analytic element model.
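The analytic element idea is simple to sketch: closed-form solutions (here a uniform regional flow plus pumping wells) are superposed to give a far-field discharge potential that can be evaluated anywhere, for example along the boundary of a local finite-difference grid. All values below are hypothetical.

```python
import numpy as np

# Analytic elements: uniform regional flow plus pumping wells (hypothetical values)
Q0 = 0.5                           # regional discharge per unit width (m^2/d)
wells = [((100.0, 0.0), 500.0),    # ((x, y) of well, pumping rate Q in m^3/d)
         ((-50.0, 80.0), 250.0)]

def potential(x, y):
    """Discharge potential from superposed analytic elements."""
    phi = -Q0 * x                          # uniform-flow element
    for (xw, yw), q in wells:
        r = np.hypot(x - xw, y - yw)       # distance to the well
        phi += q / (2 * np.pi) * np.log(r) # well element (valid away from r = 0)
    return phi

# Evaluate the far-field solution along a boundary of a local grid model
for xb in (-200.0, 0.0, 200.0):
    print(f"Phi({xb:6.1f}, 0) = {potential(xb, 0.0):8.2f}")
```

Evaluating such a superposition at grid-boundary nodes is one way a far-field analytic element solution can supply boundary conditions to a local numerical model, in the spirit of the workflow described above.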
Data Visualization: An Exploratory Study into the Software Tools Used by Businesses
ERIC Educational Resources Information Center
Diamond, Michael; Mattia, Angela
2017-01-01
Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…
Data Visualization: An Exploratory Study into the Software Tools Used by Businesses
ERIC Educational Resources Information Center
Diamond, Michael; Mattia, Angela
2015-01-01
Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…
INVESTIGATION OF CE/LIF AS A TOOL IN THE ...
The investigation of emerging contaminant issues is a proactive effort in environmental analysis. As a part of this effort, sewage effluent is of current analytical interest because of the presence of pharmaceuticals, their metabolites, and personal care products. The environmental impact of these components is still under investigation, but their constant perfusion into receiving waters and their potential effect on biota is of concern. This paper examines a tool for the characterization of sewage effluent using capillary electrophoresis/laser-induced fluorescence (CE/LIF) with a frequency-doubled laser operated in the ultraviolet (UV). Fluorescent acidic analytes are targeted because they present special problems for techniques such as gas chromatography/mass spectrometry (GC/MS) but are readily accessible to CE/LIF. As an example of the application of this tool, salicylic acid is determined near the 100 ng/L level in sewage effluent. Salicylic acid, a metabolite of various analgesics that is relatively stable in the environment, is a common contaminant of municipal sewage systems. Salicylic acid was recovered from freshly collected samples of the effluent by liquid-liquid extraction as part of a broad characterization effort. Confirmation of identity was by electron ionization GC/MS after conversion of the salicylic acid to the methyl ester by means of trimethylsilyldiazomethane. CE/LIF in the UV has revealed more than 50 individual peaks in the extract and a bac…
Lateral conduction effects on heat-transfer data obtained with the phase-change paint technique
NASA Technical Reports Server (NTRS)
Maise, G.; Rossi, M. J.
1974-01-01
A computerized tool, CAPE (Conduction Analysis Program using Eigenvalues), has been developed to account for lateral heat conduction in wind tunnel models in the data reduction of the phase-change paint technique. The tool also accounts for the effects of finite thickness (thin wings) and surface curvature. A special reduction procedure using just one time of melt is also possible on leading edges. A novel iterative numerical scheme was used, with discretized spatial coordinates but analytic integration in time, to solve the inverse conduction problem involved in the data reduction. A yes-no chart is provided which tells the test engineer when various corrections are large enough that CAPE should be used. The accuracy of the phase-change paint technique in the presence of finite thickness and lateral conduction is also investigated.
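For context, the correction-free baseline that CAPE extends is the classical one-dimensional semi-infinite-slab reduction, which inverts the observed melt time for the heat-transfer coefficient; a sketch with hypothetical material and test values:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import erfc

# 1-D semi-infinite-slab phase-change-paint reduction (hypothetical values).
# Surface temperature ratio: theta = 1 - exp(beta^2) * erfc(beta),
# with beta = h * sqrt(t / (rho * c * k)).
RHO_C_K = 8.0e5                           # rho*c*k of the model material (SI)
T_I, T_AW, T_MELT = 300.0, 600.0, 390.0   # initial, adiabatic-wall, paint melt (K)
t_melt = 12.0                             # observed time of melt (s)

def theta(beta):
    return 1.0 - np.exp(beta**2) * erfc(beta)

target = (T_MELT - T_I) / (T_AW - T_I)
beta = brentq(lambda b: theta(b) - target, 1e-6, 10.0)  # invert theta(beta)
h = beta * np.sqrt(RHO_C_K / t_melt)
print(f"heat-transfer coefficient h = {h:.1f} W/m^2-K")
```

CAPE's contribution, per the abstract, is correcting this one-dimensional picture for lateral conduction, finite thickness, and curvature.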
Process monitoring and visualization solutions for hot-melt extrusion: a review.
Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas
2014-02-01
Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and have been thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour and (2) monitoring and analysing critical product and process parameters for process control, making it possible to maintain a desired process state and guarantee the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and discusses techniques that have been used in polymer extrusion that have potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.
METABOLOMICS AS A DIAGNOSTIC TOOL FOR SMALL FISH TOXICOLOGY RESEARCH
Metabolomics involves the application of advanced analytical and statistical tools to profile changes in levels of endogenous metabolites in tissues and biofluids resulting from disease onset or stress. While certain metabolites are being specifically targeted in these studies, w...
Analytical tools for identifying bicycle route suitability, coverage, and continuity.
DOT National Transportation Integrated Search
2012-05-01
This report presents new tools created to assess bicycle suitability using geographic information systems (GIS). Bicycle suitability is a rating of how appropriate a roadway is for bicycle travel based on attributes of the roadway, such as vehi...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramanathan, Arvind; Pullum, Laura L; Steed, Chad A
In this position paper, we describe the design and implementation of the Oak Ridge Bio-surveillance Toolkit (ORBiT): a collection of novel statistical and machine learning tools implemented for (1) integrating heterogeneous traditional (e.g. emergency room visits, prescription sales data, etc.) and non-traditional (social media such as Twitter and Instagram) data sources, (2) analyzing large-scale datasets and (3) presenting the results from the analytics as a visual interface for the end-user to interact and provide feedback. We present examples of how ORBiT can be used to summarize extremely large-scale datasets effectively and how user interactions can translate into the data analytics process for bio-surveillance. We also present a strategy to estimate parameters relevant to disease spread models from near real time data feeds and show how these estimates can be integrated with disease spread models for large-scale populations. We conclude with a perspective on how integrating data and visual analytics could lead to better forecasting and prediction of disease spread as well as improved awareness of disease susceptible regions.
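A toy version of the parameter-estimation strategy mentioned above: fit an exponential growth rate to a case-count feed, convert it to a transmission rate under an assumed recovery rate, and drive a minimal SIR model. All numbers are synthetic, and this is not ORBiT's actual estimator.

```python
import numpy as np

# Fit an exponential growth rate to a (synthetic) daily case-count feed,
# then drive a minimal SIR model with the implied transmission rate.
cases = np.array([3, 4, 6, 8, 12, 17, 25, 36])               # hypothetical feed
r = np.polyfit(np.arange(cases.size), np.log(cases), 1)[0]   # growth rate (1/day)

gamma = 1.0 / 7.0            # assumed recovery rate (1/day)
beta = r + gamma             # early-epidemic relation: r = beta - gamma
S, I, R = 0.999, 0.001, 0.0  # fractions of the population
peak = I
for _ in range(365):         # daily Euler steps of the SIR equations
    new_inf = beta * S * I
    rec = gamma * I
    S, I, R = S - new_inf, I + new_inf - rec, R + rec
    peak = max(peak, I)

print(f"r = {r:.3f}/day, beta = {beta:.3f}/day, peak infected fraction = {peak:.3f}")
```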
Hsu, Yen-Michael S; Burnham, Carey-Ann D
2014-06-01
Matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) has emerged as a tool for identifying clinically relevant anaerobes. We evaluated the analytical performance characteristics of the Bruker Microflex with Biotyper 3.0 software system for identification of anaerobes and examined the impact of direct formic acid (FA) treatment and other pre-analytical factors on MALDI-TOF MS performance. A collection of 101 anaerobic bacteria was evaluated, including Clostridium spp., Propionibacterium spp., Fusobacterium spp., Bacteroides spp., and other anaerobic bacteria of clinical relevance. The results of our study indicate that an on-target extraction with 100% FA improves the rate of accurate identification without introducing misidentification (P<0.05). In addition, we modified the reporting cutoffs for the Biotyper "score" that yield an acceptable identification, and found that a score of ≥1.700 can maximize the rate of identification. Of interest, MALDI-TOF MS can correctly identify anaerobes grown in suboptimal conditions, such as on selective culture media and following oxygen exposure. In conclusion, we report on a number of simple and cost-effective pre- and post-analytical modifications that could enhance MALDI-TOF MS identification of anaerobic bacteria. Copyright © 2014 Elsevier Inc. All rights reserved.
Kumar, Vijay; Taylor, Michael K; Mehrotra, Amit; Stagner, William C
2013-06-01
Focused beam reflectance measurement (FBRM) was used as a process analytical technology tool to perform inline real-time particle size analysis of a proprietary granulation manufactured using a continuous twin-screw granulation-drying-milling process. A significant relationship between D20, D50, and D80 length-weighted chord length and sieve particle size was observed, with a p value of <0.0001 and R(2) of 0.886. A central composite response surface statistical design was used to evaluate the effect of granulator screw speed and Comil® impeller speed on the length-weighted chord length distribution (CLD) and the particle size distribution (PSD) determined by FBRM and nested sieve analysis, respectively. The effects of granulator speed and mill speed on bulk density, tapped density, Compressibility Index, and Flowability Index were also investigated. An inline FBRM probe, placed below the Comil®, generated chord lengths and CLD data at designated times. The collection of the milled samples for sieve analysis and PSD evaluation was coordinated with the timing of the FBRM determinations. Both FBRM and sieve analysis resulted in similar bimodal distributions for all ten manufactured batches studied. Within the experimental space studied, the granulator screw speed (650-850 rpm) and Comil® impeller speed (1,000-2,000 rpm) did not have a significant effect on CLD, PSD, bulk density, tapped density, Compressibility Index, or Flowability Index (p value > 0.05).
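Length-weighted percentiles such as D20, D50, and D80 are straightforward to compute from a chord-length histogram; a sketch with hypothetical bin data:

```python
import numpy as np

# Length-weighted percentiles (D20/D50/D80) from a chord-length histogram
bins = np.array([10, 20, 40, 80, 160, 320, 640])        # chord length, um
counts = np.array([120, 300, 520, 640, 380, 140, 30])   # counts per bin

weights = counts * bins                 # length-weighting of the raw counts
cdf = np.cumsum(weights) / weights.sum()
for p in (0.20, 0.50, 0.80):
    d = np.interp(p, cdf, bins)         # interpolate the percentile on the CDF
    print(f"D{int(p * 100)} = {d:.0f} um")
```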
Higham, Desmond J.; Batty, Michael; Bettencourt, Luís M. A.; Greetham, Danica Vukadinović; Grindrod, Peter
2017-01-01
We introduce the 14 articles in the Royal Society Open Science themed issue on City Analytics. To provide a high-level, strategic, overview, we summarize the topics addressed and the analytical tools deployed. We then give a more detailed account of the individual contributions. Our overall aims are (i) to highlight exciting advances in this emerging, interdisciplinary field, (ii) to encourage further activity and (iii) to emphasize the variety of new, public-domain, datasets that are available to researchers. PMID:28386454
Sigma Metrics Across the Total Testing Process.
Charuruks, Navapun
2017-03-01
Laboratory quality control has been developed over several decades to ensure patient safety, expanding from a statistical quality control focus on the analytical phase to the total laboratory process. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through the defects-per-million count and the sigma metric equation. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Improvement of sigma-scale performance has been shown from our data. New tools and techniques for integration are needed. Copyright © 2016 Elsevier Inc. All rights reserved.
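The sigma metric equation referred to here is conventionally written, for allowable total error TEa and an assay's bias and imprecision (CV) expressed in the same units (typically %), as

```latex
\sigma = \frac{\mathrm{TE}_a - \lvert \mathrm{Bias} \rvert}{\mathrm{CV}}
```

so, for example, TEa = 10%, bias = 2%, and CV = 2% give a sigma of 4; on the sigma scale, six-sigma performance corresponds to roughly 3.4 defects per million opportunities.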
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
Wu, Yubao; Zhu, Xiaofeng; Chen, Jian; Zhang, Xiang
2013-11-01
Epistasis (gene-gene interaction) detection in large-scale genetic association studies has recently drawn extensive research interest, as many complex traits are likely caused by the joint effect of multiple genetic factors. The large number of possible interactions poses both statistical and computational challenges. A variety of approaches have been developed to address the analytical challenges in epistatic interaction detection. These methods usually output the identified genetic interactions and store them in flat file formats. It is highly desirable to develop an effective visualization tool to further investigate the detected interactions and unravel hidden interaction patterns. We have developed EINVis, a novel visualization tool that is specifically designed to analyze and explore genetic interactions. EINVis displays interactions among genetic markers as a network. It utilizes a circular layout (specifically, a tree-ring view) to simultaneously visualize the hierarchical interactions between single nucleotide polymorphisms (SNPs), genes, and chromosomes, and the network structure formed by these interactions. Using EINVis, the user can distinguish marginal effects from interactions, track interactions involving more than two markers, visualize interactions at different levels, and detect proxy SNPs based on linkage disequilibrium. EINVis is an effective and user-friendly free visualization tool for analyzing and exploring genetic interactions. It is publicly available with detailed documentation and an online tutorial at http://filer.case.edu/yxw407/einvis/. © 2013 WILEY PERIODICALS, INC.
Boussès, Christine; Ferey, Ludivine; Vedrines, Elodie; Gaudin, Karen
2015-11-10
An innovative combination of green chemistry and the quality by design (QbD) approach is presented through the development of a UHPLC method for the analysis of the main degradation products of dextromethorphan hydrobromide. The QbD strategy was integrated into the field of green analytical chemistry to improve method understanding while assuring quality and minimizing environmental impact and analyst exposure. The analytical method was thoroughly evaluated by applying risk assessment and multivariate analysis tools. After a scouting phase aimed at selecting a suitable stationary phase and an organic solvent in accordance with green chemistry principles, quality risk assessment tools were applied to determine the critical process parameters (CPPs). The effects of the CPPs on the critical quality attributes (CQAs), i.e., resolutions, efficiencies, and solvent consumption, were further evaluated by means of a screening design. A response surface methodology was then carried out to model the CQAs as functions of the selected CPPs, and the optimal separation conditions were determined through a desirability analysis. The resulting contour plots enabled the design space (DS, the method operable design region), where all CQAs fulfilled the requirements, to be established. An experimental validation of the DS proved that quality within the DS was guaranteed; therefore, no further robustness study was required before validation. Finally, this UHPLC method was validated using the concept of total error and was used to analyze a pharmaceutical drug product. Copyright © 2015 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woronowicz, Michael; Blackmon, Rebecca; Brown, Martin
2014-12-09
The International Space Station program is developing a robotically operated leak locator tool to be used externally. The tool would consist of a Residual Gas Analyzer (RGA) for partial pressure measurements and a full-range pressure gauge for total pressure measurements. The primary application is to demonstrate the ability to detect NH₃ coolant leaks in the ISS thermal control system. An analytical model of leak plume physics is presented that can account for effusive flow as well as plumes produced by sonic orifices and thruster operations. This model is used along with knowledge of typical RGA and full-range gauge performance to analyze the expected instrument sensitivity to ISS leaks of various sizes and relative locations (“directionality”). The paper also presents experimental results of leak simulation testing in a large thermal vacuum chamber at NASA Goddard Space Flight Center. This test characterized instrument sensitivity as a function of leak rates ranging from 1 lbm/yr to about 1 lbm/day. These data may represent the first measurements collected by an RGA or ion gauge system monitoring off-axis point sources as a function of location and orientation. Test results are compared to the analytical model and used to propose strategies for on-orbit leak location and environment characterization using the proposed instrument while taking into account local ISS conditions and the effects of ram/wake flows and structural shadowing within low Earth orbit.
Optical weak measurements without removing the Goos-Hänchen phase
NASA Astrophysics Data System (ADS)
Araújo, Manoel P.; De Leo, Stefano; Maia, Gabriel G.
2018-04-01
Optical weak measurements are a powerful tool for measuring small shifts of optical paths. When applied to the measurement of the Goos-Hänchen shift, in particular, a special step must be added to its protocol: the removal of the relative Goos-Hänchen phase, since its presence generates a destructive influence on the measurement. There is, however, a lack of description in the literature of the precise effect of the Goos-Hänchen phase on weak measurements. In this paper we address this issue, developing an analytic study for a Gaussian beam transmitted through a dielectric structure. We obtain analytic expressions for weak measurements as a function of the relative Goos-Hänchen phase and show how to remove it without the aid of waveplates.
NASA Astrophysics Data System (ADS)
Seo, Joo-Young; Park, Soo-Keun; Kwon, Hoon; Cho, Ki-Sub
2017-10-01
The mechanical properties of ultra-high-strength secondary-hardened stainless steels with varying Co, V, and C contents have been studied. A reduced-Co alloy based on the chemical composition of Ferrium S53 was made by increasing the V and C content. This changed the M2C-strengthened microstructure to an MC plus M2C-strengthened microstructure, and no deteriorative effects were observed for peak-aged and over-aged samples despite the large reduction in Co content from 14 to 7 wt pct. The mechanical properties resulting from the alloying modification were associated with carbide precipitation kinetics, which was clearly outlined by combining analytical tools, including small-angle neutron scattering (SANS) and analytical TEM, with computational simulation.
Patent databases and analytical tools for space technology commercialization (Part 2)
NASA Astrophysics Data System (ADS)
Hulsey, William N., III
2002-07-01
A shift in the space industry has occurred that requires technology developers to understand the basics of intellectual property law. Global harmonization facilitates this understanding, and Internet-based tools enable knowledge of these rights and the facts affecting them.
Geospatial methods and data analysis for assessing distribution of grazing livestock
USDA-ARS?s Scientific Manuscript database
Free-ranging livestock research must begin with a well-conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...
Integration of bus stop counts data with census data for improving bus service.
DOT National Transportation Integrated Search
2016-04-01
This research project produced an open-source transit market data visualization and analysis tool suite, The Bus Transit Market Analyst (BTMA), which contains user-friendly GIS mapping and data analytics tools and state-of-the-art transit demand...
NASA Astrophysics Data System (ADS)
Oshagh, M.; Boisse, I.; Boué, G.; Montalto, M.; Santos, N. C.; Bonfils, X.; Haghighipour, N.
2013-01-01
We present an improved version of SOAP named "SOAP-T", which can generate the radial velocity variations and light curves for systems consisting of a rotating spotted star with a transiting planet. This tool can be used to study the anomalies inside transit light curves and the Rossiter-McLaughlin effect, to better constrain the orbital configuration and properties of planetary systems and the active zones of their host stars. Tests of the code are presented to illustrate its performance and to validate its capability when compared with analytical models and real data. Finally, we apply SOAP-T to the active star, HAT-P-11, observed by the NASA Kepler space telescope and use this system to discuss the capability of this tool in analyzing light curves for the cases where the transiting planet overlaps with the star's spots. The tool's public interface is available at http://www.astro.up.pt/resources/soap-t/
Inducer analysis/pump model development
NASA Astrophysics Data System (ADS)
Cheng, Gary C.
1994-03-01
Current design of high performance turbopumps for rocket engines requires effective and robust analytical tools to provide design information in a productive manner. The main goal of this study was to develop a robust and effective computational fluid dynamics (CFD) pump model for general turbopump design and analysis applications. A finite difference Navier-Stokes flow solver, FDNS, which includes an extended k-epsilon turbulence model and appropriate moving zonal interface boundary conditions, was developed to analyze turbulent flows in turbomachinery devices. In the present study, three key components of the turbopump, the inducer, impeller, and diffuser, were investigated by the proposed pump model, and the numerical results were benchmarked by the experimental data provided by Rocketdyne. For the numerical calculation of inducer flows with tip clearance, the turbulence model and grid spacing are very important. Meanwhile, the development of the cross-stream secondary flow, generated by curved blade passage and the flow through tip leakage, has a strong effect on the inducer flow. Hence, the prediction of the inducer performance critically depends on whether the numerical scheme of the pump model can simulate the secondary flow pattern accurately or not. The impeller and diffuser, however, are dominated by pressure-driven flows such that the effects of turbulence model and grid spacing (except near leading and trailing edges of blades) are less sensitive. The present CFD pump model has been proved to be an efficient and robust analytical tool for pump design due to its very compact numerical structure (requiring small memory), fast turnaround computing time, and versatility for different geometries.
Inducer analysis/pump model development
NASA Technical Reports Server (NTRS)
Cheng, Gary C.
1994-01-01
Current design of high performance turbopumps for rocket engines requires effective and robust analytical tools to provide design information in a productive manner. The main goal of this study was to develop a robust and effective computational fluid dynamics (CFD) pump model for general turbopump design and analysis applications. A finite difference Navier-Stokes flow solver, FDNS, which includes an extended k-epsilon turbulence model and appropriate moving zonal interface boundary conditions, was developed to analyze turbulent flows in turbomachinery devices. In the present study, three key components of the turbopump, the inducer, impeller, and diffuser, were investigated by the proposed pump model, and the numerical results were benchmarked by the experimental data provided by Rocketdyne. For the numerical calculation of inducer flows with tip clearance, the turbulence model and grid spacing are very important. Meanwhile, the development of the cross-stream secondary flow, generated by curved blade passage and the flow through tip leakage, has a strong effect on the inducer flow. Hence, the prediction of the inducer performance critically depends on whether the numerical scheme of the pump model can simulate the secondary flow pattern accurately or not. The impeller and diffuser, however, are dominated by pressure-driven flows such that the effects of turbulence model and grid spacing (except near leading and trailing edges of blades) are less sensitive. The present CFD pump model has been proved to be an efficient and robust analytical tool for pump design due to its very compact numerical structure (requiring small memory), fast turnaround computing time, and versatility for different geometries.
Let's Talk Learning Analytics: A Framework for Implementation in Relation to Student Retention
ERIC Educational Resources Information Center
West, Deborah; Heath, David; Huijser, Henk
2016-01-01
This paper presents a dialogical tool for the advancement of learning analytics implementation for student retention in Higher Education institutions. The framework was developed as an outcome of a project commissioned and funded by the Australian Government's "Office for Learning and Teaching". The project took a mixed-method approach…
Making Sense of Game-Based User Data: Learning Analytics in Applied Games
ERIC Educational Resources Information Center
Steiner, Christina M.; Kickmeier-Rus, Michael D.; Albert, Dietrich
2015-01-01
Digital learning games are useful educational tools with high motivational potential. With the application of games for instruction there comes the need of acknowledging learning game experiences also in the context of educational assessment. Learning analytics provides new opportunities for supporting assessment in and of educational games. We…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Yi; Errichello, Robert
2013-08-29
An analytical model is developed to evaluate the design of a spline coupling. For a given torque and shaft misalignment, the model calculates the number of teeth in contact, tooth loads, stiffnesses, stresses, and safety factors. The analytic model provides essential spline coupling design and modeling information and could be easily integrated into gearbox design and simulation tools.
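A load-sharing calculation of the kind such a model performs can be sketched by letting misalignment open a per-tooth clearance and engaging teeth as the rigid wind-up grows; the stiffness, geometry, and clearance values below are hypothetical and this is not the paper's actual formulation.

```python
import numpy as np

# Torque sharing among spline teeth under misalignment (hypothetical values)
N_TEETH, R = 24, 0.04        # tooth count, pitch radius (m)
K = 2.0e8                    # per-tooth tangential stiffness (N/m)
TORQUE = 500.0               # applied torque (N*m)
angles = 2 * np.pi * np.arange(N_TEETH) / N_TEETH
gap = 20e-6 * (1 + np.cos(angles))  # misalignment-induced clearance per tooth (m)

# Find the rigid wind-up u (m at pitch radius) that balances the applied torque
u = 0.0
for _ in range(200):                        # under-relaxed torque-balance iteration
    load = K * np.clip(u - gap, 0.0, None)  # only teeth past their gap carry load
    u += (TORQUE - R * load.sum()) / (R * K * N_TEETH)

in_contact = int((load > 0).sum())
print(f"teeth in contact: {in_contact}/{N_TEETH}, max tooth load: {load.max():.0f} N")
```

From the converged loads, per-tooth stresses and safety factors would follow from the tooth geometry, which is the step the abstract's model supplies.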
Challenges of Using Learning Analytics Techniques to Support Mobile Learning
ERIC Educational Resources Information Center
Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide
2015-01-01
Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…
A Simplified, General Approach to Simulating from Multivariate Copula Functions
Barry Goodwin
2012-01-01
Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses probability…
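The generic recipe that such approaches simplify can be sketched for a Gaussian copula: draw correlated normals, push them through the normal CDF to get dependent uniforms, then invert arbitrary marginal CDFs. This is the standard construction, not the paper's specific algorithm.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Gaussian copula: correlated normals -> dependent uniforms -> chosen marginals
corr = np.array([[1.0, 0.7],
                 [0.7, 1.0]])
z = rng.multivariate_normal(np.zeros(2), corr, size=10_000)
u = stats.norm.cdf(z)                        # copula sample on [0, 1]^2

x = stats.gamma.ppf(u[:, 0], a=2.0)          # first marginal: gamma
y = stats.lognorm.ppf(u[:, 1], s=0.5)        # second marginal: lognormal

print(f"rank correlation: {stats.spearmanr(x, y)[0]:.3f}")
```

Because the dependence is imposed on the uniforms, any marginal with an invertible CDF can be plugged in without changing the copula step.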
Improvements in analytical methodology have allowed low-level detection of an ever increasing number of pharmaceuticals, personal care products, hormones, pathogens and other contaminants of emerging concern (CECs). The use of these improved analytical tools has allowed researche...
Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.
Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi
2017-05-01
Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.
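A minimal example of the sensitivity analyses advocated here: pool effect sizes by inverse-variance weighting, then recompute the estimate leaving out one study at a time to see how strongly any single (possibly nonindependent) study drives the result. The data are hypothetical.

```python
import numpy as np

# Inverse-variance fixed-effect pooling with leave-one-out sensitivity analysis
effects = np.array([0.30, 0.25, 0.45, 0.10, 0.38])    # hypothetical effect sizes
variances = np.array([0.02, 0.03, 0.015, 0.05, 0.02]) # their sampling variances

def pooled(es, vs):
    w = 1.0 / vs                       # inverse-variance weights
    return (w * es).sum() / w.sum()

overall = pooled(effects, variances)
print(f"pooled effect: {overall:.3f}")
for i in range(effects.size):
    mask = np.arange(effects.size) != i
    print(f"  without study {i + 1}: {pooled(effects[mask], variances[mask]):.3f}")
```

A full analysis would use random-effects models and handle dependent effect sizes explicitly, but even this simple loop exposes estimates that hinge on one study.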
ERIC Educational Resources Information Center
McCurdy, Susan M.; Zegwaard, Karsten E.; Dalgety, Jacinta
2013-01-01
Concept understanding, the development of analytical skills, and a research mindset are explored through the use of academic tools common in tertiary science education and relevant work-integrated learning (WIL) experiences. The use and development of the tools (laboratory book, technical report, and literature review) are examined by way of…
Survey of Network Visualization Tools
2007-12-01
Dimensionality: 2D. Deployment type: components for tool building; standalone tool. OS: Windows. Extensibility: ActiveX, Visual Basic. Interoperability: Daisy is fully compliant with Microsoft's ActiveX; therefore, other Windows-based programs can … other functions that improve analytic decision making. Available in ActiveX, C++, Java, and .NET editions. Tom Sawyer Visualization: Enables you to…
Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design
NASA Technical Reports Server (NTRS)
Wuerer, J. E.; Gran, M.; Held, T. W.
1994-01-01
The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.
Platform for Automated Real-Time High Performance Analytics on Medical Image Data.
Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A
2018-03-01
Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.
Moscow Test Well, INEL Oversight Program: Aqueous geochemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCurry, M.; Fromm, J.; Welhan, J.
1992-09-29
This report presents a summary and interpretation of data gathered during sampling of the Moscow Test Well at Moscow, Idaho during April and May of 1992. The principal objectives of this chemical survey were to validate sampling procedures with a new straddle packer sampling tool in a previously hydrologically well characterized and simple sampling environment, and to compare analytical results from two independent labs for reproducibility. Analytes included a wide range of metals, anions, nutrients, BNAs, and VOCs. Secondary objectives included analyzing waters from a large distilled water tank (utilized for all field laboratory purposes as "pure" stock water), water which passed through a steamer used to clean the packer, and rinsates from the packer tool itself before it was lowered into the test well. Analyses were also obtained of blanks and spikes for data validation purposes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz
This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time-efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
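The PSO layer of such a tool can be sketched independently of the MEC model: particles move through the (stator pole length, magnet length, rotor thickness) space, and a stand-in objective below replaces the MEC torque-density evaluation. Bounds, gains, and the objective surface are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def torque_density(x):
    """Stand-in for the MEC evaluation of a TFM design (hypothetical surface)."""
    pole, magnet, rotor = x.T
    return -((pole - 12.0)**2 + (magnet - 6.0)**2 + 0.5 * (rotor - 9.0)**2)

LO, HI = np.array([5.0, 2.0, 4.0]), np.array([20.0, 10.0, 15.0])  # mm bounds
pos = rng.uniform(LO, HI, (30, 3))          # 30 particles, 3 design variables
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), torque_density(pos)

for _ in range(100):
    gbest = pbest[pbest_f.argmax()]         # swarm-best design so far
    vel = (0.7 * vel                        # inertia
           + 1.5 * rng.random((30, 1)) * (pbest - pos)   # cognitive pull
           + 1.5 * rng.random((30, 1)) * (gbest - pos))  # social pull
    pos = np.clip(pos + vel, LO, HI)        # enforce geometric bounds
    f = torque_density(pos)
    better = f > pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]

print("best design (pole, magnet, rotor) mm:", np.round(pbest[pbest_f.argmax()], 2))
```

In the actual tool, each objective call would run the MEC network solve, which is where the claimed time savings over FEA-in-the-loop optimization come from.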
MASS SPECTROMETRY-BASED METABOLOMICS
Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.
2007-01-01
This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and mass spectrometry in particular has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475
Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.
Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif
2014-12-01
The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low vacuum SEM. It provides multiscale and multimodal analyses as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy) as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in a SEM can be influenced by several SEM-related phenomena, this paper firstly presents a comparison of this new tool with a conventional micro-Raman spectrometer. Then, some possible artefacts are documented, which are due to the impact of electron beam-induced contamination or cathodoluminescence contribution to the Raman spectra, especially with geological samples. These effects are easily overcome by changing or adapting the Raman spectrometer and the SEM settings and methodology. The deletion of the adverse effect of cathodoluminescence is solved by using a SEM beam shutter during Raman acquisition. In contrast, this interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In a second part, this study highlights the interest and efficiency of the coupling in characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in geosciences. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Li, Minmin; Dai, Chao; Wang, Fengzhong; Kong, Zhiqiang; He, Yan; Huang, Ya Tao; Fan, Bei
2017-02-01
An effective analysis method was developed based on a chemometric tool for the simultaneous quantification of five different post-harvest pesticides (2,4-dichlorophenoxyacetic acid (2,4-D), carbendazim, thiabendazole, iprodione, and prochloraz) in fruits and vegetables. In the modified QuEChERS (quick, easy, cheap, effective, rugged and safe) method, the factors and responses for optimization of the extraction and cleanup analyses were compared using the Plackett-Burman (P-B) screening design. Furthermore, the significant factors (toluene percentage, hydrochloric acid (HCl) percentage, and graphitized carbon black (GCB) amount) were optimized using a central composite design (CCD) combined with Derringer’s desirability function (DF). The limits of quantification (LOQs) were estimated to be 1.0 μg/kg for 2,4-D, carbendazim, thiabendazole, and prochloraz, and 1.5 μg/kg for iprodione in food matrices. The mean recoveries were in the range of 70.4-113.9% with relative standard deviations (RSDs) of less than 16.9% at three spiking levels. The measurement uncertainty of the analytical method was determined using the bottom-up approach, which yielded an average value of 7.6%. Carbendazim was most frequently found in real samples analyzed using the developed method. Consequently, the analytical method can serve as an advantageous and rapid tool for determination of five preservative pesticides in fruits and vegetables.
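Derringer's desirability function, used in studies like the one above, combines several responses by mapping each onto [0, 1] and taking a (possibly weighted) geometric mean; the sketch below uses hypothetical targets for a recovery to maximize and a solvent volume to minimize, not the paper's actual responses.

```python
import numpy as np

def d_larger_is_better(y, lo, hi, s=1.0):
    """Derringer desirability for a response to maximize (e.g. recovery)."""
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0) ** s

def d_smaller_is_better(y, lo, hi, s=1.0):
    """Desirability for a response to minimize (e.g. solvent consumption)."""
    return np.clip((hi - y) / (hi - lo), 0.0, 1.0) ** s

# Hypothetical predicted responses at one candidate operating point
recovery_pct, solvent_ml = 92.0, 8.5
D = (d_larger_is_better(recovery_pct, 70.0, 110.0)
     * d_smaller_is_better(solvent_ml, 5.0, 12.0)) ** 0.5   # geometric mean
print(f"overall desirability D = {D:.2f}")
```

Maximizing D over the response-surface model is what selects the optimal extraction and cleanup settings in a CCD-plus-DF workflow of this kind.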
Li, Minmin; Dai, Chao; Wang, Fengzhong; Kong, Zhiqiang; He, Yan; Huang, Ya Tao; Fan, Bei
2017-01-01
An effective analysis method was developed based on a chemometric tool for the simultaneous quantification of five different post-harvest pesticides (2,4-dichlorophenoxyacetic acid (2,4-D), carbendazim, thiabendazole, iprodione, and prochloraz) in fruits and vegetables. In the modified QuEChERS (quick, easy, cheap, effective, rugged and safe) method, the factors and responses for optimization of the extraction and cleanup analyses were compared using the Plackett–Burman (P–B) screening design. Furthermore, the significant factors (toluene percentage, hydrochloric acid (HCl) percentage, and graphitized carbon black (GCB) amount) were optimized using a central composite design (CCD) combined with Derringer’s desirability function (DF). The limits of quantification (LOQs) were estimated to be 1.0 μg/kg for 2,4-D, carbendazim, thiabendazole, and prochloraz, and 1.5 μg/kg for iprodione in food matrices. The mean recoveries were in the range of 70.4–113.9% with relative standard deviations (RSDs) of less than 16.9% at three spiking levels. The measurement uncertainty of the analytical method was determined using the bottom-up approach, which yielded an average value of 7.6%. Carbendazim was most frequently found in real samples analyzed using the developed method. Consequently, the analytical method can serve as an advantageous and rapid tool for determination of five preservative pesticides in fruits and vegetables. PMID:28225030
Li, Minmin; Dai, Chao; Wang, Fengzhong; Kong, Zhiqiang; He, Yan; Huang, Ya Tao; Fan, Bei
2017-02-22
An effective analysis method was developed based on a chemometric tool for the simultaneous quantification of five different post-harvest pesticides (2,4-dichlorophenoxyacetic acid (2,4-D), carbendazim, thiabendazole, iprodione, and prochloraz) in fruits and vegetables. In the modified QuEChERS (quick, easy, cheap, effective, rugged and safe) method, the factors and responses for optimization of the extraction and cleanup analyses were compared using the Plackett-Burman (P-B) screening design. Furthermore, the significant factors (toluene percentage, hydrochloric acid (HCl) percentage, and graphitized carbon black (GCB) amount) were optimized using a central composite design (CCD) combined with Derringer's desirability function (DF). The limits of quantification (LOQs) were estimated to be 1.0 μg/kg for 2,4-D, carbendazim, thiabendazole, and prochloraz, and 1.5 μg/kg for iprodione in food matrices. The mean recoveries were in the range of 70.4-113.9% with relative standard deviations (RSDs) of less than 16.9% at three spiking levels. The measurement uncertainty of the analytical method was determined using the bottom-up approach, which yielded an average value of 7.6%. Carbendazim was most frequently found in real samples analyzed using the developed method. Consequently, the analytical method can serve as an advantageous and rapid tool for determination of five preservative pesticides in fruits and vegetables.
Prabhu, Gurpur Rakesh D; Witek, Henryk A; Urban, Pawel L
2018-05-31
Most analytical methods are based on "analogue" inputs from sensors of light, electric potentials, or currents. The signals obtained by such sensors are processed using certain calibration functions to determine concentrations of the target analytes. The signal readouts are normally done after an optimised and fixed time period, during which an assay mixture is incubated. This minireview covers another, somewhat unusual, analytical strategy, which relies on the measurement of the time interval between the occurrences of two distinguishable states in the assay reaction. These states manifest themselves via abrupt changes in the properties of the assay mixture (e.g. change of colour, appearance or disappearance of luminescence, change in pH, variations in optical activity or mechanical properties). In some cases, a correlation between the time of appearance/disappearance of a given property and the analyte concentration can also be observed. An example of an assay based on time measurement is an oscillating reaction in which the period of oscillations is linked to the concentration of the target analyte. A number of chemo-chronometric assays, relying on existing (bio)transformations or artificially designed reactions, were disclosed in the past few years. They are very attractive from the fundamental point of view, but so far only a few of them have been validated and used to address real-world problems. Can chemo-chronometric assays, then, become a practical tool for chemical analysis? Is there a need for further development of such assays? We aim to answer these questions.
Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas
2018-04-03
Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. The chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times for inducing changes in water content. Analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is inherently a most promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.
Assessment of SOAP note evaluation tools in colleges and schools of pharmacy.
Sando, Karen R; Skoy, Elizabeth; Bradley, Courtney; Frenzel, Jeanne; Kirwin, Jennifer; Urteaga, Elizabeth
2017-07-01
To describe current methods used to assess SOAP notes in colleges and schools of pharmacy. Members of the American Association of Colleges of Pharmacy Laboratory Instructors Special Interest Group were invited to share assessment tools for SOAP notes. The content of submissions was evaluated to characterize overall qualities and how the tools assessed subjective, objective, assessment, and plan information. Thirty-nine assessment tools from 25 schools were evaluated. Twenty-nine (74%) of the tools were rubrics and ten (26%) were checklists. All rubrics included analytic scoring elements, while two (7%) mixed holistic and analytic scoring elements. The most common rating scale among the rubrics (35%) was a four-item scale. Substantial variability existed in how tools evaluated the subjective and objective sections. All tools included problem identification in the assessment section; other assessment items included goals (82%) and rationale (69%). Seventy-seven percent assessed drug therapy, but only 33% assessed non-drug therapy; other plan items included education (59%) and follow-up (90%). There is a great deal of variation in the specific elements used to evaluate SOAP notes in colleges and schools of pharmacy. Improved consistency in assessment methods may better prepare students to produce standardized documentation when entering practice. Copyright © 2017 Elsevier Inc. All rights reserved.
ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress
NASA Technical Reports Server (NTRS)
Kempler, Steven
2015-01-01
The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; to facilitate collaborations to better understand the cross usage of heterogeneous datasets and to provide accommodating data-analytics expertise, now and as the needs evolve into the future; and to identify gaps that, once filled, will further collaborative activities. Objectives: provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth science data analytics; bring in guest speakers to describe external efforts and teach us more about the broader use of data analytics; perform activities that compile use cases generated from specific community needs to cross-analyze heterogeneous data, compile sources of analytics tools (in particular, tools that satisfy the needs of the above data users), examine gaps between needs and sources, examine gaps between needs and community expertise, and document the specific data-analytics expertise needed to perform Earth science data analytics; and seek graduate data-science student internship opportunities in data analytics.
NASA Astrophysics Data System (ADS)
Johnson, S. P.; Rohrer, M. E.
2017-12-01
The application of scientific research pertaining to satellite imaging and data processing has facilitated the development of dynamic methodologies and tools that utilize nanosatellites and analytical platforms to address the increasing scope, scale, and intensity of emerging environmental threats to national security. While the use of remotely sensed data to monitor the environment at local and global scales is not a novel proposition, the application of advances in nanosatellites and analytical platforms are capable of overcoming the data availability and accessibility barriers that have historically impeded the timely detection, identification, and monitoring of these stressors. Commercial and university-based applications of these technologies were used to identify and evaluate their capacity as security-motivated environmental monitoring tools. Presently, nanosatellites can provide consumers with 1-meter resolution imaging, frequent revisits, and customizable tasking, allowing users to define an appropriate temporal scale for high resolution data collection that meets their operational needs. Analytical platforms are capable of ingesting increasingly large and diverse volumes of data, delivering complex analyses in the form of interpretation-ready data products and solutions. The synchronous advancement of these technologies creates the capability of analytical platforms to deliver interpretable products from persistently collected high-resolution data that meet varying temporal and geographic scale requirements. In terms of emerging environmental threats, these advances translate into customizable and flexible tools that can respond to and accommodate the evolving nature of environmental stressors. This presentation will demonstrate the capability of nanosatellites and analytical platforms to provide timely, relevant, and actionable information that enables environmental analysts and stakeholders to make informed decisions regarding the prevention, intervention, and prediction of emerging environmental threats.
Analytical Tools Interface for Landscape Assessments (ATtILA) ArcView Extension
Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape metrics, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, waters...
DOT National Transportation Integrated Search
2001-04-01
Air quality has become one of the important factors to be considered in making transportation improvement decisions. Thus, tools are expected to help such decision-making. On the other hand, the MOBILE5 model, which has been widely used in evaluatin...
Hanson, Marta
2017-09-01
Argument This article analyzes for the first time the earliest western maps of diseases in China spanning fifty years from the late 1870s to the end of the 1920s. The 24 featured disease maps present a visual history of the major transformations in modern medicine from medical geography to laboratory medicine wrought on Chinese soil. These medical transformations occurred within new political formations from the Qing dynasty (1644-1911) to colonialism in East Asia (Hong Kong, Taiwan, Manchuria, Korea) and hypercolonialism within China (Tianjin, Shanghai, Amoy) as well as the new Republican Chinese nation state (1912-49). As a subgenre of persuasive graphics, physicians marshaled disease maps for various rhetorical functions within these different political contexts. Disease maps in China changed from being mostly analytical tools to functioning as tools of empire, national sovereignty, and public health propaganda legitimating new medical concepts, public health interventions, and political structures governing over human and non-human populations.
The National energy modeling system
NASA Astrophysics Data System (ADS)
The DOE uses a variety of energy and economic models to forecast energy supply and demand. It also uses a variety of more narrowly focussed analytical tools to examine energy policy options. For the purpose of the scope of this work, this set of models and analytical tools is called the National Energy Modeling System (NEMS). The NEMS is the result of many years of development of energy modeling and analysis tools, many of which were developed for different applications and under different assumptions. As such, NEMS is believed to be less than satisfactory in certain areas. For example, NEMS is difficult to keep updated and expensive to use. Various outputs are often difficult to reconcile. Products were not required to interface, but were designed to stand alone. Because different developers were involved, the inner workings of the NEMS are often not easily or fully understood. Even with these difficulties, however, NEMS comprises the best tools currently identified to deal with our global, national and regional energy modeling, and energy analysis needs.
Analytical technique to address terrorist threats by chemical weapons of mass destruction
NASA Astrophysics Data System (ADS)
Dempsey, Patrick M.
1997-01-01
Terrorism is no longer an issue without effect on the American mind. We now live with the same concerns and fears that have long been commonplace in other developed and third-world countries; citizens of other nations have long lived with the specter of terrorism, and the U.S. now needs to be concerned about and prepared for terrorist activities. The terrorist has the ability to cause great destructive effects by focusing effort on unaware and unprepared civilian populations. Attacks can range from simple explosives to sophisticated nuclear, chemical, and biological weapons. Intentional releases of hazardous chemicals or chemical warfare agents pose a great threat because of their ready availability and/or ease of production, and their ability to cause widespread damage. As this battlefront changes from defined conflicts and enemies to unnamed terrorists, we must implement the proper analytical tools to provide a fast and efficient response. Each chemical used in a terrorist weapon leaves behind a chemical signature that can be used to identify the materials involved and possibly lead investigators to the source and to those responsible. New tools that provide fast and accurate detection of battlefield chemical and biological agent attacks are emerging. Gas chromatography/mass spectrometry (GC/MS) is one of these tools and has found increasing use by the military in responding to chemical agent attacks. As the technology becomes smaller and more portable, it can be used by law enforcement personnel to identify suspected terrorist releases and to help prepare the response: defining contaminated areas for evacuation and safety, identifying the proper treatment of exposed or affected civilians, and suggesting decontamination and cleanup procedures.
NASA Astrophysics Data System (ADS)
Majumdar, Ankush; Hazra, Tumpa; Dutta, Amit
2017-09-01
This work presents a multi-criteria decision making (MCDM) tool to select a landfill site from three candidate sites proposed for the Kolkata Municipal Corporation (KMC) area that complies with accessibility, receptor, environmental, public-acceptability, geological, and economic criteria. The Analytical Hierarchy Process has been used to solve the MCDM problem. The suitability of the three sites (viz. Natagachi, Gangajoara, and Kharamba) as landfills, as proposed by KMC, has been checked using a Landfill Site Sensitivity Index (LSSI) as well as an Economic Viability Index (EVI). The availability of land area for disposing of the large quantity of municipal solid waste generated over the design period has also been checked. Analysis shows that the studied sites are moderately suitable for landfill construction, as both the LSSI and EVI scores lie between 300 and 750. The proposed approach represents an effective MCDM tool for siting sanitary landfills in the growing metropolitan cities of developing countries like India.
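Since the abstract does not reproduce the AHP arithmetic, the following sketch shows the standard computation behind AHP priority weights: the principal eigenvector of a pairwise comparison matrix, with Saaty's consistency check. The comparison values are hypothetical, not the paper's.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights = principal eigenvector of the pairwise comparison
    matrix; the consistency ratio (CR) should stay below ~0.1 (Saaty)."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ci = (eigvals[k].real - n) / (n - 1)        # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # Saaty's random indices
    return w, ci / ri

# Illustrative (not the paper's) comparison of three criteria,
# e.g. receptor vs. environment vs. economics:
A = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)
```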
Increasing Access and Usability of Remote Sensing Data: The NASA Protected Area Archive
NASA Technical Reports Server (NTRS)
Geller, Gary N.
2004-01-01
Although remote sensing data are now widely available, much of it at low or no cost, many managers of protected conservation areas do not have the expertise or tools to view or analyze it; access to it by the protected area management community is thus effectively blocked. The Protected Area Archive will increase access to remote sensing data by creating collections of satellite images of protected areas and packaging them with simple-to-use visualization and analytical tools. The user can easily locate the area and image of interest on a map, then display, roam, and zoom the image. A set of simple tools will be provided so the user can explore the data and employ it to assist in the management and monitoring of their area. The 'Phase 1' version requires only a Windows-based computer and basic computer skills, and may be of particular help to protected area managers in developing countries.
Cause-and-effect mapping of critical events.
Graves, Krisanne; Simmons, Debora; Galley, Mark D
2010-06-01
Health care errors are routinely reported in the scientific and public press and have become a major concern for most Americans. In learning to identify and analyze errors, health care can develop some of the skills of a learning organization, including systems thinking. Experts in quality improvement have been working in other high-risk industries since the 1920s, making structured organizational changes through various quality frameworks, including continuous quality improvement and total quality management. When using these tools, it is important to understand systems thinking and the concept of processes within an organization. Within these improvement frameworks, several tools can be used in the analysis of errors. This article introduces a robust tool with a broad analytical view consistent with systems thinking, called Cause Mapping (ThinkReliability, Houston, TX, USA), which can be used to systematically analyze the process and the problem at the same time. Copyright 2010 Elsevier Inc. All rights reserved.
Impact and Penetration Simulations for Composite Wing-like Structures
NASA Technical Reports Server (NTRS)
Knight, Norman F.
1998-01-01
The goal of this research project was to develop methodologies for the analysis of wing-like structures subjected to impact loadings. Both low-speed impact, causing no damage or only minimal damage, and high-speed impact, causing severe laminate damage and possible penetration of the structure, were to be considered during this research effort. To address this goal, an assessment of current analytical tools for impact analysis was performed. The tools were assessed for accuracy, modeling capability, and damage modeling, as well as for robustness, efficiency, and usability in a wing-design environment. Following the qualitative assessment, selected quantitative evaluations were to be performed using the leading simulation tools. Based on this assessment, future research thrusts for impact and penetration simulation of composite wing-like structures were identified.
Shouting in the Jungle - The SETI Transmission Debate
NASA Astrophysics Data System (ADS)
Schuch, H. P.; Almar, I.
The prudence of transmitting deliberate messages from Earth into interstellar space remains controversial. Reasoned risk-benefit analysis is needed to inform policy recommendations by such bodies as the International Academy of Astronautics SETI Permanent Study Group. As a first step, at the 2005 International Astronautical Congress in Fukuoka, we discussed the San Marino Scale, a new analytical tool for assessing transmission risk. That scale was updated, and a revised version was presented at the 2006 IAC in Valencia. We are now in a position to recommend specific improvements to the scale we proposed for quantifying terrestrial transmissions. Our intent is to make this tool better reflect the detectability and potential impact of recent and proposed messages beamed from Earth. We believe the changes proposed herein strengthen the San Marino Scale as an analytical tool and bring us closer to its eventual adoption.
Kamel Boulos, Maged N; Sanfilippo, Antonio P; Corley, Courtney D; Wheeler, Steve
2010-10-01
This paper explores Technosocial Predictive Analytics (TPA) and related methods for Web "data mining" in which users' posts and queries are garnered from Social Web ("Web 2.0") tools such as blogs, micro-blogging and social networking sites to form coherent representations of real-time health events. The paper includes a brief introduction to commonly used Social Web tools such as mashups and aggregators, and maps their exponential growth as an open architecture of participation for the masses and an emerging way to gain insight into the collective health status of whole populations. Several health-related tool examples are described and demonstrated as practical means through which health professionals might create clear, location-specific pictures of epidemiological data such as flu outbreaks. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
Analytical quality by design: a tool for regulatory flexibility and robust analytics.
Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy
2015-01-01
Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality-by-design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within that region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and process analytical technology (PAT).
Analytical and Experimental Investigation of Process Loads on Incremental Severe Plastic Deformation
NASA Astrophysics Data System (ADS)
Okan Görtan, Mehmet
2017-05-01
From a processing point of view, friction is a major problem in severe plastic deformation (SPD) using the equal channel angular pressing (ECAP) process. Incremental ECAP can be used to optimize frictional effects during SPD, and a new incremental ECAP process has recently been proposed. This new process, called equal channel angular swaging (ECAS), combines conventional ECAP with rotary swaging, an incremental bulk metal forming method. The ECAS tool system consists of two dies with an angled channel that contains two shear zones. During the ECAS process, two forming tool halves, arranged concentrically around the workpiece, perform high-frequency radial movements with short strokes while samples are pushed through them. The oscillation direction nearly coincides with the shearing direction in the workpiece. The most important advantages compared with conventional ECAP are a significant reduction in the forces in the material feeding direction and the potential for extension to continuous processing. In the current study, the mechanics of the ECAS process are investigated using a slip-line field approach. An analytical model is developed to predict process loads. The proposed model is validated against experiments and FE simulations.
A graphical approach to radio frequency quadrupole design
NASA Astrophysics Data System (ADS)
Turemen, G.; Unel, G.; Yasatekin, B.
2015-07-01
The design of a radio frequency quadrupole (RFQ), an important section of all ion accelerators, and the calculation of its beam dynamics properties can be achieved using existing computational tools. These programs, originally designed in the 1980s, show the effects of aging in their user interfaces and output. The authors believe there is room for improvement, both in design techniques using a graphical approach and in the amount of analytical calculation performed before turning to CPU-intensive finite element analysis. Additionally, an emphasis on graphically controlling the evolution of the relevant parameters using a drag-to-change paradigm is bound to benefit the designer. A computer code named DEMIRCI has been written in C++ to demonstrate these ideas. The tool has been used in the design of the Turkish Atomic Energy Authority (TAEK)'s 1.5 MeV proton beamline at the Saraykoy Nuclear Research and Training Center (SANAEM). DEMIRCI starts with a simple analytical model, calculates the RFQ behavior, and produces 3D design files that can be fed to a milling machine. The paper discusses the experience gained during the design of the SANAEM Project Prometheus (SPP) RFQ and underlines some of DEMIRCI's capabilities.
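The abstract does not spell out the simple analytical model. For orientation, analytical RFQ design codes conventionally start from the standard two-term (Kapchinsky-Teplyakov) potential; the form below is the textbook result, quoted here as an assumption rather than from the paper:

```latex
U(r,\psi,z) = \frac{V}{2}\left[\, X \left(\frac{r}{r_0}\right)^{2} \cos 2\psi
  \;+\; A\, I_0(kr)\, \cos kz \,\right],
\qquad k = \frac{2\pi}{\beta\lambda},
```

where $V$ is the inter-vane voltage, $r_0$ the mean aperture, and $I_0$ the modified Bessel function. The focusing and acceleration efficiencies satisfy $X = 1 - A\,I_0(ka)$ with $A = (m^2 - 1)/\big(m^2 I_0(ka) + I_0(mka)\big)$ for vane modulation $m$ and minimum aperture $a$.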
Analytical tools and isolation of TOF events
NASA Technical Reports Server (NTRS)
Wolf, H.
1974-01-01
Analytical tools are presented in two reports. The first is a probability analysis of the orbital distribution of events in relation to the dust flux density observed in the Pioneer 8 and 9 distributions. A distinction is drawn between asymmetries caused by random fluctuations and systematic variations by calculating the probability of any particular asymmetry. The second report discusses particle trajectories in a repulsive force field. The force on a particle due to solar radiation pressure is directed along the particle's radius vector from the sun and is inversely proportional to the square of its distance from the sun. Equations of motion that describe both solar radiation pressure and gravitational attraction are presented.
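The abstract does not reproduce those equations. In the standard formulation (an assumption here, not a quotation from the report), both solar gravity and radiation pressure fall off as the inverse square of heliocentric distance, so radiation pressure simply rescales the gravitational attraction:

```latex
\ddot{\mathbf{r}} \;=\; -\,\frac{G M_{\odot}\,(1-\beta)}{r^{3}}\,\mathbf{r},
\qquad
\beta \;=\; \frac{F_{\mathrm{rad}}}{F_{\mathrm{grav}}},
```

where $\beta$ is the ratio of radiation-pressure force to gravitational force for the grain. For sufficiently small grains $\beta > 1$, the net force is repulsive and the trajectories become hyperbolic.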
RNA "traffic lights": an analytical tool to monitor siRNA integrity.
Holzhauser, Carolin; Liebl, Renate; Goepferich, Achim; Wagenknecht, Hans-Achim; Breunig, Miriam
2013-05-17
The combination of thiazole orange and thiazole red as an internal energy transfer-based fluorophore pair in oligonucleotides provides an outstanding analytical tool to follow DNA/RNA hybridization through a distinct fluorescence color change from red to green. Herein, we demonstrate that this concept can be applied to small interfering RNA (siRNA) to monitor RNA integrity in living cells in real time with a remarkable dynamic range and excellent contrast ratios in cellular media. Furthermore, we show that our siRNA-sensors still possess their gene silencing function toward the knockdown of enhanced green fluorescent protein in CHO-K1 cells.
T.Rex Visual Analytics for Transactional Exploration
2018-01-16
T.Rex is PNNL's visual analytics tool that specializes in tabular structured data, like you might open with Excel. It's a client-server application, allowing the server to do a lot of the heavy lifting and the client to open spreadsheets with millions of rows. With datasets of that size, especially if you're unfamiliar with the contents, it's very hard to get a good grasp of what's in it using traditional tools. With T.Rex, the multiple views allow you to see categorical, temporal, numerical, relational, and summary data. The interactivity lets you look across your data and see how things relate to each other.
Energy geotechnics: Advances in subsurface energy recovery, storage, exchange, and waste management
McCartney, John S.; Sanchez, Marcelo; Tomac, Ingrid
2016-02-17
Energy geotechnics involves the use of geotechnical principles to understand and engineer the coupled thermo-hydro-chemo-mechanical processes encountered in collecting, exchanging, storing, and protecting energy resources in the subsurface. In addition to research on these fundamental coupled processes and characterization of relevant material properties, applied research is being performed to develop analytical tools for the design and analysis of different geo-energy applications. In conclusion, the aims of this paper are to discuss the fundamental physics and constitutive models that are common to these different applications, and to summarize recent advances in the development of relevant analytical tools.
Determining GPS average performance metrics
NASA Technical Reports Server (NTRS)
Moore, G. V.
1995-01-01
Analytic and semi-analytic methods are used to show that users of the GPS constellation can expect performance variations based on their location. Specifically, performance is shown to be a function of both altitude and latitude. These results stem from the fact that the GPS constellation is itself non-uniform. For example, GPS satellites are over four times as likely to be directly over Tierra del Fuego as over Hawaii or Singapore. Inevitable performance variations due to user location occur for ground, sea, air, and space GPS users. These performance variations can be studied in an average, relative sense. A semi-analytic tool that symmetrically allocates GPS satellite latitude-belt dwell times among longitude points is used to compute average performance metrics. These metrics include the average number of GPS vehicles visible; relative average accuracies in the radial, in-track, and cross-track (or radial, north/south, east/west) directions; and relative average PDOP or GDOP. The tool can be quickly changed to incorporate various user antenna obscuration models and various GPS constellation designs. Among other applications, tool results can be used in studies to predict locations and geometries of best/worst-case performance, design GPS constellations, determine optimal user antenna locations, and understand performance trends among various users.
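The latitude-belt dwell times driving this non-uniformity follow from simple orbit geometry. The sketch below illustrates the standard result for a circular inclined orbit, not the paper's actual tool, and the site latitudes are rounded assumptions.

```python
import numpy as np

def latitude_density(phi_deg, incl_deg=55.0):
    """Probability density of the sub-satellite latitude for a circular
    orbit: sin(lat) = sin(i) * sin(u), where u is the argument of latitude."""
    phi, i = np.radians(phi_deg), np.radians(incl_deg)
    return np.cos(phi) / (np.pi * np.sqrt(np.sin(i)**2 - np.sin(phi)**2))

# GPS satellites (i = 55 deg) linger near the latitude extremes of their
# ground tracks, so sites near 55 deg see them overhead far more often.
ratio = latitude_density(54.0) / latitude_density(21.0)  # Tierra del Fuego vs Hawaii
print(round(ratio, 1))  # ~3.6, rising past 4 for latitudes closer to 55 deg
```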
Analysis of laparoscopy in trauma.
Villavicencio, R T; Aucar, J A
1999-07-01
The optimum roles for laparoscopy in trauma have yet to be established. To date, reviews of laparoscopy in trauma have been primarily descriptive rather than analytic. This article analyzes the results of laparoscopy in trauma. Outcome analysis was done by reviewing 37 studies with more than 1,900 trauma patients, and laparoscopy was analyzed as a screening, diagnostic, or therapeutic tool. Laparoscopy was regarded as a screening tool if it was used to detect or exclude a positive finding (eg, hemoperitoneum, organ injury, gastrointestinal spillage, peritoneal penetration) that required operative exploration or repair. Laparoscopy was regarded as a diagnostic tool when it was used to identify all injuries, rather than as a screening tool to identify the first indication for a laparotomy. It was regarded as a diagnostic tool only in studies that mandated a laparotomy (gold standard) after laparoscopy to confirm the diagnostic accuracy of laparoscopic findings. Costs and charges for using laparoscopy in trauma were analyzed when feasible. As a screening tool, laparoscopy missed 1% of injuries and helped prevent 63% of patients from having a trauma laparotomy. When used as a diagnostic tool, laparoscopy had a 41% to 77% missed injury rate per patient. Overall, laparoscopy carried a 1% procedure-related complication rate. Cost-effectiveness has not been uniformly proved in studies comparing laparoscopy and laparotomy. Laparoscopy has been applied safely and effectively as a screening tool in stable patients with acute trauma. Because of the large number of missed injuries when used as a diagnostic tool, its value in this context is limited. Laparoscopy has been reported infrequently as a therapeutic tool in selected patients, and its use in this context requires further study.
Koch, Stefan; Bueschl, Christoph; Doppler, Maria; Simader, Alexandra; Meng-Reiterer, Jacqueline; Lemmens, Marc; Schuhmacher, Rainer
2016-01-01
Due to its unsurpassed sensitivity and selectivity, LC-HRMS is one of the major analytical techniques in metabolomics research. However, limited stability of experimental and instrument parameters may cause shifts and drifts of retention time and mass accuracy or the formation of different ion species, thus complicating conclusive interpretation of the raw data, especially when generated in different analytical batches. Here, a novel software tool for the semi-automated alignment of different measurement sequences is presented. The tool is implemented in the Java programming language, it features an intuitive user interface and its main goal is to facilitate the comparison of data obtained from different metabolomics experiments. Based on a feature list (i.e., processed LC-HRMS chromatograms with mass-to-charge ratio (m/z) values and retention times) that serves as a reference, the tool recognizes both m/z and retention time shifts of single or multiple analytical datafiles/batches of interest. MetMatch is also designed to account for differently formed ion species of detected metabolites. Corresponding ions and metabolites are matched and chromatographic peak areas, m/z values and retention times are combined into a single data matrix. The convenient user interface allows for easy manipulation of processing results and graphical illustration of the raw data as well as the automatically matched ions and metabolites. The software tool is exemplified with LC-HRMS data from untargeted metabolomics experiments investigating phenylalanine-derived metabolites in wheat and T-2 toxin/HT-2 toxin detoxification products in barley. PMID:27827849
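As a rough illustration of the kind of matching MetMatch automates, the sketch below pairs features across two runs by an m/z tolerance (in ppm) and a retention-time window. The tolerances, example feature lists, and the greedy nearest-RT rule are illustrative assumptions, not MetMatch's actual algorithm.

```python
def match_features(reference, sample, ppm_tol=10.0, rt_tol=0.5):
    """Pair (m/z, RT) features of one run against a reference feature list.
    A feature matches if its m/z lies within ppm_tol and its retention
    time within rt_tol minutes; the nearest-RT candidate wins."""
    matches = []
    for i, (mz_ref, rt_ref) in enumerate(reference):
        mz_window = mz_ref * ppm_tol * 1e-6  # ppm tolerance scales with m/z
        candidates = [(abs(rt - rt_ref), j)
                      for j, (mz, rt) in enumerate(sample)
                      if abs(mz - mz_ref) <= mz_window and abs(rt - rt_ref) <= rt_tol]
        if candidates:
            matches.append((i, min(candidates)[1]))
    return matches

# Two illustrative feature lists: (m/z, retention time in minutes).
ref = [(166.0863, 4.2), (295.1034, 7.8)]
run = [(166.0871, 4.3), (295.1050, 7.7), (381.2001, 9.1)]
print(match_features(ref, run))  # [(0, 0), (1, 1)]
```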
On Establishing Big Data Wave Breakwaters with Analytics (Invited)
NASA Astrophysics Data System (ADS)
Riedel, M.
2013-12-01
The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new database approaches designed to support unstructured or semi-structured data as opposed to traditional structured databases (e.g. relational databases). More recently, data analysis and the analytics frameworks underpinning it also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide the pieces of information needed to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and frame of reference for a systematic approach. The talk will provide insights into big data analytics methods within various scientific communities and offer different views of how correlation-based and causality-based approaches provide complementary methods for advancing science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible but also scientifically feasible; its lighthouse goal is a classification of clever combinations of technologies and scientific applications that yields clear recommendations to the scientific community on which approaches are technically and scientifically feasible.
Preliminary design methods for fiber reinforced composite structures employing a personal computer
NASA Technical Reports Server (NTRS)
Eastlake, C. N.
1986-01-01
The objective of this project was to develop a user-friendly, interactive computer program to be used as an analytical tool by structural designers. Its intent was to perform preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach was to provide a subroutine that uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines that employ the familiar basic structural analysis methods for isotropic materials. The method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool whose results will subsequently be verified by more sophisticated analysis. Additional subroutines have been provided to calculate the laminate coefficient of thermal expansion and ply-by-ply strains within a laminate.
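A minimal sketch of the classical-lamination-theory step the abstract describes, computing an effective in-plane modulus from the laminate A matrix (valid for the symmetric layups assumed here). This is a generic CLT implementation, not the program described, and the material values in the usage lines are placeholders.

```python
import numpy as np

def q_matrix(E1, E2, G12, v12):
    """Reduced stiffness matrix [Q] for one orthotropic ply (plane stress)."""
    v21 = v12 * E2 / E1
    denom = 1.0 - v12 * v21
    return np.array([[E1 / denom, v12 * E2 / denom, 0.0],
                     [v12 * E2 / denom, E2 / denom, 0.0],
                     [0.0, 0.0, G12]])

def qbar(Q, theta_deg):
    """Transform [Q] to laminate axes for a ply at angle theta: T^-1 Q R T R^-1."""
    c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
    T = np.array([[c*c, s*s, 2*c*s],
                  [s*s, c*c, -2*c*s],
                  [-c*s, c*s, c*c - s*s]])
    R = np.diag([1.0, 1.0, 2.0])  # Reuter matrix (engineering shear strain)
    return np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)

def effective_ex(plies, t_ply, E1, E2, G12, v12):
    """Effective in-plane modulus Ex of a symmetric laminate via the A matrix."""
    Q = q_matrix(E1, E2, G12, v12)
    A = sum(qbar(Q, th) for th in plies) * t_ply  # in-plane stiffness matrix
    a = np.linalg.inv(A)                          # in-plane compliance
    h = t_ply * len(plies)                        # total laminate thickness
    return 1.0 / (h * a[0, 0])

# e.g. a quasi-isotropic layup with placeholder carbon/epoxy ply properties:
Ex = effective_ex([0, 45, -45, 90, 90, -45, 45, 0],
                  t_ply=0.125e-3, E1=140e9, E2=10e9, G12=5e9, v12=0.3)
```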
Environmental management practices are trending away from simple, local- scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to impleme...
WebMeV | Informatics Technology for Cancer Research (ITCR)
Web MeV (Multiple-experiment Viewer) is a web/cloud-based tool for genomic data analysis. Web MeV is being built to meet the challenge of exploring large public genomic data sets with an intuitive graphical interface that provides access to state-of-the-art analytical tools.
Srinivas, Nuggehally R
2006-05-01
The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches including various techniques, detection systems, automation tools that are available for an effective separation, enhanced selectivity and sensitivity for quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in the pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations-simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select disease areas and/or in clinically important drug-drug interaction studies. A tabular representation of select examples of analysis is provided covering areas of separation conditions, validation aspects and applicable conclusion. A limited discussion is provided on relevant aspects of the need for developing bioanalytical procedures for speedy drug discovery and development. Additionally, some key elements such as internal standard selection, likely issues of mass detection, matrix effect, chiral aspects etc. are provided for consideration during method development.
High Bandwidth Rotary Fast Tool Servos and a Hybrid Rotary/Linear Electromagnetic Actuator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montesanti, Richard Clement
2005-09-01
This thesis describes the development of two high-bandwidth short-stroke rotary fast tool servos and the hybrid rotary/linear electromagnetic actuator developed for one of them. Design insights, trade-off methodologies, and analytical tools are developed for precision mechanical systems, power and signal electronic systems, control systems, normal-stress electromagnetic actuators, and the dynamics of the combined systems.
ERIC Educational Resources Information Center
Thompson, Andrew R.; O'Loughlin, Valerie D.
2015-01-01
Bloom's taxonomy is a resource commonly used to assess the cognitive level associated with course assignments and examination questions. Although widely utilized in educational research, Bloom's taxonomy has received limited attention as an analytical tool in the anatomical sciences. Building on previous research, the Blooming Anatomy Tool (BAT)…
The Blackbird Whistling or Just After? Vygotsky's Tool and Sign as an Analytic for Writing
ERIC Educational Resources Information Center
Imbrenda, Jon-Philip
2016-01-01
Based on Vygotsky's theory of the interplay of the tool and sign functions of language, this study presents a textual analysis of a corpus of student-authored texts to illuminate aspects of development evidenced through the dialectical tension of tool and sign. Data were drawn from a series of reflective memos I authored during a seminar for new…
Linguistics and the Study of Literature. Linguistics in the Undergraduate Curriculum, Appendix 4-D.
ERIC Educational Resources Information Center
Steward, Ann Harleman
Linguistics gives the student of literature an analytical tool whose sole purpose is to describe faithfully the workings of language. It provides a theoretical framework, an analytical method, and a vocabulary for communicating its insights--all designed to serve concerns other than literary interpretation and evaluation, but all useful for…