The generation of criteria for selecting analytical tools for landscape management
Marilyn Duffey-Armstrong
1979-01-01
This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...
Median of patient results as a tool for assessment of analytical stability.
Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György
2015-06-15
In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two Cobas INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
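A minimal sketch of the patient-median approach described in this abstract, assuming results arrive as (date, value) pairs; the reference median and the 2% allowable-bias limit are illustrative placeholders, not the study's specifications:

```python
from collections import defaultdict
from statistics import median

def monthly_medians(results):
    """results: iterable of (date, value) pairs -> {(year, month): median}."""
    buckets = defaultdict(list)
    for date, value in results:
        buckets[(date.year, date.month)].append(value)
    return {ym: median(vals) for ym, vals in sorted(buckets.items())}

def flag_unstable_months(medians, reference_median, allowable_bias_pct=2.0):
    """Return {(year, month): % deviation} for months whose median deviates
    from the long-term reference by more than the allowable analytical bias."""
    flags = {}
    for ym, m in medians.items():
        dev_pct = 100.0 * (m - reference_median) / reference_median
        if abs(dev_pct) > allowable_bias_pct:
            flags[ym] = dev_pct
    return flags
```

In practice the reference would be a long-term median for the analyte and the bias limit would be taken from biological-variation-based quality specifications, as the abstract describes.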
Tool to Prioritize Energy Efficiency Investments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farese, P.; Gelman, R.; Hendron, R.
2012-08-01
To provide analytic support to the U.S. Department of Energy's Building Technologies Program (BTP), NREL developed a Microsoft Excel-based tool to provide an open and objective comparison of the hundreds of investment opportunities available to BTP. This tool uses established methodologies to evaluate the energy savings and the cost of those savings.
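The abstract does not spell out NREL's methodology; one established metric for this kind of ranking is the levelized cost of conserved energy, sketched below with invented measure data:

```python
# Hypothetical ranking of efficiency investments by levelized cost of
# conserved energy (CCE); illustrative only, not the NREL tool's method.
def capital_recovery_factor(rate, years):
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def cce(capital_cost, annual_kwh_saved, rate=0.05, lifetime_years=15):
    """Levelized $/kWh of the savings stream."""
    return capital_cost * capital_recovery_factor(rate, lifetime_years) / annual_kwh_saved

measures = {  # name: (capital cost $, annual kWh saved) -- invented numbers
    "attic insulation": (1800, 2500),
    "LED retrofit": (400, 900),
    "heat pump water heater": (1500, 1800),
}
for name in sorted(measures, key=lambda m: cce(*measures[m])):
    print(f"{name}: {cce(*measures[name]):.3f} $/kWh saved")
```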
ERIC Educational Resources Information Center
Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew
2015-01-01
Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…
Harnessing Scientific Literature Reports for Pharmacovigilance
Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier
2017-03-22
We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
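The abstract names statistical disproportionality signal scores over drug-adverse event pairs without specifying the statistic. A standard example is the proportional reporting ratio (PRR); this sketch assumes plain 2x2 contingency counts and is not the FDA prototype's actual implementation:

```python
import math

def prr(a, b, c, d):
    """Proportional reporting ratio from 2x2 counts (all assumed nonzero).
    a: reports with drug & event; b: drug, other events;
    c: other drugs, event;      d: other drugs, other events."""
    return (a / (a + b)) / (c / (c + d))

def prr_ci95(a, b, c, d):
    """Approximate 95% CI via the standard Wald interval on the log scale."""
    log_prr = math.log(prr(a, b, c, d))
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    return (math.exp(log_prr - 1.96 * se), math.exp(log_prr + 1.96 * se))

print(prr(20, 180, 40, 9760), prr_ci95(20, 180, 40, 9760))
```

Common screening heuristics combine a PRR threshold (e.g., PRR above 2) with a minimum report count, though thresholds vary by program.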
Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data
NASA Astrophysics Data System (ADS)
Jern, Mikael
Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels aimed at measuring and collaborating, through statistical indicators, on economic, social and environmental developments, and at engaging both statisticians and the public in such activities. Given the global dimension of such a task, the "dream" of building a repository of progress indicators, where experts and public users can use collaborative GeoAnalytics tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet. One example is dynamic web-enabled animation that enables statisticians to explore temporal, spatial and multivariate demographic data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and finally communicate selected relevant knowledge to the public. Such discoveries often emerge through the diverse backgrounds and experiences of domain experts and are precious in a creative analytics reasoning process. In this context, we introduce a demonstrator, "OECD eXplorer", a customized tool for interactively analyzing and collaborating on gained insights and discoveries, based on a novel story mechanism that captures, re-uses and shares task-related explorative events.
Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen
2014-07-01
Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware will be introduced that enables the applicability of this methodology for a selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for a peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
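A minimal sketch of the spectra-to-concentration step named above (partial least squares regression on diode-array spectra); the synthetic data, array shapes, and component count are assumptions, not the authors' calibration:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_cal, n_wavelengths, n_proteins = 60, 120, 3  # calibration runs, DAD channels, analytes

C_cal = rng.uniform(0.0, 2.0, size=(n_cal, n_proteins))      # known concentrations (g/L)
S = rng.uniform(0.0, 1.0, size=(n_proteins, n_wavelengths))  # stand-in pure-component spectra
A_cal = C_cal @ S + rng.normal(0.0, 0.01, size=(n_cal, n_wavelengths))  # Beer-Lambert mixture

# Fit the calibration model: spectra -> per-protein concentrations.
pls = PLSRegression(n_components=5).fit(A_cal, C_cal)

A_inline = C_cal[:5] @ S          # pretend these spectra arrive from the detector inline
C_pred = pls.predict(A_inline)    # deconvolved concentrations for each spectrum
print(np.round(C_pred, 2))
```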
ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package
NASA Astrophysics Data System (ADS)
Jaggi, S.
1993-02-01
The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-off analyses and establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), and Noise Equivalent Temperature Difference (NETD). This paper describes the uses of the package and the physics that were used to derive the performance parameters.
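As an illustration of how such system-level figures of merit connect, the sketch below derives NETD as NER divided by the temperature sensitivity of blackbody spectral radiance; the band, scene temperature, and NER value are assumptions, not ATTIRE's internals:

```python
import math

H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light (m/s)
KB = 1.381e-23  # Boltzmann constant (J/K)

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 sr m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / (math.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0)

def netd(ner, wavelength_m=10e-6, temp_k=300.0, dt=0.1):
    """NETD (K) = NER / (dL/dT); dL/dT taken as a central difference."""
    dl_dt = (planck_radiance(wavelength_m, temp_k + dt)
             - planck_radiance(wavelength_m, temp_k - dt)) / (2.0 * dt)
    return ner / dl_dt

# Illustrative spectral NER in W/(m^2 sr m) for a 10 um band, 300 K scene:
print(f"NETD = {netd(ner=1.0e4):.3f} K")
```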
Exhaled breath condensate – from an analytical point of view
Dodig, Slavica; Čepelak, Ivana
2013-01-01
Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process of EBC analysis was investigated across the pre-analytical (formation, collection, storage of EBC), analytical (sensitivity of applied methods, standardization) and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. Limitations in the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: concentrations of EBC constituents are low, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected and EBC analysis can hopefully be used in clinical practice, in both the diagnosis and the longitudinal follow-up of patients, resulting in better outcomes of disease. PMID:24266297
Structural Integrity and Durability of Reusable Space Propulsion Systems
NASA Technical Reports Server (NTRS)
1985-01-01
The space shuttle main engine (SSME), a reusable space propulsion system, is discussed. Advances in high-pressure oxygen-hydrogen rocket technology are reported that establish the basic technology and develop new analytical tools for the evaluation of reusable rocket systems.
Analytical Model-Based Design Optimization of a Transverse Flux Machine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz
This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
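A schematic of the MEC-PSO loop, with a placeholder objective standing in for the magnetic-equivalent-circuit torque-density model and invented geometric bounds:

```python
import numpy as np

rng = np.random.default_rng(1)
lo = np.array([5.0, 2.0, 4.0])     # lower bounds (mm): pole length, magnet length, rotor thickness
hi = np.array([25.0, 10.0, 15.0])  # upper bounds (mm), invented for illustration

def torque_density(x):
    # Placeholder objective; the paper evaluates designs with its MEC model.
    return -np.sum((x - (lo + hi) / 2) ** 2)

n, iters, w, c1, c2 = 30, 100, 0.7, 1.5, 1.5
x = rng.uniform(lo, hi, (n, 3))
v = np.zeros((n, 3))
pbest = x.copy()
pbest_f = np.array([torque_density(p) for p in x])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, 3)), rng.random((n, 3))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)                 # enforce geometric constraints
    f = np.array([torque_density(p) for p in x])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmax()].copy()

print("best design (mm):", np.round(gbest, 2))
```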
Assessing analytical comparability of biosimilars: GCSF as a case study.
Nupur, Neh; Singh, Sumit Kumar; Narula, Gunjan; Rathore, Anurag S
2016-10-01
The biosimilar industry is witnessing unprecedented growth, with the newer therapeutics increasing in complexity over time. A key step towards development of a biosimilar is to establish analytical comparability with the innovator product, which would otherwise affect the safety/efficacy profile of the product. Choosing appropriate analytical tools that can fulfil this objective by qualitatively and/or quantitatively assessing the critical quality attributes (CQAs) of the product is highly critical for establishing equivalence. These CQAs cover the primary and higher order structures of the product, product related variants and impurities, as well as process related impurities and host cell related impurities. In the present work, we use such an analytical platform for assessing comparability of five approved Granulocyte Colony Stimulating Factor (GCSF) biosimilars (Emgrast, Lupifil, Colstim, Neukine and Grafeel) to the innovator product, Neupogen(®). The comparability studies involve assessing structural homogeneity, identity, secondary structure, and product related modifications. Physicochemical analytical tools, including peptide mapping with mass determination, circular dichroism (CD) spectroscopy, reversed phase chromatography (RPC) and size exclusion chromatography (SEC), have been used in this exercise. Bioactivity assessment included comparison of relative potency through in vitro cell proliferation assays. The results from extensive analytical examination offer robust evidence of structural and biological similarity of the products under consideration with the pertinent innovator product. For the most part, the biosimilar drugs were found to be comparable to the innovator drug. One anomaly that was identified was that three of the biosimilars had an atypical variant which was reported as an oxidized species in the literature; upon further investigation using RPC-FLD and ESI-MS, we found that this is likely a conformational variant of the biotherapeutic being studied. Copyright © 2016 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.
Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigations are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine the data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.
Framework for assessing key variable dependencies in loose-abrasive grinding and polishing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, J.S.; Aikens, D.M.; Brown, N.J.
1995-12-01
This memo describes a framework for identifying all key variables that determine the figuring performance of loose-abrasive lapping and polishing machines. This framework is intended as a tool for prioritizing R&D issues, assessing the completeness of process models and experimental data, and for providing a mechanism to identify any assumptions in analytical models or experimental procedures. Future plans for preparing analytical models or performing experiments can refer to this framework in establishing the context of the work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.
2009-04-14
This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradel, Lauren; Endert, Alexander; Koch, Kristen
2013-08-01
Large, high-resolution vertical displays carry the potential to increase the accuracy of collaborative sensemaking, given correctly designed visual analytics tools. From an exploratory user study using a fictional textual intelligence analysis task, we investigated how users interact with the display to construct spatial schemas and externalize information, as well as how they establish shared and private territories. We investigated the space management strategies of users partitioned by the type of tool philosophy followed (visualization- or text-centric). We classified the types of territorial behavior exhibited in terms of how the users interacted with information on the display (integrated or independent workspaces). Next, we examined how territorial behavior impacted the common ground between the pairs of users. Finally, we offer design suggestions for building future co-located collaborative visual analytics tools specifically for use on large, high-resolution vertical displays.
FT-Raman Spectroscopy: A Catalyst for the Raman Explosion?
ERIC Educational Resources Information Center
Chase, Bruce
2007-01-01
The limitations of Fourier transform (FT) Raman spectroscopy, which is used to detect and analyze the scattered radiation, are discussed. FT-Raman has served to revitalize a field that was lagging, and the presence of Raman instrumentation as a routine analytical tool is now established for the foreseeable future.
Updates in metabolomics tools and resources: 2014-2015.
Misra, Biswapriya B; van der Hooft, Justin J J
2016-01-01
Data processing and interpretation represent the most challenging and time-consuming steps in high-throughput metabolomic experiments, regardless of the analytical platform (MS or NMR spectroscopy based) used for data acquisition. Improved machinery in metabolomics generates increasingly complex datasets that create the need for more and better processing and analysis software and in silico approaches to understand the resulting data. However, a comprehensive source of information describing the utility of the most recently developed and released metabolomics resources--in the form of tools, software, and databases--is currently lacking. Thus, here we provide an overview of freely available and open-source tools, algorithms, and frameworks to make both upcoming and established metabolomics researchers aware of the recent developments, in an attempt to advance and facilitate data processing workflows in their metabolomics research. The major topics include tools and resources for data processing, data annotation, and data visualization in MS- and NMR-based metabolomics. Most tools described in this review are dedicated to untargeted metabolomics workflows; however, some more specialist tools are described as well. All tools and resources described, including their analytical and computational platform dependencies, are summarized in an overview table. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Bradie, Johanna; Gianoli, Claudio; He, Jianjun; Lo Curto, Alberto; Stehouwer, Peter; Veldhuis, Marcel; Welschmeyer, Nick; Younan, Lawrence; Zaake, André; Bailey, Sarah
2018-03-01
Non-indigenous species seriously threaten native biodiversity. To reduce new establishments, the International Maritime Organization established the Convention for the Control and Management of Ships' Ballast Water and Sediments, which limits organism concentrations at discharge under regulation D-2. Most ships will comply by using on-board treatment systems to disinfect their ballast water. Port state control officers will need simple, rapid methods to detect compliance. Appropriate monitoring methods may depend on treatment type, since different treatments affect organisms by a variety of mechanisms. Many indicative tools have been developed, but they must be examined to ensure the measured variable is an appropriate signal for the response of the organisms to the applied treatment. We assessed the abilities of multiple analytic tools to rapidly detect the effects of a ballast water treatment system based on UV disinfection. All devices detected a large decrease in the concentrations of vital organisms ≥ 50 μm and organisms < 10 μm (mean 82.7-99.7% decrease across devices), but results were more variable for the ≥ 10 to < 50 μm size class (mean 9.0-99.9% decrease across devices). These results confirm the necessity of choosing tools capable of detecting the damage inflicted on living organisms, as examined herein for UV-C treatment systems.
Green Chemistry Metrics with Special Reference to Green Analytical Chemistry.
Tobiszewski, Marek; Marć, Mariusz; Gałuszka, Agnieszka; Namieśnik, Jacek
2015-06-12
The concept of green chemistry is widely recognized in chemical laboratories. To properly measure an environmental impact of chemical processes, dedicated assessment tools are required. This paper summarizes the current state of knowledge in the field of development of green chemistry and green analytical chemistry metrics. The diverse methods used for evaluation of the greenness of organic synthesis, such as eco-footprint, E-Factor, EATOS, and Eco-Scale are described. Both the well-established and recently developed green analytical chemistry metrics, including NEMI labeling and analytical Eco-scale, are presented. Additionally, this paper focuses on the possibility of the use of multivariate statistics in evaluation of environmental impact of analytical procedures. All the above metrics are compared and discussed in terms of their advantages and disadvantages. The current needs and future perspectives in green chemistry metrics are also discussed.
Bio-TDS: bioscience query tool discovery system.
Gnimpieba, Etienne Z; VanDiermen, Menno S; Gustafson, Shayla M; Conn, Bill; Lushbough, Carol M
2017-01-04
Bioinformatics and computational biology play a critical role in bioscience and biomedical research. As researchers design their experimental projects, one major challenge is to find the most relevant bioinformatics toolkits that will lead to new knowledge discovery from their data. The Bio-TDS (Bioscience Query Tool Discovery Systems, http://biotds.org/) has been developed to assist researchers in retrieving the most applicable analytic tools by allowing them to formulate their questions as free text. The Bio-TDS is a flexible retrieval system that affords users from multiple bioscience domains (e.g. genomic, proteomic, bio-imaging) the ability to query over 12 000 analytic tool descriptions integrated from well-established community repositories. One of the primary components of the Bio-TDS is the ontology and natural language processing workflow for annotation, curation, query processing, and evaluation. The Bio-TDS's scientific impact was evaluated using sample questions posed by researchers retrieved from Biostars, a site focusing on biological data analysis. The Bio-TDS was compared to five similar bioscience analytic tool retrieval systems, with the Bio-TDS outperforming the others in terms of relevance and completeness. The Bio-TDS offers researchers the capacity to associate their bioscience question with the most relevant computational toolsets required for the data analysis in their knowledge discovery process. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
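Bio-TDS couples ontology annotation with an NLP retrieval workflow; as a rough, hypothetical stand-in for that pipeline, a free-text question can be ranked against tool descriptions with TF-IDF cosine similarity (the descriptions below are invented examples):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

tools = {  # invented tool descriptions, standing in for the 12 000 curated ones
    "SAMtools": "utilities for manipulating sequence alignments in SAM/BAM format",
    "ImageJ": "image processing and analysis for microscopy bio-imaging data",
    "MaxQuant": "quantitative analysis of large mass-spectrometric proteomic data",
}
query = "how do I quantify proteins from mass spectrometry runs?"

vec = TfidfVectorizer(stop_words="english")
matrix = vec.fit_transform(list(tools.values()) + [query])
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
for name, score in sorted(zip(tools, scores), key=lambda t: -t[1]):
    print(f"{score:.2f}  {name}")
```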
A Structural Model of Prospective Science Teachers' Nature of Science Views
ERIC Educational Resources Information Center
Mugaloglu, Ebru Z.; Bayram, Hale
2010-01-01
This study aims to establish a viable structural model of prospective science teachers' nature of science (NOS) views, which could be used as an analytical tool for understanding the complex relationships between prospective teachers' conceptions of NOS and factors possibly affecting their conceptions. In order to construct such a model, likely…
Analytical aspects of plant metabolite profiling platforms: current standings and future aims.
Seger, Christoph; Sturm, Sonja
2007-02-01
Over the past years, metabolic profiling has been established as a comprehensive systems biology tool. Mass spectrometry or NMR spectroscopy-based technology platforms, combined with unsupervised or supervised multivariate statistical methodologies, allow a deep insight into the complex metabolite patterns of plant-derived samples. Within this review, we provide a thorough introduction to the analytical hardware and software requirements of metabolic profiling platforms. Methodological limitations are addressed, and the metabolic profiling workflow is exemplified by summarizing recent applications ranging from model systems to more applied topics.
Singular value decomposition for the truncated Hilbert transform
NASA Astrophysics Data System (ADS)
Katsevich, A.
2010-11-01
Starting from a breakthrough result by Gelfand and Graev, inversion of the Hilbert transform became a very important tool for image reconstruction in tomography. In particular, their result is useful when the tomographic data are truncated and one deals with an interior problem. As was established recently, the interior problem admits a stable and unique solution when some a priori information about the object being scanned is available. The most common approach to solving the interior problem is based on converting it to the Hilbert transform and performing analytic continuation. Depending on what type of tomographic data are available, one gets different Hilbert inversion problems. In this paper, we consider two such problems and establish singular value decomposition for the operators involved. We also propose algorithms for performing analytic continuation.
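For orientation, a notation sketch (not the paper's exact statement): the finite Hilbert transform on an interval, and the singular system sought for the truncated-data operator.

```latex
% Finite Hilbert transform of f on (a, b), principal-value integral:
\[
  (H f)(x) \;=\; \frac{1}{\pi}\,\mathrm{p.v.}\!\int_{a}^{b} \frac{f(y)}{y - x}\,dy .
\]
% With data known only on a sub-interval, one studies the restriction
% A : L^2(I_1) \to L^2(I_2) and seeks its singular system (sigma_n, u_n, v_n):
\[
  A f \;=\; \sum_{n} \sigma_n \,\langle f, u_n\rangle\, v_n ,
  \qquad A u_n = \sigma_n v_n , \quad A^{*} v_n = \sigma_n u_n .
\]
```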
NASA Technical Reports Server (NTRS)
Daly, J. K.; Torian, J. G.
1979-01-01
An overview of studies conducted to establish the requirements for advanced subsystem analytical tools is presented. Modifications are defined for updating current computer programs used to analyze environmental control, life support, and electric power supply systems so that consumables for future advanced spacecraft may be managed.
ERIC Educational Resources Information Center
Wyman, Steven K.; And Others
This exploratory study establishes analytical tools (based on both technical criteria and user feedback) by which federal Web site administrators may assess the quality of their websites. The study combined qualitative and quantitative data collection techniques to achieve the following objectives: (1) identify and define key issues regarding…
ERIC Educational Resources Information Center
Geisler, Cheryl
2018-01-01
Coding, the analytic task of assigning codes to nonnumeric data, is foundational to writing research. A rich discussion of methodological pluralism has established the foundational importance of systematicity in the task of coding, but less attention has been paid to the equally important commitment to language complexity. Addressing the interplay…
NASA Technical Reports Server (NTRS)
Wang, Peter Hor-Ching
1996-01-01
This study is a continuation of summer research performed under the 1995 NASA/ASEE Summer Faculty Fellowship Program. The effort provides the infrastructure of an integrated Virtual Reality (VR) environment for the International Space Welding Experiment (ISWE) Analytical Tool and Trainer and the Microgravity Science Glovebox (MSG) Analytical Tool study. Due to the unavailability of the MSG CAD files and the 3D-CAD converter, little was done on the MSG study; however, the infrastructure of the integrated VR environment for ISWE is capable of supporting the MSG study when the CAD files become available. Two primary goals were established for this research. First, the essential peripheral devices for an integrated VR environment were studied and developed for the ISWE and MSG studies. Secondly, the training of the flight crew (astronaut) in general orientation and procedures, and in the location, orientation, and sequencing of the welding samples and tools, is built into the VR system for studying the welding process and training the astronaut.
Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Aquino, Adriano; Cervantes, Cesar; Carrilho, Emanuel
2016-09-07
We present here a critical review covering conventional analytical tools of recombinant drug analysis and discuss their evolution towards miniaturized systems foreseeing a possible unique recombinant drug-on-a-chip device. Recombinant protein drugs and/or pro-drug analysis require sensitive and reproducible analytical techniques for quality control to ensure safety and efficacy of drugs according to regulatory agencies. The versatility of miniaturized systems combined with their low-cost could become a major trend in recombinant drugs and bioprocess analysis. Miniaturized systems are capable of performing conventional analytical and proteomic tasks, allowing for interfaces with other powerful techniques, such as mass spectrometry. Microdevices can be applied during the different stages of recombinant drug processing, such as gene isolation, DNA amplification, cell culture, protein expression, protein separation, and analysis. In addition, organs-on-chips have appeared as a viable alternative to testing biodrug pharmacokinetics and pharmacodynamics, demonstrating the capabilities of the miniaturized systems. The integration of individual established microfluidic operations and analytical tools in a single device is a challenge to be overcome to achieve a unique recombinant drug-on-a-chip device. Copyright © 2016 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson Khosah
2007-07-31
Advanced Technology Systems, Inc. (ATS) was contracted by the U.S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.
Source-term development for a contaminant plume for use by multimedia risk assessment models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whelan, Gene; McDonald, John P.; Taira, Randal Y.
1999-12-01
Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments for use at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world, Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.
Influence of Wake Models on Calculated Tiltrotor Aerodynamics
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2001-01-01
The tiltrotor aircraft configuration has the potential to revolutionize air transportation by providing an economical combination of vertical take-off and landing capability with efficient, high-speed cruise flight. To achieve this potential it is necessary to have validated analytical tools that will support future tiltrotor aircraft development. These analytical tools must calculate tiltrotor aeromechanical behavior, including performance, structural loads, vibration, and aeroelastic stability, with an accuracy established by correlation with measured tiltrotor data. The recent test of the Tilt Rotor Aeroacoustic Model (TRAM) with a single, 1/4-scale V-22 rotor in the German-Dutch Wind Tunnel (DNW) provides an extensive set of aeroacoustic, performance, and structural loads data. This paper will examine the influence of wake models on calculated tiltrotor aerodynamics, comparing calculations of performance and airloads with TRAM DNW measurements. The calculations will be performed using the comprehensive analysis CAMRAD II.
Cretini, K.F.; Steyer, G.D.
2011-01-01
The Coastwide Reference Monitoring System (CRMS) program was established to assess the effectiveness of individual coastal restoration projects and the cumulative effects of multiple projects at regional and coastwide scales. In order to make these assessments, analytical teams have been assembled for each of the primary data types sampled under the CRMS program, including vegetation, hydrology, landscape, and soils. These teams consist of scientists and support staff from the U.S. Geological Survey and other Federal agencies, the Louisiana Office of Coastal Protection and Restoration, and university academics. Each team is responsible for developing or identifying parameters, indices, or tools that can be used to assess coastal wetlands at various scales. The CRMS Vegetation Analytical Team has developed a Floristic Quality Index for coastal Louisiana to determine the quality of a wetland based on its plant species composition and abundance.
Johner, S A; Boeing, H; Thamm, M; Remer, T
2015-12-01
The assessment of urinary excretion of specific nutrients (e.g. iodine, sodium) is frequently used to monitor a population's nutrient status. However, when only spot urines are available, there is always a risk of hydration-status-dependent dilution effects and related misinterpretations. The aim of the present study was to establish mean values of 24-h creatinine excretion widely applicable for an appropriate estimation of 24-h excretion rates of analytes from spot urines in adults. Twenty-four-hour creatinine excretion from the formerly representative cross-sectional German VERA Study (n=1463, 20-79 years old) was analysed. Linear regression analysis was performed to identify the most important factors influencing creatinine excretion. In a subsample of the German DONALD Study (n=176, 20-29 years old), the applicability of the 24-h creatinine excretion values of VERA for the estimation of 24-h sodium and iodine excretion from urinary concentration measurements was tested. In the VERA Study, mean 24-h creatinine excretion was 15.4 mmol per day in men and 11.1 mmol per day in women, significantly dependent on sex, age, body weight and body mass index. Based on the established 24-h creatinine excretion values, mean 24-h iodine and sodium excretions could be estimated from the respective analyte/creatinine concentrations, with average deviations <10% compared with the actual 24-h means. The present mean values of 24-h creatinine excretion are suggested as a useful tool to derive realistic hydration-status-independent average 24-h excretion rates from urinary analyte/creatinine ratios. We propose applying these creatinine reference means routinely in biomarker-based studies aiming to characterize the nutrient or metabolite status of adult populations by simply measuring metabolite/creatinine ratios in spot urines.
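In symbols, the estimation rule the study proposes amounts to scaling the spot-urine concentration ratio by the sex- and age-specific mean 24-h creatinine excretion:

```latex
% Hydration-independent estimate of 24-h analyte excretion from a spot urine;
% the sex/age stratification follows the study's reported dependencies.
\[
  \widehat{E}_{\text{analyte}}^{\,24\mathrm{h}}
  \;=\;
  \left(\frac{c_{\text{analyte}}}{c_{\text{creatinine}}}\right)_{\!\text{spot urine}}
  \times\; \bar{E}_{\text{creatinine}}^{\,24\mathrm{h}}(\text{sex, age}) .
\]
```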
Analytical tools for characterizing biopharmaceuticals and the implications for biosimilars
Berkowitz, Steven A.; Engen, John R.; Mazzeo, Jeffrey R.; Jones, Graham B.
2013-01-01
Biologics such as monoclonal antibodies are much more complex than small-molecule drugs, which raises challenging questions for the development and regulatory evaluation of follow-on versions of such biopharmaceutical products (also known as biosimilars) and their clinical use once patent protection for the pioneering biologic has expired. With the recent introduction of regulatory pathways for follow-on versions of complex biologics, the role of analytical technologies in comparing biosimilars with the corresponding reference product is attracting substantial interest in establishing the development requirements for biosimilars. Here, we discuss the current state of the art in analytical technologies to assess three characteristics of protein biopharmaceuticals that regulatory authorities have identified as being important in development strategies for biosimilars: post-translational modifications, three-dimensional structures and protein aggregation. PMID:22743980
Enhance your team-based qualitative research.
Fernald, Douglas H; Duclos, Christine W
2005-01-01
Qualitative research projects often involve the collaborative efforts of a research team. Challenges inherent in teamwork include changes in membership and differences in analytical style, philosophy, training, experience, and skill. This article discusses teamwork issues and the tools and techniques used to improve team-based qualitative research. We drew on our experiences in working on numerous projects of varying size, duration, and purpose. Through trials of different tools and techniques, expert consultation, and review of the literature, we learned to improve how we build teams, manage information, and disseminate results. Attention given to team members and team processes is as important as choosing appropriate analytical tools and techniques. Attentive team leadership, commitment to early and regular team meetings, and discussion of roles, responsibilities, and expectations all help build more effective teams and establish clear norms. As data are collected and analyzed, it is important to anticipate potential problems arising from differing skills and styles and from how information and files are managed. Discuss analytical preferences and biases and set clear guidelines and practices for how data will be analyzed and handled. As emerging ideas and findings disperse across team members, common tools (such as summary forms and data grids), coding conventions, intermediate goals or products, and regular documentation help capture essential ideas and insights. In a team setting, little should be left to chance. This article identifies ways to improve team-based qualitative research with a more considered and systematic approach. Qualitative researchers will benefit from further examination and discussion of effective, field-tested, team-based strategies.
Technical pre-analytical effects on the clinical biochemistry of Atlantic salmon (Salmo salar L.).
Braceland, M; Houston, K; Ashby, A; Matthews, C; Haining, H; Rodger, H; Eckersall, P D
2017-01-01
Clinical biochemistry has long been utilized in human and veterinary medicine as a vital diagnostic tool, but despite occasional studies showing its usefulness in monitoring health status in Atlantic salmon (Salmo salar L.), it has not yet been widely utilized within the aquaculture industry. This is due, in part, to a lack of an agreed protocol for collection and processing of blood prior to analysis. Moreover, while the analytical phase of clinical biochemistry is well controlled, there is a growing understanding that technical pre-analytical variables can influence analyte concentrations or activities. In addition, post-analytical interpretation of treatment effects is variable in the literature, thus making the true effect of sample treatment hard to evaluate. Therefore, a number of pre-analytical treatments have been investigated to examine their effect on analyte concentrations and activities. In addition, reference ranges for salmon plasma biochemical analytes have been established to inform veterinary practitioners and the aquaculture industry of the importance of clinical biochemistry in health and disease monitoring. Furthermore, a standardized protocol for blood collection has been proposed. © 2016 The Authors Journal of Fish Diseases Published by John Wiley & Sons Ltd.
Characterization and measurement of polymer wear
NASA Technical Reports Server (NTRS)
Buckley, D. H.; Aron, P. R.
1984-01-01
Analytical tools which characterize the polymer wear process are discussed. The techniques discussed include: visual observation of polymer wear with SEM; quantification with surface profilometry and ellipsometry; study of the chemistry with AES, XPS and SIMS; establishment of interfacial polymer orientation, and accordingly bonding, with QUARTIR; characterization of polymer state with Raman spectroscopy; and measurement of the stresses that develop in polymer films using an X-ray double-crystal camera technique.
Valdez, Rodolfo; Yoon, Paula W; Qureshi, Nadeem; Green, Ridgely Fisk; Khoury, Muin J
2010-01-01
Family history is a risk factor for many chronic diseases, including cancer, cardiovascular disease, and diabetes. Professional guidelines usually include family history to assess health risk, initiate interventions, and motivate behavioral changes. The advantages of family history over other genomic tools include a lower cost, greater acceptability, and a reflection of shared genetic and environmental factors. However, the utility of family history in public health has been poorly explored. To establish family history as a public health tool, it needs to be evaluated within the ACCE framework (analytical validity; clinical validity; clinical utility; and ethical, legal, and social issues). Currently, private and public organizations are developing tools to collect standardized family histories of many diseases. Their goal is to create family history tools that have decision support capabilities and are compatible with electronic health records. These advances will help realize the potential of family history as a public health tool.
Wegner, S; Bauer, J I; Dietrich, R; Märtlbauer, E; Usleber, E; Gottschalk, C; Gross, M
2017-02-01
A simplified method to produce specific polyclonal rabbit antibodies against sterigmatocystin (STC) was established, using a STC-glycolic acid-ether derivative (STC-GE) conjugated to keyhole limpet haemocyanin (immunogen). The competitive direct enzyme immunoassay (EIA) established for STC had a detection limit (20% binding inhibition) of 130 pg ml(-1). The test was highly specific for STC, with minor cross-reactivity with O-methylsterigmatocystin (OMSTC, 0·87%) and negligible reactivity with aflatoxins (<0·02%). STC-EIA was used in combination with a previously developed specific EIA for aflatoxins (<0·1% cross-reactivity with STC and OMSTC) to study the STC/aflatoxin production profiles of reference strains of Aspergillus species. This immunochemotaxonomic procedure was found to be a convenient tool to identify STC- or aflatoxin-producing strains. The carcinogenic mycotoxin sterigmatocystin (STC) is produced by several Aspergillus species, either alone or together with aflatoxins. Here, we report a very simple and straightforward procedure to obtain highly sensitive and specific anti-STC antibodies, and their use in the first ever real STC-specific competitive direct enzyme immunoassay (EIA). In combination with a previous EIA for aflatoxins, this study for the first time demonstrates the potential of a STC/aflatoxin EIA pair for what is branded as 'immunochemotaxonomic' identification of mycotoxigenic Aspergillus species. This new analytical tool enhances analytical possibilities for differential analysis of STC and aflatoxins. © 2016 The Society for Applied Microbiology.
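A sketch of how a "20% binding inhibition" detection limit can be read off a calibration curve, assuming a four-parameter logistic standard curve; the calibration points below are invented, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: a upper asymptote, d lower, c IC50, b slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])   # ng/mL, invented
signal = np.array([1.98, 1.95, 1.82, 1.45, 0.95, 0.52, 0.31])

(a, b, c, d), _ = curve_fit(four_pl, conc, signal, p0=[2.0, 1.0, 0.5, 0.2])
target = d + 0.8 * (a - d)                       # 20% inhibition of max signal
lod = c * ((a - d) / (target - d) - 1.0) ** (1.0 / b)  # invert the 4PL at target
print(f"detection limit ~= {lod:.3f} ng/mL")
```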
A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.
Płotka-Wasylka, J
2018-05-01
A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, colored from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
Wang, Jindong; Chen, Peng; Deng, Yufen; Guo, Junjie
2018-01-01
As a three-dimensional measuring instrument, the laser tracker is widely used in industrial measurement. To avoid the influence of angle measurement error on the overall measurement accuracy, multi-station and time-sharing measurement with a laser tracker is introduced on the basis of the global positioning system (GPS) principle in this paper. For the proposed method, how to accurately determine the coordinates of each measuring point by using a large amount of measured data is a critical issue. Taking the detection of the motion error of a numerical control machine tool as an example, the corresponding measurement algorithms are investigated thoroughly. By establishing the mathematical model of detecting the motion error of a machine tool with this method, an analytical algorithm for base station calibration and measuring point determination is deduced that does not require selecting an initial iterative value. However, when the motion area of the machine tool is in a 2D plane, the coefficient matrix of the base station calibration is singular, which produces a distorted result. In order to overcome this limitation of the original algorithm, an improved analytical algorithm is also derived. Meanwhile, the calibration accuracy of the base station with the improved algorithm is compared with that of the original analytical algorithm and of iterative algorithms such as the Gauss-Newton and Levenberg-Marquardt algorithms. The experiment further verifies the feasibility and effectiveness of the improved algorithm. In addition, the different motion areas of the machine tool have a certain influence on the calibration accuracy of the base station, and the corresponding influence of measurement error on the calibration result, depending on the condition number of the coefficient matrix, is analyzed.
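A sketch of the GPS-style point determination the abstract describes, using distance-only residuals and invented station geometry; note the paper derives an analytical solution that avoids choosing an initial iterative value, whereas this illustration uses an iterative least-squares solve:

```python
import numpy as np
from scipy.optimize import least_squares

stations = np.array([[0.0, 0.0, 0.0],    # four calibrated base positions (m), invented
                     [2.0, 0.0, 0.0],
                     [0.0, 2.0, 0.5],
                     [1.0, 1.0, 1.5]])
true_point = np.array([0.7, 0.9, 0.4])
rng = np.random.default_rng(2)
dists = np.linalg.norm(stations - true_point, axis=1) + rng.normal(0, 1e-6, 4)

# Residuals: predicted station-to-point ranges minus measured ranges.
residual = lambda p: np.linalg.norm(stations - p, axis=1) - dists
sol = least_squares(residual, x0=np.array([1.0, 1.0, 1.0]))
print("recovered point (m):", np.round(sol.x, 6))
```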
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson P. Khosah; Frank T. Alex
2007-02-11
Advanced Technology Systems, Inc. (ATS) was contracted by the U.S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson P. Khosah; Charles G. Crawford
2006-02-11
Advanced Technology Systems, Inc. (ATS) was contracted by the U.S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson P. Khosah; Charles G. Crawford
Advanced Technology Systems, Inc. (ATS) was contracted by the U.S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its eleventh month of Phase 1 development activities.
Suba, Dávid; Urbányi, Zoltán; Salgó, András
2016-10-01
Capillary electrophoresis techniques are widely used in analytical biotechnology. Different electrophoretic techniques are very adequate tools to monitor size- and charge heterogeneities of protein drugs. Method descriptions and development studies of capillary zone electrophoresis (CZE) have been described in the literature, most of them based on the classical one-factor-at-a-time (OFAT) approach. In this study a very simple method development approach is described for capillary zone electrophoresis: a "two-phase-four-step" approach is introduced which allows a rapid, iterative method development process and can be a good platform for CZE method development. In every step the current analytical target profile and an appropriate control strategy were established to monitor the current stage of development. A very good platform was established to investigate intact and digested protein samples. A commercially available monoclonal antibody was chosen as the model protein for the method development study. The CZE method was qualified after the development process and the results are presented. The analytical system stability was represented by the calculated RSD% values of the area percentage and migration time of the selected peaks (<0.8% and <5%) during the intermediate precision investigation. Copyright © 2016 Elsevier B.V. All rights reserved.
Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang
2016-10-01
Currently, near-infrared spectroscopy (NIRS) is considered an efficient tool for achieving process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection is introduced. The design of the process analytical system is described in detail, including the selection of monitored processes and testing mode, and potential risks that should be avoided. Moreover, the development of related technologies is also presented, including the establishment of monitoring methods for the elution steps of the polyamide resin and macroporous resin chromatography processes, as well as a rapid analysis method for finished products. Based on the authors' research and working experience, several issues in the application of NIRS to process monitoring and control in TCM production are then raised, and some potential solutions are discussed. The issues include building a technical team for the process analytical system, the design of the process analytical system in the manufacture of TCM products, standardization of NIRS-based analytical methods, and improving the management of the process analytical system. Finally, the prospects for the application of NIRS in the TCM industry are put forward. Copyright© by the Chinese Pharmaceutical Association.
The Barcode of Life Data Portal: Bridging the Biodiversity Informatics Divide for DNA Barcoding
Sarkar, Indra Neil; Trizna, Michael
2011-01-01
With the volume of molecular sequence data that is systematically being generated globally, there is a need for centralized resources for data exploration and analytics. DNA Barcode initiatives are on track to generate a compendium of molecular sequence-based signatures for identifying animals and plants. To date, the data exploration and analytic tools available for these data have existed only in boutique form, often representing a frustrating hurdle for many researchers who may not have the resources to install or implement algorithms described by the analytic community. The Barcode of Life Data Portal (BDP) is a first step towards integrating the latest biodiversity informatics innovations with molecular sequence data from DNA barcoding. Through the establishment of community-driven standards, based on discussion with the Data Analysis Working Group (DAWG) of the Consortium for the Barcode of Life (CBOL), the BDP provides an infrastructure for incorporation of existing and next-generation DNA barcode analytic applications in an open forum. PMID:21818249
Reproducibility studies for experimental epitope detection in macrophages (EDIM).
Japink, Dennis; Nap, Marius; Sosef, Meindert N; Nelemans, Patty J; Coy, Johannes F; Beets, Geerard; von Meyenfeldt, Maarten F; Leers, Math P G
2014-05-01
We have recently described epitope detection in macrophages (EDIM) by flow cytometry. This is a promising tool for the diagnosis and follow-up of malignancies. However, biological and technical validation is warranted before clinical applicability can be explored. The pre-analytic and analytic phases were investigated. Five different aspects were assessed: blood sample stability, intra-individual variability in healthy persons, intra-assay variation, inter-assay variation and assay transferability. The post-analytic phase was already partly standardized and described in an earlier study. The outcomes in the pre-analytic phase showed that samples are stable for 24 h after venipuncture. Biological variation over time was similar to that of serum tumor marker assays; each patient has a baseline value. Intra-assay variation showed good reproducibility, while inter-assay variation showed reproducibility similar to that of established serum tumor marker assays. Furthermore, the assay showed excellent transferability between analyzers. Under optimal analytic conditions the EDIM method is technically stable, reproducible and transferable. Biological variation over time needs further assessment in future work. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Panontin, Tina; Carvalho, Robert; Keller, Richard
2004-01-01
Contents include the following: Overview of the Application; Input Data; Analytical Process; Tool's Output; and Application of the Results of the Analysis. The tool enables the first element through a Web-based application that can be accessed by distributed teams to store and retrieve any type of digital investigation material in a secure environment. The second is accomplished by making the relationships between information explicit through the use of a semantic network, a structure that literally allows an investigator or team to "connect the dots." The third element, the significance of the correlated information, is established through causality and consistency tests using a number of different methods embedded within the tool, including fault trees, event sequences, and other accident models. And finally, the evidence gathered and structured within the tool can be directly, electronically archived to preserve the evidence and investigative reasoning.
On Establishing Big Data Wave Breakwaters with Analytics (Invited)
NASA Astrophysics Data System (ADS)
Riedel, M.
2013-12-01
The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data, as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities, and offers different views of how correlation and causality approaches provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand what approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications in order to provide clear recommendations to the scientific community about what approaches are technically and scientifically feasible.
Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO
NASA Technical Reports Server (NTRS)
Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.
2016-01-01
A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
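The derivative-validation step described above can be pictured with a generic check of an analytic derivative against a central finite difference; the function below is an arbitrary stand-in, not the CEA thermodynamics or the OpenMDAO API.

```python
import numpy as np

# Compare an analytic derivative with a central finite difference,
# the same kind of cross-check used to validate the tool's derivatives.
def f(x):
    return np.exp(-x) * np.sin(3.0 * x)

def df_analytic(x):
    # d/dx [exp(-x) sin(3x)] = exp(-x) (3 cos(3x) - sin(3x))
    return np.exp(-x) * (3.0 * np.cos(3.0 * x) - np.sin(3.0 * x))

def df_fd(x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2.0 * h)

x = np.linspace(0.0, 2.0, 5)
rel_err = np.abs(df_analytic(x) - df_fd(x)) / np.abs(df_analytic(x))
print(rel_err)  # ~1e-10: agreement up to finite-difference truncation error
```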
Electronic laboratory notebooks: progress and challenges in implementation.
Machina, Hari K; Wild, David J
2013-08-01
Electronic laboratory notebooks (ELNs) are increasingly replacing paper notebooks in life science laboratories, including those in industry, academic settings, and hospitals. ELNs offer significant advantages over paper notebooks, but adopting them in a predominantly paper-based environment is usually disruptive. The benefits of ELNs increase when they are integrated with other laboratory informatics tools such as laboratory information management systems, chromatography data systems, analytical instrumentation, and scientific data management systems, but there is no well-established path for effective integration of these tools. In this article, we review and evaluate some of the approaches that have been taken thus far, and also some radical new methods of integration that are emerging.
Investigating Analytic Tools for e-Book Design in Early Literacy Learning
ERIC Educational Resources Information Center
Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah
2009-01-01
Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…
Yan, Guanyong; Wang, Xiangzhao; Li, Sikun; Yang, Jishuo; Xu, Dongbo; Erdmann, Andreas
2014-03-10
We propose an in situ aberration measurement technique based on an analytical linear model of through-focus aerial images. The aberrations are retrieved from aerial images of six isolated space patterns, which have the same width but different orientations. The imaging formulas of the space patterns are investigated and simplified, and then an analytical linear relationship between the aerial image intensity distributions and the Zernike coefficients is established. The linear relationship is composed of linear fitting matrices and rotation matrices, which can be calculated numerically in advance and utilized to retrieve Zernike coefficients. Numerical simulations using the lithography simulators PROLITH and Dr.LiTHO demonstrate that the proposed method can measure wavefront aberrations up to Z(37). Experiments on a real lithography tool confirm that our method can monitor lens aberration offset with an accuracy of 0.7 nm.
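A toy version of such a linear retrieval, assuming the sensitivity matrix relating intensity changes to Zernike coefficients has already been precomputed (here it is random stand-in data, not a lithographic imaging model):

```python
import numpy as np

# If the through-focus intensity change dI depends approximately
# linearly on the Zernike coefficients z via a precomputed matrix M
# (dI = M @ z), the coefficients follow from linear least squares.
rng = np.random.default_rng(0)
n_pix, n_zern = 500, 36                      # intensity samples, Zernike terms
M = rng.normal(size=(n_pix, n_zern))         # stand-in sensitivity matrix
z_true = rng.normal(scale=0.02, size=n_zern) # "aberrations" (waves)
dI = M @ z_true + rng.normal(scale=1e-4, size=n_pix)  # noisy "image" data

z_est, *_ = np.linalg.lstsq(M, dI, rcond=None)
print(np.max(np.abs(z_est - z_true)))        # small retrieval error
```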
Agüera, Ana; Martínez Bueno, María Jesús; Fernández-Alba, Amadeo R
2013-06-01
Since the so-called emerging contaminants were established as a new group of pollutants of environmental concern, a great effort has been devoted to the knowledge of their distribution, fate and effects in the environment. After more than 20 years of work, a significant improvement in knowledge about these contaminants has been achieved, but there is still a large gap of information on the growing number of new potential contaminants that are appearing, and especially on their unpredictable transformation products. Although the environmental problem arising from emerging contaminants must be addressed from an interdisciplinary point of view, it is obvious that analytical chemistry plays an important role as the first step of the study, as it allows establishing the presence of chemicals in the environment, estimating their concentration levels, identifying sources and determining their degradation pathways. These tasks involve serious difficulties requiring different analytical solutions adjusted to purpose. Thus, the complexity of the matrices requires highly selective analytical methods; the large number and variety of compounds potentially present in the samples demands the application of wide-scope methods; the low concentrations at which these contaminants are present in the samples require a high detection sensitivity; and high demands on confirmation and structural information arise in the characterisation of unknowns. New developments in analytical instrumentation have been applied to solve these difficulties. Furthermore, and no less important, has been the development of new specific software packages intended for data acquisition and, in particular, for post-run analysis. Thus, the use of sophisticated software tools has allowed successful screening analysis, determining several hundreds of analytes, and has assisted in the structural elucidation of unknown compounds in a timely manner.
The Geoinformatica free and open source software stack
NASA Astrophysics Data System (ADS)
Jolma, A.
2012-04-01
The Geoinformatica free and open source software (FOSS) stack is based mainly on three established FOSS components, namely GDAL, GTK+, and Perl. GDAL provides access to a very large selection of geospatial data formats and data sources, a generic geospatial data model, and a large collection of geospatial analytical and processing functionality. GTK+ and the Cairo graphics library provide generic graphics and graphical user interface capabilities. Perl is a programming language for which there is a very large set of FOSS modules for a wide range of purposes, and which can be used as an integrative tool for building applications. In the Geoinformatica stack, data storages such as the FOSS RDBMS PostgreSQL with its geospatial extension PostGIS can be used below the three above-mentioned components. The top layer of Geoinformatica consists of a C library and several Perl modules. The C library comprises a general-purpose raster algebra library, hydrological terrain analysis functions, and visualization code. The Perl modules define a generic visualized geospatial data layer and subclasses for raster and vector data and graphs. The hydrological terrain functions are already rather old and suffer, for example, from the requirement of in-memory rasters. Newer research conducted using the platform includes basic geospatial simulation modeling, visualization of ecological data, linking with a Bayesian network engine for spatial risk assessment in coastal areas, and developing standards-based distributed water resources information systems on the Internet. The Geoinformatica stack constitutes a platform for geospatial research, which is targeted towards custom analytical tools, prototyping, and linking with external libraries. Writing custom analytical tools is supported by the Perl language and the large collection of tools available especially in GDAL and the Perl modules. Prototyping is supported by the GTK+ library, the GUI tools, and the support for object-oriented programming in Perl. New feature types, geospatial layer classes, and tools as extensions with specific features can be defined, used, and studied. Linking with external libraries is possible using the Perl foreign function interface tools or generic tools such as SWIG. We are interested in implementing and testing links between Geoinformatica and existing or new, more specific hydrological FOSS.
33 CFR 385.33 - Revisions to models and analytical tools.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...
Developing Healthcare Data Analytics APPs with Open Data Science Tools.
Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong
2017-01-01
Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. Many tools, however, require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.
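A minimal sketch of this kind of notebook-to-APP practice using one common open tool, ipywidgets, inside Jupyter Notebook; the data frame, column names, and threshold logic are invented for illustration and are not the authors' pipeline.

```python
# Turn a plain analysis function into an interactive notebook "APP":
# the @interact decorator renders a slider and re-runs the function
# on every change, giving real-time analysis without extra UI code.
import pandas as pd
from ipywidgets import interact

df = pd.DataFrame({"age": [34, 51, 29, 62],
                   "glucose": [5.1, 7.8, 4.9, 8.4]})  # toy patient data

@interact(threshold=(4.0, 10.0, 0.1))
def flag_high_glucose(threshold=7.0):
    # Rows exceeding the chosen threshold are displayed under the slider.
    return df[df["glucose"] > threshold]
```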
Glucose Biosensors: An Overview of Use in Clinical Practice
Yoo, Eun-Hyung; Lee, Soo-Youn
2010-01-01
Blood glucose monitoring has been established as a valuable tool in the management of diabetes. Since maintaining normal blood glucose levels is recommended, a series of suitable glucose biosensors have been developed. During the last 50 years, glucose biosensor technology, including point-of-care devices, continuous glucose monitoring systems and noninvasive glucose monitoring systems, has been significantly improved. However, there continue to be several challenges related to the achievement of accurate and reliable glucose monitoring. Further technical improvements in glucose biosensors, standardization of the analytical goals for their performance, and continuous assessment and training of lay users are required. This article reviews the brief history, basic principles, analytical performance, and present status of glucose biosensors in clinical practice. PMID:22399892
Mann, Megan A; Helfrick, John C; Bottomley, Lawrence A
2014-08-19
Theory for cyclic square wave voltammetry of quasireversible electron transfer reactions is presented and experimentally verified. The impact of empirical parameters on the shape of the current-voltage curve is examined. From the trends, diagnostic criteria enabling the use of this waveform as a tool for mechanistic analysis of electrode reaction processes are presented. These criteria were experimentally confirmed using Eu(3+)/Eu(2+), a well-established quasireversible analyte. Using cyclic square wave voltammetry, both the electron transfer coefficient and rate were calculated for this analyte and found to be in excellent agreement with literature. When properly applied, these criteria will enable nonexperts in voltammetry to assign the electrode reaction mechanism and accurately measure electrode reaction kinetics.
Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study
ERIC Educational Resources Information Center
Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek
2013-01-01
Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…
Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek
2016-05-01
This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
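The dominance test at the heart of the Hasse diagram technique can be sketched in a few lines; the procedure names, number of variables, and scores below are invented, with all variables oriented so that larger is better:

```python
# Hasse-style partial-order comparison: procedure a dominates b if it
# is at least as good on every variable and differs on at least one.
procedures = {
    "GC-MS":    (8, 6, 7),   # toy scores on 3 criteria (real study: 11)
    "HPLC-FLD": (7, 8, 5),
    "LC-MS/MS": (9, 8, 8),
}

def dominates(a, b):
    return all(x >= y for x, y in zip(a, b)) and a != b

# Maximal (non-dominated) elements are the top of the Hasse diagram.
maximal = [name for name, v in procedures.items()
           if not any(dominates(w, v) for w in procedures.values())]
print(maximal)  # -> ['LC-MS/MS']
```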
Wong, William CW; Cheung, Catherine SK; Hart, Graham J
2008-01-01
Background Systematic reviews based on the critical appraisal of observational and analytic studies on HIV prevalence and risk factors for HIV transmission among men who have sex with men are very useful for health care decisions and planning. Such appraisal is particularly difficult, however, as the quality assessment tools available for use with observational and analytic studies are poorly established. Methods We reviewed the existing quality assessment tools for systematic reviews of observational studies and developed a concise quality assessment checklist to help standardise decisions regarding the quality of studies, with careful consideration of issues such as external and internal validity. Results A pilot version of the checklist was developed based on epidemiological principles, reviews of study designs, and existing checklists for the assessment of observational studies. The Quality Assessment Tool for Systematic Reviews of Observational Studies (QATSO) Score consists of five items: external validity (1 item), reporting (2 items), bias (1 item) and confounding factors (1 item). Expert opinions were sought and it was tested on manuscripts that fulfilled the inclusion criteria of a systematic review. Like all assessment scales, QATSO may oversimplify and generalise information, yet it is inclusive, simple and practical to use, and allows comparability between papers. Conclusion A specific tool that allows researchers to appraise and guide study quality of observational studies has been developed and can be modified for similar studies in the future. PMID:19014686
Lipidomics from an analytical perspective.
Sandra, Koen; Sandra, Pat
2013-10-01
The global non-targeted analysis of various biomolecules in a variety of sample sources gained momentum in recent years. Defined as the study of the full lipid complement of cells, tissues and organisms, lipidomics is currently evolving out of the shadow of the more established omics sciences including genomics, transcriptomics, proteomics and metabolomics. In analogy to the latter, lipidomics has the potential to impact on biomarker discovery, drug discovery/development and system knowledge, amongst others. The tools developed by lipid researchers in the past, complemented with the enormous advancements made in recent years in mass spectrometry and chromatography, and the implementation of sophisticated (bio)-informatics tools form the basis of current lipidomics technologies. Copyright © 2013 Elsevier Ltd. All rights reserved.
Scalable Visual Analytics of Massive Textual Datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.
2007-04-01
This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
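A toy map-reduce-style counterpart to such parallel text processing, partitioning documents across worker processes and merging partial term counts (the real engine and corpus are far more sophisticated; the documents here are invented):

```python
# Partition documents across workers, count terms locally, then reduce.
from collections import Counter
from multiprocessing import Pool

docs = ["large textual information content",
        "visual analytics tools aid information analysts",
        "parallel implementation enables massive datasets"]

def count_terms(doc):
    # "Map" step: per-document term counts.
    return Counter(doc.split())

if __name__ == "__main__":
    with Pool(processes=2) as pool:
        partial = pool.map(count_terms, docs)
    # "Reduce" step: merge the partial counts.
    total = sum(partial, Counter())
    print(total.most_common(3))
```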
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sources has become more prevalent, ushered in by the plethora of Earth science data generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Llorente-Mirandes, Toni; Rubio, Roser; López-Sánchez, José Fermín
2017-01-01
Here we review recent developments in analytical proposals for the assessment of inorganic arsenic (iAs) content in food products. Interest in the determination of iAs in products for human consumption such as food commodities, wine, and seaweed, among others, is fueled by the wide recognition of its toxic effects on humans, even at low concentrations. Currently, the need for robust and reliable analytical methods is recognized by various international safety and health agencies, and by organizations in charge of establishing acceptable tolerance levels of iAs in food. This review summarizes the state of the art of analytical methods while highlighting tools for quality assessment of the results, such as the production and evaluation of certified reference materials (CRMs) and the availability of specific proficiency testing (PT) programmes. Because the number of studies dedicated to the subject of this review has increased considerably over recent years, the sources consulted and cited here are limited to those from 2010 to the end of 2015.
Visual analytics for aviation safety: A collaborative approach to sensemaking
NASA Astrophysics Data System (ADS)
Wade, Andrew
Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.
Manafian Heris, Jalil; Lakestani, Mehrdad
2014-01-01
We establish exact solutions, including periodic wave and solitary wave solutions, for the integrable sixth-order Drinfeld-Sokolov-Satsuma-Hirota system. We analyze this system using a generalized (G'/G)-expansion method and the generalized tanh-coth method. These methods are developed for finding exact travelling wave solutions of nonlinear partial differential equations. It is shown that these methods, with the help of symbolic computation, provide a straightforward and powerful mathematical tool for solving nonlinear partial differential equations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, C G; Mathews, S
2006-09-07
Current regulatory schemes use generic or industrial-sector-specific benchmarks to evaluate the quality of industrial stormwater discharges. While benchmarks can be a useful tool for facility stormwater managers in evaluating the quality of stormwater runoff, benchmarks typically do not take into account site-specific conditions such as soil chemistry, atmospheric deposition, seasonal changes in water source, and upstream land use. Failing to account for these factors may lead to unnecessary costs to trace a source of natural variation, or to potentially missing a significant local water quality problem. Site-specific water quality thresholds, established upon the statistical evaluation of historic data, take these factors into account; they are a better tool for the direct evaluation of runoff quality, and a more cost-effective trigger to investigate anomalous results. Lawrence Livermore National Laboratory (LLNL), a federal facility, established stormwater monitoring programs to comply with the requirements of the industrial stormwater permit and Department of Energy orders, which require the evaluation of the impact of effluent discharges on the environment. LLNL recognized the need to create a tool to evaluate and manage stormwater quality that would allow analysts to identify trends in stormwater quality and recognize anomalous results so that trace-back and corrective actions could be initiated. LLNL created the site-specific water quality threshold tool to better understand the nature of the stormwater influent and effluent, to establish a technical basis for determining when facility operations might be impacting the quality of stormwater discharges, and to provide "action levels" to initiate follow-up to analytical results. The threshold criteria were based on a statistical analysis of the historic stormwater monitoring data and a review of relevant water quality objectives.
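A simplified sketch of a site-specific threshold of this kind, assuming the criterion is a high percentile of the site's own monitoring history (the 95th percentile and the concentrations are illustrative assumptions, not LLNL's actual statistical basis):

```python
import numpy as np

# Flag a new stormwater result as anomalous when it exceeds a
# percentile-based "action level" derived from the site's history.
historic_cu_ug_per_L = np.array([3.1, 4.8, 2.9, 6.2, 5.0, 3.7, 4.1,
                                 7.3, 2.5, 5.6, 4.4, 3.9])  # invented data
threshold = np.percentile(historic_cu_ug_per_L, 95)

new_result = 8.1
if new_result > threshold:
    print(f"{new_result} exceeds site threshold {threshold:.1f} -> trace-back")
```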
Mattle, Eveline; Weiger, Markus; Schmidig, Daniel; Boesiger, Peter; Fey, Michael
2009-06-01
Hair care for humans is a major world industry with specialised tools, chemicals and techniques. Studying the effect of hair care products has become a considerable field of research, and besides mechanical and optical testing numerous advanced analytical techniques have been employed in this area. In the present work, another means of studying the properties of hair is added by demonstrating the feasibility of magnetic resonance imaging (MRI) of the human hair. Established dedicated nuclear magnetic resonance microscopy hardware (solenoidal radiofrequency microcoils and planar field gradients) and methods (constant time imaging) were adapted to the specific needs of hair MRI. Images were produced at a spatial resolution high enough to resolve the inner structure of the hair, showing contrast between cortex and medulla. Quantitative evaluation of a scan series with different echo times provided a T2* value of 2.6 ms for the cortex and a water content of about 90% for hairs saturated with water. The demonstration of the feasibility of hair MRI potentially adds a new tool to the large variety of analytical methods used nowadays in the development of hair care products.
Quantitative characterization of edge enhancement in phase contrast x-ray imaging.
Monnin, P; Bulling, S; Hoszowska, J; Valley, J F; Meuli, R; Verdun, F R
2004-06-01
The aim of this study was to model the edge enhancement effect in in-line holography phase contrast imaging. A simple analytical approach was used to quantify refraction and interference contrasts in terms of beam energy and imaging geometry. The model was applied to predict the peak intensity and frequency of the edge enhancement for images of cylindrical fibers. The calculations were compared with measurements, and the relationship between the spatial resolution of the detector and the amplitude of the phase contrast signal was investigated. Calculations using the analytical model were in good agreement with experimental results for nylon, aluminum and copper wires of 50 to 240 µm diameter, and with numerical simulations based on Fresnel-Kirchhoff theory. A relationship between the defocusing distance and the pixel size of the image detector was established. This analytical model is a useful tool for optimizing imaging parameters in phase contrast in-line holography, including defocusing distance, detector resolution and beam energy.
Microscale technology and biocatalytic processes: opportunities and challenges for synthesis.
Wohlgemuth, Roland; Plazl, Igor; Žnidaršič-Plazl, Polona; Gernaey, Krist V; Woodley, John M
2015-05-01
Despite the expanding presence of microscale technology in chemical synthesis and energy production as well as in biomedical devices and analytical and diagnostic tools, its potential in biocatalytic processes for pharmaceutical and fine chemicals, as well as related industries, has not yet been fully exploited. The aim of this review is to shed light on the strategic advantages of this promising technology for the development and realization of biocatalytic processes and subsequent product recovery steps, demonstrated with examples from the literature. Constraints, opportunities, and the future outlook for the implementation of these key green engineering methods and the role of supporting tools such as mathematical models to establish sustainable production processes are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.
CIMAROSTI, HELENA; HENLEY, JEREMY M.
2012-01-01
It is well established that brain ischemia can cause neuronal death via different signaling cascades. The relative importance and interrelationships between these pathways, however, remain poorly understood. Here is presented an overview of studies using oxygen-glucose deprivation of organotypic hippocampal slice cultures to investigate the molecular mechanisms involved in ischemia. The culturing techniques, setup of the oxygen-glucose deprivation model, and analytical tools are reviewed. The authors focus on SUMOylation, a posttranslational protein modification that has recently been implicated in ischemia from whole animal studies as an example of how these powerful tools can be applied and could be of interest to investigate the molecular pathways underlying ischemic cell death. PMID:19029060
NASA Astrophysics Data System (ADS)
Sarni, W.
2017-12-01
Water scarcity and poor quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. There is a need by the public and private sectors for vastly improved data management and visualization tools. This is the WetDATA opportunity: to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based 501(c)(3), global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. ROADMAP: a portal (www.wetdata.org) to provide stakeholders with tools and resources to understand related water risks; initial activities providing education, awareness and tools to stakeholders to support the implementation of the Colorado State Water Plan; leveraging the Western States Water Council Water Data Exchange database; and development of visualization, predictive analytics and AI tools to engage stakeholders and provide actionable data and information. TOOLS: Education - information on water issues and risks at the local, state, national and global scale. Visualizations - data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan. Predictive Analytics - accessing publicly available water databases and using machine learning to develop water availability forecasting tools and time-lapse images to support city/urban planning.
Analytical Tools in School Finance Reform.
ERIC Educational Resources Information Center
Johns, R. L.
This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…
Caetano, Fabiana A; Dirk, Brennan S; Tam, Joshua H K; Cavanagh, P Craig; Goiko, Maria; Ferguson, Stephen S G; Pasternak, Stephen H; Dikeakos, Jimmy D; de Bruyn, John R; Heit, Bryan
2015-12-01
Our current understanding of the molecular mechanisms which regulate cellular processes such as vesicular trafficking has been enabled by conventional biochemical and microscopy techniques. However, these methods often obscure the heterogeneity of the cellular environment, thus precluding a quantitative assessment of the molecular interactions regulating these processes. Herein, we present Molecular Interactions in Super Resolution (MIiSR) software which provides quantitative analysis tools for use with super-resolution images. MIiSR combines multiple tools for analyzing intermolecular interactions, molecular clustering and image segmentation. These tools enable quantification, in the native environment of the cell, of molecular interactions and the formation of higher-order molecular complexes. The capabilities and limitations of these analytical tools are demonstrated using both modeled data and examples derived from the vesicular trafficking system, thereby providing an established and validated experimental workflow capable of quantitatively assessing molecular interactions and molecular complex formation within the heterogeneous environment of the cell.
Analysis of Advanced Rotorcraft Configurations
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2000-01-01
Advanced rotorcraft configurations are being investigated with the objectives of identifying vehicles that are larger, quieter, and faster than current-generation rotorcraft. A large rotorcraft, carrying perhaps 150 passengers, could do much to alleviate airport capacity limitations, and a quiet rotorcraft is essential for community acceptance of the benefits of VTOL operations. A fast, long-range, long-endurance rotorcraft, notably the tilt-rotor configuration, will improve rotorcraft economics through productivity increases. A major part of the investigation of advanced rotorcraft configurations consists of conducting comprehensive analyses of vehicle behavior, for the purpose of assessing vehicle potential and feasibility as well as establishing the analytical models required to support vehicle development. The analytical work of FY99 included applications to tilt-rotor aircraft. Tilt Rotor Aeroacoustic Model (TRAM) wind tunnel measurements are being compared with calculations performed using the comprehensive analysis tool CAMRAD II (Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics). The objective is to establish the wing and wake aerodynamic models that are required for tilt-rotor analysis and design. The TRAM test in the German-Dutch Wind Tunnel (DNW) produced extensive measurements. This is the first test to encompass air loads, performance, and structural load measurements on tilt rotors, as well as acoustic and flow visualization data. The correlation of measurements and calculations includes helicopter-mode operation (performance, air loads, and blade structural loads), hover (performance and air loads), and airplane-mode operation (performance).
Bergues Pupo, Ana E; Reyes, Juan Bory; Bergues Cabrales, Luis E; Bergues Cabrales, Jesús M
2011-09-24
Electrotherapy is a relatively well established and efficient method of tumor treatment. In this paper we focus on analytical and numerical calculations of the potential and electric field distributions inside a tumor tissue in a two-dimensional model (2D model) generated by means of electrode arrays with the shapes of different conic sections (ellipse, parabola and hyperbola). Analytical calculations of the potential and electric field distributions based on 2D models for different electrode arrays are performed by solving the Laplace equation, while the numerical solution is obtained by means of the finite element method in two dimensions. Both analytical and numerical solutions reveal significant differences between the electric field distributions generated by electrode arrays with circular shapes and those with the shapes of different conic sections (elliptic, parabolic and hyperbolic). Electrode arrays with circular, elliptical and hyperbolic shapes have the advantage of concentrating the electric field lines in the tumor. The mathematical approach presented in this study provides a useful tool for the design of electrode arrays with different shapes of conic sections by use of the unifying principle. At the same time, we verify the good correspondence between the analytical and numerical solutions for the potential and electric field distributions generated by the electrode arrays with different conic sections.
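A numerical counterpart to these calculations, solving the 2D Laplace equation by Jacobi iteration for two point electrodes in a grounded box (grid size, geometry, and voltages are illustrative only, not the paper's finite element setup):

```python
import numpy as np

# Iterate V <- average of 4 neighbors (discrete Laplace equation),
# holding electrode and boundary nodes at fixed potentials.
n = 61
V = np.zeros((n, n))
fixed = np.zeros_like(V, dtype=bool)
V[30, 15], fixed[30, 15] = +1.0, True    # "electrode" at +1 V
V[30, 45], fixed[30, 45] = -1.0, True    # "electrode" at -1 V
fixed[0, :] = fixed[-1, :] = fixed[:, 0] = fixed[:, -1] = True  # grounded box

for _ in range(5000):
    Vn = 0.25 * (np.roll(V, 1, 0) + np.roll(V, -1, 0)
                 + np.roll(V, 1, 1) + np.roll(V, -1, 1))
    V = np.where(fixed, V, Vn)           # keep fixed nodes unchanged

Ey, Ex = np.gradient(-V)                 # E = -grad(V), per grid axis
print(np.abs(V).max(), np.hypot(Ex, Ey).max())
```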
Duality and symmetry lost in solid mechanics
NASA Astrophysics Data System (ADS)
Bui, Huy Duong
2008-01-01
Some conservation laws in Solids and Fracture Mechanics present a lack of symmetry between kinematic and dynamic variables. It is shown that Duality is the right tool to re-establish the symmetry between equations and variables and to provide conservation laws of the pure divergence type, which yield true path-independent integrals. The loss of symmetry of some energetic expressions is exploited to derive a new method for solving some inverse problems. In particular, the earthquake inverse problem is solved analytically. To cite this article: H.D. Bui, C. R. Mecanique 336 (2008).
Optofluidic two-dimensional grating volume refractive index sensor.
Sarkar, Anirban; Shivakiran Bhaktha, B N; Khastgir, Sugata Pratik
2016-09-10
We present an optofluidic reservoir with a two-dimensional grating for a lab-on-a-chip volume refractive index sensor. The observed diffraction pattern from the device resembles the analytically obtained fringe pattern. The change in the diffraction pattern has been monitored in the far field for fluids with different refractive indices. Reliable measurements of refractive index variations, with an accuracy of 6×10⁻³ refractive index units, for different fluids establish the optofluidic device as a potential on-chip tool for monitoring dynamic refractive index changes.
NASA transmission research and its probable effects on helicopter transmission design
NASA Technical Reports Server (NTRS)
Zaretsky, E. V.; Coy, J. J.; Townsend, D. P.
1983-01-01
Transmissions studied for application to helicopters in addition to the more conventional geared transmissions include hybrid (traction/gear), bearingless planetary, and split torque transmissions. Research is being performed to establish the validity of analysis and computer codes developed to predict the performance, efficiency, life, and reliability of these transmissions. Results of this research should provide the transmission designer with analytical tools to design for minimum weight and noise with maximum life and efficiency. In addition, the advantages and limitations of drive systems as well as the more conventional systems will be defined.
NASA transmission research and its probable effects on helicopter transmission design
NASA Technical Reports Server (NTRS)
Zaretsky, E. V.; Coy, J. J.; Townsend, D. P.
1984-01-01
Transmissions studied for application to helicopters in addition to the more conventional geared transmissions include hybrid (traction/gear), bearingless planetary, and split torque transmissions. Research is being performed to establish the validity of analysis and computer codes developed to predict the performance, efficiency, life, and reliability of these transmissions. Results of this research should provide the transmission designer with analytical tools to design for minimum weight and noise with maximum life and efficiency. In addition, the advantages and limitations of drive systems as well as the more conventional systems will be defined.
21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.
Code of Federal Regulations, 2011 CFR
2011-04-01
... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2011-04-01 2011-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...
21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.
Code of Federal Regulations, 2014 CFR
2014-04-01
... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2014-04-01 2014-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...
21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.
Code of Federal Regulations, 2012 CFR
2012-04-01
... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2012-04-01 2012-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...
21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.
Code of Federal Regulations, 2013 CFR
2013-04-01
... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2013-04-01 2013-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...
A Progressive Approach to Teaching Analytics in the Marketing Curriculum
ERIC Educational Resources Information Center
Liu, Yiyuan; Levin, Michael A.
2018-01-01
With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…
Aarsand, Aasne K; Villanger, Jørild H; Støle, Egil; Deybach, Jean-Charles; Marsden, Joanne; To-Figueras, Jordi; Badminton, Mike; Elder, George H; Sandberg, Sverre
2011-11-01
The porphyrias are a group of rare metabolic disorders whose diagnosis depends on identification of specific patterns of porphyrin precursor and porphyrin accumulation in urine, blood, and feces. Diagnostic tests for porphyria are performed by specialized laboratories in many countries. Data regarding the analytical and diagnostic performance of these laboratories are scarce. We distributed 5 sets of multispecimen samples from different porphyria patients accompanied by clinical case histories to 18-21 European specialist porphyria laboratories/centers as part of a European Porphyria Network organized external analytical and postanalytical quality assessment (EQA) program. The laboratories stated which analyses they would normally have performed given the case histories and reported results of all porphyria-related analyses available, interpretative comments, and diagnoses. Reported diagnostic strategies initially showed considerable diversity, but the number of laboratories applying adequate diagnostic strategies increased during the study period. We found an average interlaboratory CV of 50% (range 12%-152%) for analytes in absolute concentrations. Result normalization by forming ratios to the upper reference limits did not reduce this variation. Sixty-five percent of reported results were within biological variation-based analytical quality specifications. Clinical interpretation of the obtained analytical results was accurate, and most laboratories established the correct diagnosis in all distributions. Based on a case-based EQA scheme, variations were apparent in analytical and diagnostic performance between European specialist porphyria laboratories. Our findings reinforce the use of EQA schemes as an essential tool to assess both analytical and diagnostic processes and thereby to improve patient care in rare diseases.
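The two headline statistics in such a scheme, the interlaboratory CV and its counterpart after normalization to upper reference limits, reduce to a few lines of arithmetic; the values below are invented stand-ins, not EQA data:

```python
import numpy as np

# One analyte's result reported by several laboratories, plus each
# laboratory's own upper reference limit (URL); all values invented.
results = np.array([105., 143., 98., 171., 120., 88., 132.])    # nmol/L
upper_ref = np.array([130., 110., 150., 120., 140., 125., 135.])

def cv_percent(x):
    return 100.0 * x.std(ddof=1) / x.mean()

print(cv_percent(results))              # raw interlaboratory CV
# Forming ratios to the URL need not reduce the spread, matching the
# study's observation that normalization did not reduce variation.
print(cv_percent(results / upper_ref))
```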
2004-01-01
Pancreatitis is recognized as an important cause for morbidity and mortality in cats, but diagnosis remains difficult in many cases. As a first step in trying to identify a better diagnostic tool for feline pancreatitis, the objective of this project was to develop and analytically validate a radioimmunoassay for the measurement of feline pancreatic lipase immunoreactivity (fPLI). Feline pancreatic lipase (fPL) was purified from pancreatic tissue and antiserum against fPL was raised in rabbits. Tracer was produced by iodination of fPL using the chloramine T method. A radioimmunoassay was established and analytically validated by determination of sensitivity, dilutional parallelism, spiking recovery, intra-assay variability, and interassay variability. A control range for fPLI in cat serum was established from 30 healthy cats using the central 95th percentile. The sensitivity of the assay was 1.2 μg/L. Observed to expected ratios for serial dilutions ranged from 98.8% to 164.3% for 3 different serum samples. Observed to expected ratios for spiking recovery ranged from 76.9% to 147.6% for 3 different serum samples. Coefficients of variation for intra- and interassay variability for 4 different serum samples were 10.1%, 4.5%, 2.2%, and 3.9% and 24.4%, 15.8%, 16.6%, and 21.3%, respectively. A reference range for fPLI was established as 1.2 to 3.8 μg/L. We conclude that the assay described is sensitive, accurate, and precise, with limited linearity in the lower and limited reproducibility in the lower and higher ends of the working range. Further studies to evaluate the clinical usefulness of this assay are needed and in progress. PMID:15581227
Wang, Pei; Zhang, Hui; Yang, Hailong; Nie, Lei; Zang, Hengchang
2015-02-25
Near-infrared (NIR) spectroscopy has developed into an indispensable tool for both academic research and industrial quality control across a wide field of applications. In this work, the feasibility of NIR spectroscopy for monitoring the concentrations of puerarin, daidzin, daidzein, and total isoflavonoid (TIF) during the extraction process of kudzu (Pueraria lobata) was verified. NIR spectra were collected in transmission mode and pretreated with smoothing and derivative transformations. Partial least squares regression (PLSR) was used to establish calibration models. Three variable selection methods, the correlation coefficient method, interval partial least squares (iPLS), and the successive projections algorithm (SPA), were performed and compared with models based on all of the variables. The results showed that the approach was efficient and environmentally friendly for rapid determination of the four quality indices (QIs) in the kudzu extraction process. The method established here may have the potential to be used as a process analytical technology (PAT) tool in the future. Copyright © 2014 Elsevier B.V. All rights reserved.
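As a rough illustration of that calibration workflow, the sketch below applies a correlation-coefficient variable selection filter (one of the three methods compared) before a PLSR model, all on simulated spectra; the threshold, component count, and data are assumptions.

```python
# PLSR calibration with correlation-coefficient variable selection (simulated data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
t = rng.normal(size=(60, 1))                    # latent analyte variation
X = rng.normal(size=(60, 200))                  # 60 simulated NIR spectra x 200 wavelengths
X[:, 40:60] += t                                # an absorbing band tied to the analyte
y = t.ravel() + rng.normal(scale=0.2, size=60)  # reference values, e.g. puerarin content

# Keep only wavelengths well correlated with the reference concentration
r = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
keep = np.abs(r) > 0.5

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X[:, keep], y, cv=10).ravel()
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"{keep.sum()} variables kept, RMSECV = {rmsecv:.3f}")
```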
Experimental and analytical tools for evaluation of Stirling engine rod seal behavior
NASA Technical Reports Server (NTRS)
Krauter, A. I.; Cheng, H. S.
1979-01-01
The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.
Analytics for Cyber Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plantenga, Todd.; Kolda, Tamara Gibson
2011-06-01
This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.
Solar Data and Tools: Resources for Researchers, Industry, and Developers
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-04-01
In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.
Mining Mathematics in Textbook Lessons
ERIC Educational Resources Information Center
Ronda, Erlina; Adler, Jill
2017-01-01
In this paper, we propose an analytic tool for describing the mathematics made available to learn in a "textbook lesson". The tool is an adaptation of the Mathematics Discourse in Instruction (MDI) analytic tool that we developed to analyze what is made available to learn in teachers' lessons. Our motivation to adapt the use of the MDI…
Fire behavior modeling-a decision tool
Jack Cohen; Bill Bradshaw
1986-01-01
The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...
Guidance for the Design and Adoption of Analytic Tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bandlow, Alisa
2015-12-01
The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.
Boussès, Christine; Ferey, Ludivine; Vedrines, Elodie; Gaudin, Karen
2015-11-10
An innovative combination of green chemistry and the quality-by-design (QbD) approach is presented through the development of a UHPLC method for the analysis of the main degradation products of dextromethorphan hydrobromide. The QbD strategy was integrated into the field of green analytical chemistry to improve method understanding while assuring quality and minimizing environmental impact and analyst exposure. The analytical method was thoroughly evaluated by applying risk assessment and multivariate analysis tools. After a scouting phase aimed at selecting a suitable stationary phase and an organic solvent in accordance with green chemistry principles, quality risk assessment tools were applied to determine the critical process parameters (CPPs). The effects of the CPPs on the critical quality attributes (CQAs), i.e., resolutions, efficiencies, and solvent consumption, were further evaluated by means of a screening design. A response surface methodology was then carried out to model the CQAs as functions of the selected CPPs, and the optimal separation conditions were determined through a desirability analysis. The resulting contour plots made it possible to establish the design space (DS) (the method operable design region) where all CQAs fulfilled the requirements. An experimental validation of the DS proved that quality within the DS was guaranteed; therefore, no further robustness study was required before validation. Finally, this UHPLC method was validated using the concept of total error and was used to analyze a pharmaceutical drug product. Copyright © 2015 Elsevier B.V. All rights reserved.
Narang, Ajit S; Sheverev, Valery; Freeman, Tim; Both, Douglas; Stepaniuk, Vadim; Delancy, Michael; Millington-Smith, Doug; Macias, Kevin; Subramanian, Ganeshkumar
2016-01-01
A drag flow force (DFF) sensor, which measures the force exerted by the wet mass in a granulator on a thin cylindrical probe, was shown to be a promising process analytical technology (PAT) for real-time, in-line, high-resolution monitoring of wet mass consistency during high-shear wet granulation. Our previous studies indicated that this PAT tool could be correlated to a granulation end point established independently through drug product critical quality attributes. In this study, the measurements of flow force by a DFF sensor, taken during wet granulation of three placebo formulations with different binder content, are compared with concurrent at-line FT4 Powder Rheometer characterization of wet granules collected at different time points of the process. The wet mass consistency measured by the DFF sensor correlated well with the granulation's resistance to flow and interparticulate interactions as measured by the FT4 Powder Rheometer. This indicates that the force pulse magnitude measured by the DFF sensor reflects fundamental material properties (e.g., shear viscosity and granule size/density) as they change during the granulation process. These studies indicate that the DFF sensor can be a valuable tool for wet granulation formulation and process development and scale-up, as well as for routine monitoring and control during manufacturing. Copyright © 2016. Published by Elsevier Inc.
SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications
Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.
2018-01-01
The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069
Phosphorescent nanosensors for in vivo tracking of histamine levels.
Cash, Kevin J; Clark, Heather A
2013-07-02
Continuously tracking bioanalytes in vivo will enable clinicians and researchers to profile normal physiology and monitor diseased states. Current in vivo monitoring system designs are limited by invasive implantation procedures and biofouling, limiting the utility of these tools for obtaining physiologic data. In this work, we demonstrate the first success in optically tracking histamine levels in vivo using a modular, injectable sensing platform based on diamine oxidase and a phosphorescent oxygen nanosensor. Our new approach increases the range of measurable analytes by combining an enzymatic recognition element with a reversible nanosensor capable of measuring the effects of enzymatic activity. We use these enzyme nanosensors (EnzNS) to monitor the in vivo histamine dynamics as the concentration rapidly increases and decreases due to administration and clearance. The EnzNS system measured kinetics that match those reported from ex vivo measurements. This work establishes a modular approach to in vivo nanosensor design for measuring a broad range of potential target analytes. Simply replacing the recognition enzyme, or both the enzyme and nanosensor, can produce a new sensor system capable of measuring a wide range of specific analytical targets in vivo.
Pure-rotational spectrometry: a vintage analytical method applied to modern breath analysis.
Hrubesh, Lawrence W; Droege, Michael W
2013-09-01
Pure-rotational spectrometry (PRS) is an established method, typically used to study structures and properties of polar gas-phase molecules, including isotopic and isomeric varieties. PRS has also been used as an analytical tool where it is particularly well suited for detecting or monitoring low-molecular-weight species that are found in exhaled breath. PRS is principally notable for its ultra-high spectral resolution which leads to exceptional specificity to identify molecular compounds in complex mixtures. Recent developments using carbon aerogel for pre-concentrating polar molecules from air samples have extended the sensitivity of PRS into the part-per-billion range. In this paper we describe the principles of PRS and show how it may be configured in several different modes for breath analysis. We discuss the pre-concentration concept and demonstrate its use with the PRS analyzer for alcohols and ammonia sampled directly from the breath.
13C-based metabolic flux analysis: fundamentals and practice.
Yang, Tae Hoon
2013-01-01
Isotope-based metabolic flux analysis is one of the emerging technologies applied to system-level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, 13C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement 13C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing 13C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of 13C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at 13C-based metabolic flux analysis in vivo.
A new X-ray fluorescence spectroscopy for extraterrestrial materials using a muon beam
Terada, K.; Ninomiya, K.; Osawa, T.; Tachibana, S.; Miyake, Y.; Kubo, M. K.; Kawamura, N.; Higemoto, W.; Tsuchiyama, A.; Ebihara, M.; Uesugi, M.
2014-01-01
The recent development of the intense pulsed muon source at J-PARC MUSE, Japan Proton Accelerator Research Complex/MUon Science Establishment (10⁶ s⁻¹ for a momentum of 60 MeV/c), enabled us to pioneer a new frontier in analytical sciences. Here, we report a non-destructive elemental analysis using µ⁻ capture. Controlling the muon momentum from 32.5 to 57.5 MeV/c, we successfully demonstrate a depth-profile analysis of light elements (B, C, N, and O) from several mm-thick layered materials and non-destructive bulk analyses of meteorites containing organic materials. Muon beam analysis, enabling a bulk analysis of light to heavy elements without severe radioactivation, is a unique analytical method complementary to other non-destructive analyses. Furthermore, this technology can be used as a powerful tool to identify the content and distribution of organic components in future asteroidal return samples. PMID:24861282
XPS Protocol for the Characterization of Pristine and Functionalized Single Wall Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Sosa, E. D.; Allada, R.; Huffman, C. B.; Arepalli, S.
2009-01-01
Recent interest in developing new applications for carbon nanotubes (CNT) has fueled the need to use accurate macroscopic and nanoscopic techniques to characterize and understand their chemistry. X-ray photoelectron spectroscopy (XPS) has proved to be a useful analytical tool for nanoscale surface characterization of materials including carbon nanotubes. Recent nanotechnology research at NASA Johnson Space Center (NASA-JSC) helped to establish a characterization protocol for quality assessment for single wall carbon nanotubes (SWCNTs). Here, a review of some of the major factors of the XPS technique that can influence the quality of analytical data, suggestions for methods to maximize the quality of data obtained by XPS, and the development of a protocol for XPS characterization as a complementary technique for analyzing the purity and surface characteristics of SWCNTs is presented. The XPS protocol is then applied to a number of experiments including impurity analysis and the study of chemical modifications for SWCNTs.
SAM Radiochemical Methods Query
Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.
2011-01-01
Background: Electrotherapy is a relatively well-established and efficient method of tumor treatment. In this paper we focus on analytical and numerical calculations of the potential and electric field distributions inside tumor tissue in a two-dimensional model (2D model) generated by means of electrode arrays shaped as different conic sections (ellipse, parabola, and hyperbola). Methods: Analytical calculations of the potential and electric field distributions based on 2D models for different electrode arrays are performed by solving the Laplace equation, while the numerical solution is obtained by means of the finite element method in two dimensions. Results: Both analytical and numerical solutions reveal significant differences between the electric field distributions generated by circular electrode arrays and by arrays shaped as different conic sections (elliptical, parabolic, and hyperbolic). Electrode arrays with circular, elliptical, and hyperbolic shapes have the advantage of concentrating the electric field lines in the tumor. Conclusion: The mathematical approach presented in this study provides a useful tool for the design of electrode arrays with different conic-section shapes by means of the unifying principle. At the same time, we verify the good correspondence between the analytical and numerical solutions for the potential and electric field distributions generated by electrode arrays with different conic sections. PMID:21943385
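The numerical half of such a model can be miniaturized to a few lines. The sketch below relaxes the Laplace equation on a square grid with an assumed elliptical array of alternating-polarity point electrodes; Jacobi iteration stands in for the paper's finite element method, and the grounded boundary and geometry are illustrative choices only.

```python
# Toy 2D Laplace solve for an elliptical electrode array (Jacobi relaxation).
import numpy as np

n = 101
V = np.zeros((n, n))                      # potential on [-1,1]^2, boundary held at 0 V (assumed)

# Eight point electrodes on an ellipse, alternating +1 V / -1 V
theta = np.linspace(0, 2*np.pi, 8, endpoint=False)
ex, ey = 0.6*np.cos(theta), 0.4*np.sin(theta)
ii = np.round((ey + 1) / 2 * (n - 1)).astype(int)
jj = np.round((ex + 1) / 2 * (n - 1)).astype(int)
volts = np.where(np.arange(8) % 2 == 0, 1.0, -1.0)

for _ in range(2000):                     # Jacobi sweeps
    V[1:-1, 1:-1] = 0.25 * (V[:-2, 1:-1] + V[2:, 1:-1] + V[1:-1, :-2] + V[1:-1, 2:])
    V[ii, jj] = volts                     # re-impose the electrode potentials (Dirichlet)

Ey, Ex = np.gradient(-V)                  # E = -grad(V), unit grid spacing
print("max |E| on grid:", np.hypot(Ex, Ey).max())
```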
Charmaz, Kathy
2015-12-01
This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.
Code of Federal Regulations, 2010 CFR
2010-07-01
Section 102-80.120: What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life Safety), should be used to support the life safety equivalency evaluation.
Code of Federal Regulations, 2011 CFR
2011-01-01
Section 102-80.120: What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life Safety), should be used to support the life safety equivalency evaluation.
EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.
A graph algebra for scalable visual analytics.
Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V
2012-01-01
Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
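To make the algebra concrete, here is a minimal sketch of the two atomic operators the abstract names, selection and aggregation, over a toy property graph; the data model is our simplification, not the authors' framework.

```python
# Selection and aggregation as graph-algebra operators on a tiny property graph.
nodes = {1: {"type": "host"}, 2: {"type": "host"}, 3: {"type": "user"}, 4: {"type": "user"}}
edges = [(3, 1), (4, 1), (4, 2)]          # e.g. user -> host logins

def select(nodes, edges, pred):
    """Keep the nodes satisfying pred, and only the edges between kept nodes."""
    kept = {k: v for k, v in nodes.items() if pred(v)}
    return kept, [(u, w) for u, w in edges if u in kept and w in kept]

def aggregate(nodes, edges, key):
    """Merge nodes sharing key(attrs) into super-nodes; collapse parallel edges."""
    group = {k: key(v) for k, v in nodes.items()}
    super_edges = {(group[u], group[w]) for u, w in edges}
    return sorted(set(group.values())), sorted(super_edges)

print(select(nodes, edges, lambda a: a["type"] == "user"))
print(aggregate(nodes, edges, key=lambda a: a["type"]))   # ('user', 'host') super-edge
```

Aggregation is what gives such an algebra its scalability story: a large graph is reduced to a small super-graph before it is drawn.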
Failure Assessment of Brazed Structures
NASA Technical Reports Server (NTRS)
Flom, Yuri
2012-01-01
Despite the great advances in analytical methods available to structural engineers, designers of brazed structures have great difficulty in addressing fundamental questions related to the load-carrying capabilities of brazed assemblies. In this chapter we will review why such common engineering tools as Finite Element Analysis (FEA), as well as many well-established theories (Tresca, von Mises, highest principal stress, etc.), don't work well for brazed joints. This chapter will show how the classic approach of using interaction equations and the less known Coulomb-Mohr failure criterion can be employed to estimate Margins of Safety (MS) in brazed joints.
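As a hedged illustration of the interaction-equation approach, the snippet below computes a margin of safety from a linear axial/shear interaction ratio; both the interaction form and the allowable stresses are assumptions for the example, not values from the chapter.

```python
# Margin of safety from a linear interaction equation (illustrative form and values).
def margin_of_safety(sigma, tau, sigma_allow, tau_allow):
    """MS = 1/R - 1, with interaction ratio R = sigma/sigma_allow + tau/tau_allow."""
    R = sigma / sigma_allow + tau / tau_allow
    return 1.0 / R - 1.0

# A braze joint carrying 40 MPa normal and 25 MPa shear stress against
# hypothetical allowables of 90 MPa and 60 MPa:
ms = margin_of_safety(40.0, 25.0, 90.0, 60.0)
print(f"MS = {ms:+.2f} ({'adequate' if ms > 0 else 'inadequate'})")
```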
Current Research in Aircraft Tire Design and Performance
NASA Technical Reports Server (NTRS)
Tanner, J. A.; Mccarthy, J. L.; Clark, S. K.
1981-01-01
A review of the tire research programs which address the various needs identified by landing gear designers and airplane users is presented. The experimental programs are designed to increase tire tread lifetimes, relate static and dynamic tire properties, establish the tire hydroplaning spin up speed, study gear response to tire failures, and define tire temperature profiles during taxi, braking, and cornering operations. The analytical programs are aimed at providing insights into the mechanisms of heat generation in rolling tires and developing the tools necessary to streamline the tire design process and to aid in the analysis of landing gear problems.
Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham
2007-01-01
This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of the samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable optimized pre-analytical phase. The aim of the comparison was: (a) to identify those medical practices whose mean/median sample values significantly deviate from those of the control situation in the hospital laboratory due to the possible problems in the pre-analytical phase; (b) to aid these laboratories in the process of rectifying these problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above mentioned problems. It has been tested on serum potassium which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.
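The core of such a pre-analytical survey is compact enough to sketch. Below, each sender's median potassium is compared with the hospital laboratory's median and flagged beyond an allowable deviation; the data and the 5% limit are invented for the example.

```python
# PAS-style comparison of sender medians against an on-site reference (invented data).
import pandas as pd

df = pd.DataFrame({
    "sender":    ["hospital"]*4 + ["practice_A"]*4 + ["practice_B"]*4,
    "potassium": [4.1, 4.0, 4.2, 4.1,  4.2, 4.1, 4.3, 4.0,  4.9, 5.1, 4.8, 5.0],  # mmol/L
})

ref = df.loc[df.sender == "hospital", "potassium"].median()
deviation_pct = 100 * (df.groupby("sender")["potassium"].median() - ref) / ref

# Senders deviating by more than 5% hint at pre-analytical problems,
# e.g. delayed centrifugation raising potassium.
print(deviation_pct[deviation_pct.abs() > 5.0])
```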
EPA’s Environmental Sampling and Analytical Methods (ESAM) is a website tool that supports the entire environmental characterization process from collection of samples all the way to their analyses.
EMERGING POLLUTANTS, MASS SPECTROMETRY, AND ...
Historically fundamental to amassing our understanding of environmental processes and chemical pollution is the realm of mass spectrometry (MS) - the mainstay of analytical chemistry - the workhorse that supplies definitive data that environmental scientists and engineers rely upon for identifying molecular compositions (and ultimately structures) of chemicals. While the power of MS has long been visible to the practicing environmental chemist, it borders on obscurity to the lay public and many scientists. While MS has played a long, historic (and largely invisible) role in establishing our knowledge of environmental processes and pollution, what recognition it does enjoy is usually relegated to that of a tool. It is usually the relevance or significance of the knowledge acquired from the application of the tool that has ultimate meaning to the public and science at large - not how the data were acquired.
Logic flowgraph methodology - A tool for modeling embedded systems
NASA Technical Reports Server (NTRS)
Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.
1991-01-01
The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.
Accuracy of selected techniques for estimating ice-affected streamflow
Walker, John F.
1991-01-01
This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories - subjective and analytical - depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques are used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.
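The consistency check reported above is a standard nonparametric comparison; on simulated daily error series for three hydrographers it looks like this (the data are not the study's).

```python
# Kruskal-Wallis comparison of three hydrographers' daily estimation errors (simulated).
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(42)
errors_h1 = rng.normal(0.0, 1.0, 40)   # hydrographer 1
errors_h2 = rng.normal(0.5, 1.0, 40)   # hydrographer 2, biased slightly high
errors_h3 = rng.normal(0.0, 2.0, 40)   # hydrographer 3, more variable

H, p = kruskal(errors_h1, errors_h2, errors_h3)
print(f"H = {H:.2f}, p = {p:.3f}")     # a small p-value means the hydrographers differ
```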
Combined sensing platform for advanced diagnostics in exhaled mouse breath
NASA Astrophysics Data System (ADS)
Fortes, Paula R.; Wilk, Andreas; Seichter, Felicia; Cajlakovic, Merima; Koestler, Stefan; Ribitsch, Volker; Wachter, Ulrich; Vogt, Josef; Radermacher, Peter; Carter, Chance; Raimundo, Ivo M.; Mizaikoff, Boris
2013-03-01
Breath analysis is an attractive non-invasive strategy for early disease recognition or diagnosis, and for monitoring therapeutic progression, as quantitative compositional analysis of breath can be related to biomarker panels associated with specific physiological conditions invoked by, e.g., pulmonary diseases, lung cancer, breast cancer, and others. As exhaled breath contains comprehensive information on, e.g., the metabolic state, and since in particular volatile organic constituents (VOCs) in exhaled breath may be indicative of certain disease states, analytical techniques for advanced breath diagnostics should be capable of sufficient molecular discrimination and quantification of constituents at ppm-ppb - or even lower - concentration levels. While individual analytical techniques such as mid-infrared spectroscopy may provide access to a range of relevant molecules, some IR-inactive constituents require the combination of IR sensing schemes with orthogonal analytical tools for extended molecular coverage. Combining mid-infrared hollow waveguides (HWGs) with luminescence sensors (LS) appears particularly attractive, as these complementary analytical techniques allow simultaneous analysis of total CO2 (via luminescence), the 12CO2/13CO2 tracer-to-tracee (TTR) ratio (via IR), selected VOCs (via IR), and O2 (via luminescence) in exhaled breath, thus establishing a single diagnostic platform in which both sensors simultaneously interact with the same breath sample volume. In the present study, we take advantage of a particularly compact (shoebox-size) FTIR spectrometer combined with a novel substrate-integrated hollow waveguide (iHWG) recently developed by our research team, and miniaturized fiberoptic luminescence sensors, to establish a multi-constituent breath analysis tool that is ideally compatible with mouse intensive care stations (MICU). Given the low tidal volume and flow of exhaled mouse breath, the TTR is usually determined after sample collection via gas chromatography coupled to mass spectrometric detection. Here, we aim at continuously analyzing the TTR via iHWGs and LS flow-through sensors requiring only minute (< 1 mL) sample volumes. Furthermore, this study explores non-linearities observed in the calibration functions of 12CO2 and 13CO2, potentially resulting from effects related to optical collision diameters, e.g., in the presence of molecular oxygen. It is anticipated that the simultaneous continuous analysis of oxygen via LS will facilitate the correction of these effects after inclusion within appropriate multivariate calibration models, thus providing more reliable and robust calibration schemes for continuously monitoring relevant breath constituents.
Total Quality Management (TQM), an Overview
1991-09-01
This report provides an overview of Total Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing and measuring quality efforts.
Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth
NASA Technical Reports Server (NTRS)
Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)
2001-01-01
The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Analytical tools are needed specifically for elastic-plastic fracture analysis, a regime that is currently handled empirically for the Space Shuttle External Tank (ET) through simulated service testing of pre-cracked panels.
On Nonlinear Combustion Instability in Liquid Propellant Rocket Motors
NASA Technical Reports Server (NTRS)
Sims, J. D. (Technical Monitor); Flandro, Gary A.; Majdalani, Joseph; Sims, Joseph D.
2004-01-01
All liquid propellant rocket instability calculations in current use have limited value in the predictive sense and serve mainly as a correlating framework for the available data sets. The well-known n-τ model first introduced by Crocco and Cheng in 1956 is still used as the primary analytical tool of this type. A multitude of attempts to establish practical analytical methods have achieved only limited success. These methods usually produce only stability boundary maps that are of little use in making critical design decisions in new motor development programs. Recent progress in understanding the mechanisms of combustion instability in solid propellant rockets provides a firm foundation for a new approach to prediction, diagnosis, and correction of the closely related problems in liquid motor instability. For predictive tools to be useful in the motor design process, they must have the capability to accurately determine: 1) the time evolution of the pressure oscillations and limit amplitude, 2) the critical triggering pulse amplitude, and 3) unsteady heat transfer rates at injector surfaces and chamber walls. The method described in this paper relates these critical motor characteristics directly to system design parameters. Inclusion of mechanisms such as wave steepening, vorticity production and transport, and unsteady detonation wave phenomena greatly enhances the representation of key features of motor chamber oscillatory behavior. The basic theoretical model is described and preliminary computations are compared to experimental data. A plan to develop the new predictive method into a comprehensive analysis tool is also described.
Aeroelastic Optimization Study Based on the X-56A Model
NASA Technical Reports Server (NTRS)
Li, Wesley W.; Pak, Chan-Gi
2014-01-01
One way to increase the aircraft fuel efficiency is to reduce structural weight while maintaining adequate structural airworthiness, both statically and aeroelastically. A design process which incorporates the object-oriented multidisciplinary design, analysis, and optimization (MDAO) tool and the aeroelastic effects of high fidelity finite element models to characterize the design space was successfully developed and established. This paper presents two multidisciplinary design optimization studies using an object-oriented MDAO tool developed at NASA Armstrong Flight Research Center. The first study demonstrates the use of aeroelastic tailoring concepts to minimize the structural weight while meeting the design requirements including strength, buckling, and flutter. Such an approach exploits the anisotropic capabilities of the fiber composite materials chosen for this analytical exercise with ply stacking sequence. A hybrid and discretization optimization approach improves accuracy and computational efficiency of a global optimization algorithm. The second study presents a flutter mass balancing optimization study for the fabricated flexible wing of the X-56A model since a desired flutter speed band is required for the active flutter suppression demonstration during flight testing. The results of the second study provide guidance to modify the wing design and move the design flutter speeds back into the flight envelope so that the original objective of X-56A flight test can be accomplished successfully. The second case also demonstrates that the object-oriented MDAO tool can handle multiple analytical configurations in a single optimization run.
NASA Technical Reports Server (NTRS)
Pandya, Abhilash; Maida, James; Hasson, Scott; Greenisen, Michael; Woolford, Barbara
1993-01-01
As manned exploration of space continues, analytical evaluation of human strength characteristics is critical. These extraterrestrial environments will spawn issues of human performance which will impact the designs of tools, work spaces, and space vehicles. Computer modeling is an effective method of correlating human biomechanical and anthropometric data with models of space structures and human work spaces. The aim of this study is to provide biomechanical data from isolated joints to be utilized in a computer modeling system for calculating torque resulting from any upper extremity motions: in this study, the ratchet wrench push-pull operation (a typical extravehicular activity task). Established here are mathematical relationships used to calculate maximum torque production of isolated upper extremity joints. These relationships are a function of joint angle and joint velocity.
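A relationship of the kind described, maximum torque as a function of joint angle and velocity, can be fitted by ordinary least squares. The sketch below assumes a quadratic-in-angle, linear-in-velocity surface; the functional form and the data are both illustrative assumptions, not the study's relationships.

```python
# Least-squares fit of torque T(theta, omega) on synthetic measurements.
import numpy as np

rng = np.random.default_rng(7)
theta = rng.uniform(0.0, 2.0, 50)      # joint angle (rad)
omega = rng.uniform(-3.0, 3.0, 50)     # joint angular velocity (rad/s)
torque = 60 - 8*(theta - 1.0)**2 - 5*omega + rng.normal(0, 2, 50)  # synthetic data (N*m)

# Design matrix for T ~ a + b*theta + c*theta^2 + d*omega
A = np.column_stack([np.ones_like(theta), theta, theta**2, omega])
coef, *_ = np.linalg.lstsq(A, torque, rcond=None)

predict = lambda th, om: coef @ np.array([1.0, th, th**2, om])
print(f"predicted max torque at 1 rad, 1 rad/s: {predict(1.0, 1.0):.1f} N*m")
```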
Analytical framework and tool kit for SEA follow-up
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran
2009-04-15
Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillen, David S.
Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data, and complements data mining technologies where known patterns can be mined for. Also, with a human in the loop, such tools can bring in domain knowledge and subject matter expertise. Visual analytics has not widely been applied to this domain. In this paper, we will focus on one type of data: structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose to use the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we will use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we will show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We will also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.
Pilot testing of SHRP 2 reliability data and analytical products: Minnesota.
DOT National Transportation Integrated Search
2015-01-01
The Minnesota pilot site has undertaken an effort to test data and analytical tools developed through the Strategic Highway Research Program (SHRP) 2 Reliability focus area. The purpose of these tools is to facilitate the improvement of travel time r...
Gil, F; Hernández, A F
2015-06-01
Human biomonitoring has become an important tool for the assessment of internal doses of metallic and metalloid elements. These elements are of great significance because of their toxic properties and wide distribution in environmental compartments. Although blood and urine are the most used and accepted matrices for human biomonitoring, other non-conventional samples (saliva, placenta, meconium, hair, nails, teeth, breast milk) may have practical advantages and would provide additional information on health risk. Nevertheless, the analysis of these elements in biological matrices other than blood and urine has not yet been accepted as a useful tool for biomonitoring. The validation of analytical procedures is absolutely necessary for a proper implementation of non-conventional samples in biomonitoring programs. However, the lack of reliable and useful analytical methodologies to assess exposure to metallic elements, and the potential interference of external contamination and variation in biological features of non-conventional samples, are important limitations for setting health-based reference values. The influence of potential confounding factors on metallic element concentrations should always be considered. More research is needed to ascertain whether or not non-conventional matrices offer definitive advantages over the traditional samples and to broaden the available database for establishing worldwide accepted reference values in non-exposed populations. Copyright © 2015 Elsevier Ltd. All rights reserved.
Ferreira, Ana P; Tobyn, Mike
2015-01-01
In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of information contained in large, complex data sets, thus contributing to increased product and process understanding, which is at the core of the Food and Drug Administration's Process Analytical Technology (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review is aimed at providing pharmaceutical industry professionals with an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used: principal component analysis and partial least squares regression, their advantages, common pitfalls, and the requirements for their effective use. That is followed by an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to definition of the design space and control strategy, and from formulation optimization during development to the application of quality-by-design principles to improve manufacture of existing commercial products.
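To give a flavor of the first of those two methods, the sketch below runs a two-component PCA on simulated batch data and shows drifted batches separating in score space; it is a generic example, not taken from the review.

```python
# PCA on simulated batch data: drifted batches separate on the first component.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
good  = rng.normal(0.0, 1.0, size=(20, 150))         # 20 in-spec batches x 150 variables
drift = rng.normal(0.0, 1.0, size=(5, 150)) + 2.0    # 5 batches with a process drift
X = np.vstack([good, drift])

pca = PCA(n_components=2)
scores = pca.fit_transform(X)
print("explained variance ratio:", pca.explained_variance_ratio_)
print("PC1 scores of drifted batches:", scores[20:, 0])  # clearly offset from the rest
```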
"EMERGING" POLLUTANTS, MASS SPECTROMETRY, AND ...
A foundation for Environmental Science - Mass Spectrometry: Historically fundamental to amassing our understanding of environmental processes and chemical pollution is the realm of mass spectrometry - the mainstay of analytical chemistry - the workhorse that supplies much of the definitive data that environmental scientists rely upon for identifying the molecular compositions (and ultimately the structures) of chemicals. This is not to ignore the complementary, critical roles played by the adjunct practices of sample enrichment (via any of various means of selective extraction) and analyte separation (via the myriad forms of chromatography and electrophoresis). While the power of mass spectrometry has long been highly visible to the practicing environmental chemist, it borders on continued obscurity to the lay public and most non-chemists. Even though mass spectrometry has played a long, historic (and largely invisible) role in establishing or undergirding our existing knowledge about environmental processes and pollution, what recognition it does enjoy is usually relegated to that of a tool. It is usually the relevance or significance of the knowledge acquired from the application of the tool that has ultimate meaning to the public and science at large - not how the knowledge was acquired. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD.
Perera, Piyumali K.; Gasser, Robin B.; Firestone, Simon M.; Smith, Lee; Roeber, Florian
2014-01-01
Oriental theileriosis is an emerging, tick-borne disease of bovines in the Asia-Pacific region and is caused by one or more genotypes of the Theileria orientalis complex. This study aimed to establish and validate a multiplexed tandem PCR (MT-PCR) assay using three distinct markers (major piroplasm surface protein, 23-kDa piroplasm membrane protein, and the first internal transcribed spacer of nuclear DNA), for the simultaneous detection and semiquantification of four genotypes (Buffeli, Chitose, Ikeda, and type 5) of the T. orientalis complex. Analytical specificity, analytical sensitivity, and repeatability of the established MT-PCR assay were assessed in a series of experiments. Subsequently, the assay was evaluated using 200 genomic DNA samples collected from cattle from farms on which oriental theileriosis outbreaks had occurred, and 110 samples from a region where no outbreaks had been reported. The results showed the MT-PCR assay specifically and reproducibly detected the expected genotypes (i.e., genotypes Buffeli, Chitose, Ikeda, and type 5) of the T. orientalis complex, reliably differentiated them, and was able to detect as little as 1 fg of genomic DNA from each genotype. The diagnostic specificity and sensitivity of the MT-PCR were estimated at 94.0% and 98.8%, respectively. The MT-PCR assay established here is a practical and effective diagnostic tool for the four main genotypes of T. orientalis complex in Australia and should assist studies of the epidemiology and pathophysiology of oriental theileriosis in the Asia-Pacific region. PMID:25339402
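Diagnostic sensitivity and specificity of the kind quoted above come straight from a 2x2 table against the reference diagnosis. The arithmetic is shown below with invented counts chosen to land near the reported figures.

```python
# Diagnostic performance from a 2x2 table (counts are invented for illustration).
def diagnostic_performance(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # true positives correctly detected
    specificity = tn / (tn + fp)   # true negatives correctly excluded
    return sensitivity, specificity

sens, spec = diagnostic_performance(tp=170, fp=6, fn=2, tn=94)
print(f"sensitivity = {100*sens:.1f}%, specificity = {100*spec:.1f}%")
```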
Laboratories measuring target chemical, radiochemical, pathogens, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation
FPI: FM Success through Analytics
ERIC Educational Resources Information Center
Hickling, Duane
2013-01-01
The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…
Experiments with Analytic Centers: A confluence of data, tools and help in using them.
NASA Astrophysics Data System (ADS)
Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.
2017-12-01
Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged that indicates different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of that particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this poster will summarize the results and indicate a direction for future infusion attempts.
Waaijer, Cathelijn J F; Palmblad, Magnus
2015-01-01
In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.
Biokinetics of Nanomaterials: the Role of Biopersistence.
Laux, Peter; Riebeling, Christian; Booth, Andy M; Brain, Joseph D; Brunner, Josephine; Cerrillo, Cristina; Creutzenberg, Otto; Estrela-Lopis, Irina; Gebel, Thomas; Johanson, Gunnar; Jungnickel, Harald; Kock, Heiko; Tentschert, Jutta; Tlili, Ahmed; Schäffer, Andreas; Sips, Adriënne J A M; Yokel, Robert A; Luch, Andreas
2017-04-01
Nanotechnology risk management strategies and environmental regulations continue to rely on hazard and exposure assessment protocols developed for bulk materials, including larger size particles, while commercial application of nanomaterials (NMs) increases. In order to support and corroborate risk assessment of NMs for workers, consumers, and the environment it is crucial to establish the impact of biopersistence of NMs at realistic doses. In the future, such data will allow a more refined future categorization of NMs. Despite many experiments on NM characterization and numerous in vitro and in vivo studies, several questions remain unanswered including the influence of biopersistence on the toxicity of NMs. It is unclear which criteria to apply to characterize a NM as biopersistent. Detection and quantification of NMs, especially determination of their state, i.e., dissolution, aggregation, and agglomeration within biological matrices and other environments are still challenging tasks; moreover mechanisms of nanoparticle (NP) translocation and persistence remain critical gaps. This review summarizes the current understanding of NM biokinetics focusing on determinants of biopersistence. Thorough particle characterization in different exposure scenarios and biological matrices requires use of suitable analytical methods and is a prerequisite to understand biopersistence and for the development of appropriate dosimetry. Analytical tools that potentially can facilitate elucidation of key NM characteristics, such as ion beam microscopy (IBM) and time-of-flight secondary ion mass spectrometry (ToF-SIMS), are discussed in relation to their potential to advance the understanding of biopersistent NM kinetics. We conclude that a major requirement for future nanosafety research is the development and application of analytical tools to characterize NPs in different exposure scenarios and biological matrices.
Laboratories measuring target pathogen analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select pathogens.
Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B
2018-05-30
Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
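The control-chart arithmetic behind such an analysis fits in a few lines. The sketch below builds an individuals (XmR) chart, with limits at plus or minus 2.66 times the mean moving range, and checks the conventional run-of-eight rule; the daily length-of-stay means are invented.

```python
# Individuals (XmR) control chart on invented daily mean length-of-stay values.
import numpy as np

los = np.array([112, 118, 109, 115, 111, 120, 96, 94, 98, 92, 95, 97, 93, 99])  # minutes

center = los.mean()
mr = np.abs(np.diff(los)).mean()                 # mean moving range
ucl, lcl = center + 2.66 * mr, center - 2.66 * mr
print(f"CL = {center:.1f}, UCL = {ucl:.1f}, LCL = {lcl:.1f}")

# Special-cause signal: 8+ consecutive points on one side of the center line,
# a rule that preserves the time sequence of the data.
below = los < center
print("signal:", any(below[i:i+8].all() for i in range(len(los) - 7)))
```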
The role of analytical chemistry in Niger Delta petroleum exploration: a review.
Akinlua, Akinsehinwa
2012-06-12
Petroleum and the organic matter from which it is derived are composed of organic compounds with some trace elements. These compounds give an insight into the origin, thermal maturity, and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. The main means of acquiring these geochemical data is analytical techniques. Owing to progress in the development of new analytical techniques, many hitherto unresolved petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhang, Shou-ping; Xin, Xiao-kang
2017-07-01
Identification of pollutant sources in river pollution incidents is an important and difficult task in emergency response, and intelligent optimization methods can effectively compensate for the weaknesses of traditional methods. An intelligent model for pollutant source identification was established using the basic genetic algorithm (BGA) as the optimization search tool and an analytic solution of the one-dimensional unsteady water quality equation to construct the objective function. Experimental tests show that the identification model is effective and efficient: it can accurately determine pollutant amounts and positions whether there is a single pollution source or multiple sources. In particular, when the population size of the BGA is set to 10, the computed results agree well with the analytic results for single-source amount and position identification, with relative errors of no more than 5%. For multi-point-source, multi-variable cases, some error arises in the computed results because many possible combinations of pollution sources exist; however, when prior experience is used to narrow the search scope, the relative errors of the identification results are less than 5%, which demonstrates that the established source identification model can be used to direct emergency responses.
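A minimal sketch of this kind of inverse search, assuming Python with NumPy: a basic genetic algorithm estimates the mass and position of an instantaneous point source by matching observations against the classical analytic solution of the one-dimensional unsteady advection-dispersion equation. All parameter values, and the simple GA operators, are illustrative assumptions rather than the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
u, D, A = 0.5, 10.0, 100.0   # flow velocity (m/s), dispersion (m^2/s), cross-section (m^2)

def analytic_c(M, x0, x, t):
    """Instantaneous point-source solution of the 1-D unsteady water quality equation."""
    return M / (A * np.sqrt(4 * np.pi * D * t)) * np.exp(-((x - x0) - u * t) ** 2 / (4 * D * t))

# Synthetic observations from a "true" source: M = 2000 kg released at x0 = 300 m.
x_obs, t_obs = np.linspace(500.0, 2500.0, 20), 3600.0
c_obs = analytic_c(2000.0, 300.0, x_obs, t_obs)

def misfit(M, x0):
    return np.sum((analytic_c(M, x0, x_obs, t_obs) - c_obs) ** 2)

# Basic GA: population of 10 (as in the paper), truncation selection,
# arithmetic crossover, Gaussian mutation.
pop = np.column_stack([rng.uniform(100, 5000, 10), rng.uniform(0, 1000, 10)])
for _ in range(300):
    order = np.argsort([misfit(M, x0) for M, x0 in pop])
    parents = pop[order[:5]]
    pairs = rng.integers(0, 5, size=(5, 2))
    children = 0.5 * (parents[pairs[:, 0]] + parents[pairs[:, 1]])
    children += rng.normal(0.0, [50.0, 10.0], size=children.shape)
    pop = np.vstack([parents, children])

best_M, best_x0 = min(pop, key=lambda ind: misfit(*ind))
print(f"estimated M = {best_M:.0f} kg, x0 = {best_x0:.0f} m")  # should approach 2000 kg, 300 m
```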
Development of computer-based analytical tool for assessing physical protection system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mardhi, Alim; Pengvanich, Phongphaeth
2016-01-01
Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since assessments based on real exercise scenarios are costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are available for immediate use, but for our research purposes a tool that can be customized and further enhanced is more suitable. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.
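As an illustration of the path-based effectiveness measure described above, here is a hedged Python sketch of an EASI-style calculation: detection at an element counts only if the delay remaining along the path exceeds the response-force time. The elements, probabilities, and times are invented, and the real tool evaluates many such paths over a network to find the most critical one.

```python
# Each path is a sequence of protection elements with a detection
# probability and a delay time (seconds); numbers are hypothetical.
path = [
    ("fence",       0.5,  30),
    ("door",        0.7,  60),
    ("inner vault", 0.9, 120),
]
RESPONSE_TIME = 150  # seconds for the response force to arrive

def p_interruption(path, response_time):
    p_not_detected_yet, p_interrupt = 1.0, 0.0
    for i, (_, p_det, _) in enumerate(path):
        remaining_delay = sum(d for _, _, d in path[i:])
        if remaining_delay > response_time:   # detection here is timely
            p_interrupt += p_not_detected_yet * p_det
        p_not_detected_yet *= 1.0 - p_det
    return p_interrupt

print(f"P(interruption) = {p_interruption(path, RESPONSE_TIME):.3f}")
```

In a network formulation, each adversary route through the facility graph gets such a score, and the route with the lowest probability of interruption is reported as the most critical path.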
Control/structure interaction conceptual design tool
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
1990-01-01
The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three-dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object-oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.
Active controls: A look at analytical methods and associated tools
NASA Technical Reports Server (NTRS)
Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.
1984-01-01
A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.
Laboratories measuring target biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select biotoxins.
Laboratories measuring target chemical, radiochemical, pathogen, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery.
Using Learning Analytics to Support Engagement in Collaborative Writing
ERIC Educational Resources Information Center
Liu, Ming; Pardo, Abelardo; Liu, Li
2017-01-01
Online collaborative writing tools provide an efficient way to complete a writing task. However, existing tools only focus on technological affordances and ignore the importance of social affordances in a collaborative learning environment. This article describes a learning analytic system that analyzes writing behaviors, and creates…
Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis
ERIC Educational Resources Information Center
Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay
2018-01-01
Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…
Hrubec, Terry C.; Smith, Stephen A.; Robertson, John L.
2001-01-01
Hybrid striped bass (Morone chrysops × Morone saxatilis) are an important aquaculture species, yet few diagnostic tools are available to assess their health. Hematology and clinical chemistry analyses are not used extensively in fish medicine due to the lack of reference intervals for various fish species, and because factors such as age can affect blood values. Little has been published on age-related changes in the blood values of juvenile fish. It is important to evaluate juvenile fish, as this is the age at which they are raised in aquaculture settings. Determining age-related changes in the blood values of fishes would further develop clinical pathology as a diagnostic tool, enhancing both fish medicine and the aquaculture industry. The results of standard hematology and clinical chemistry analyses were evaluated in juvenile hybrid striped bass at 4, 6, 9, 15, and 19 months of age. Values for PCV and RBC indices were significantly lower, and plasma protein concentration was significantly higher, in younger fish. Total WBC and lymphocyte counts were significantly higher in fish at 6 and 9 months of age, while neutrophil and monocyte counts were higher at 6, 9, and 15 months. Eosinophil counts were significantly higher in 9-month-old fish. The majority of hematologic values fell within previously established reference intervals, indicating that only slight modification of the intervals is necessary for evaluating hematologic results of hybrid striped bass at different ages. The following analytes deviated sufficiently from adult reference intervals to warrant separate reference values: plasma protein concentration at 4 months, WBC and lymphocyte counts at 15 and 19 months, and thrombocyte-like cells at 9 months of age. Values for most biochemical analytes differed significantly among age groups, except for creatinine and potassium concentrations. Comparisons with reference intervals were not made for biochemical analytes, because established reference intervals were not available. Age-related changes in the hematologic and biochemical values of hybrid striped bass were similar to those reported for rainbow trout and mammals.
ERIC Educational Resources Information Center
Kelly, Nick; Thompson, Kate; Yeoman, Pippa
2015-01-01
This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ragan, Eric D; Goodall, John R
2014-01-01
Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
[Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].
Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Leiz; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou
2014-08-01
In the present paper, the apparatus and theory of surface analysis are introduced, and progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous, and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools, such as SEM/EDS (scanning electron microscopy/energy-dispersive spectroscopy), EPMA (electron probe microanalysis), and AES (Auger electron spectroscopy), its advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis, and high sensitivity. It is now a powerful complementary method to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analytical workers will use this powerful tool, and LA-ICP-MS may become as prominent in the elemental analysis field as LIBS (laser-induced breakdown spectroscopy).
Jérôme, Marc; Martinsohn, Jann Thorsten; Ortega, Delphine; Carreau, Philippe; Verrez-Bagnis, Véronique; Mouchel, Olivier
2008-05-28
Traceability in the fish food sector plays an increasingly important role in consumer protection and confidence building. This is reflected by the introduction of legislation and rules covering traceability at national and international levels. Although traceability through labeling is well established and supported by the respective regulations, monitoring and enforcement of these rules are still hampered by the lack of efficient diagnostic tools. We describe protocols using a direct sequencing method based on 212-274-bp diagnostic sequences derived from species-specific mitochondrial DNA cytochrome b, 16S rRNA, and cytochrome oxidase subunit I sequences, which can be applied to unambiguously determine even closely related fish species in processed food products labeled "anchovy". Traceability of anchovy-labeled products is supported by the public online database AnchovyID (http://anchovyid.jrc.ec.europa.eu), which provides the data obtained during our study together with tools for analytical purposes.
Semiautomated Device for Batch Extraction of Metabolites from Tissue Samples
2012-01-01
Metabolomics has become a mainstream analytical strategy for investigating metabolism. The quality of data derived from these studies is proportional to the consistency of the sample preparation. Although considerable research has been devoted to finding optimal extraction protocols, most of the established methods require extensive sample handling. Manual sample preparation can be highly effective in the hands of skilled technicians, but an automated tool for purifying metabolites from complex biological tissues would be of obvious utility to the field. Here, we introduce the semiautomated metabolite batch extraction device (SAMBED), a new tool designed to simplify metabolomics sample preparation. We discuss SAMBED’s design and show that SAMBED-based extractions are of comparable quality to extracts produced through traditional methods (13% mean coefficient of variation from SAMBED versus 16% from manual extractions). Moreover, we show that aqueous SAMBED-based methods can be completed in less than a quarter of the time required for manual extractions. PMID:22292466
A spectral Poisson solver for kinetic plasma simulation
NASA Astrophysics Data System (ADS)
Szeremley, Daniel; Obberath, Jens; Brinkmann, Ralf
2011-10-01
Plasma resonance spectroscopy is a well-established plasma diagnostic method, realized in several designs. One of these designs is the multipole resonance probe (MRP). In its idealized, geometrically simplified version, it consists of two dielectrically shielded, hemispherical electrodes to which an RF signal is applied. A numerical tool is under development that is capable of simulating the dynamics of the plasma surrounding the MRP in the electrostatic approximation. In this contribution we concentrate on the specialized Poisson solver for that tool. The plasma is represented by an ensemble of point charges. By expanding both the charge density and the potential into spherical harmonics, a largely analytical solution of the Poisson problem can be employed. For a practical implementation, the expansion must be appropriately truncated. With this spectral solver we are able to efficiently solve the Poisson equation in a kinetic plasma simulation without the need to introduce a spatial discretization.
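For orientation, the spherical-harmonic reduction described above takes a standard form. The following is the generic multipole solution of the Poisson equation (Gaussian units, truncated at degree L, with R the outer boundary radius), offered as background rather than as the authors' exact formulation:

```latex
\Phi(r,\theta,\varphi)=\sum_{l=0}^{L}\sum_{m=-l}^{l}\phi_{lm}(r)\,Y_{lm}(\theta,\varphi),
\qquad
\rho_{lm}(r)=\int Y_{lm}^{*}(\theta,\varphi)\,\rho(r,\theta,\varphi)\,\mathrm{d}\Omega,
```

```latex
\phi_{lm}(r)=\frac{4\pi}{2l+1}
\left[\frac{1}{r^{\,l+1}}\int_{0}^{r}\rho_{lm}(r')\,r'^{\,l+2}\,\mathrm{d}r'
+\,r^{l}\int_{r}^{R}\rho_{lm}(r')\,r'^{\,1-l}\,\mathrm{d}r'\right].
```

Each (l, m) mode thus reduces to two one-dimensional radial integrals over the point-charge ensemble, which is what makes the spectral approach efficient without a spatial grid.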
NASA Technical Reports Server (NTRS)
Harvill, W. E.; Kizer, J. A.
1976-01-01
The advantageous structural uses of advanced filamentary composites are demonstrated by design, fabrication, and test of three boron-epoxy reinforced C-130 center wing boxes. The advanced development work necessary to support detailed design of a composite reinforced C-130 center wing box was conducted. Activities included the development of a basis for structural design, selection and verification of materials and processes, manufacturing and tooling development, and fabrication and test of full-scale portions of the center wing box. Detailed design drawings, and necessary analytical structural substantiation including static strength, fatigue endurance, flutter, and weight analyses are considered. Some additional component testing was conducted to verify the design for panel buckling, and to evaluate specific local design areas. Development of the cool tool restraint concept was completed, and bonding capabilities were evaluated using full-length skin panel and stringer specimens.
Inferring subunit stoichiometry from single molecule photobleaching
2013-01-01
Single molecule photobleaching is a powerful tool for determining the stoichiometry of protein complexes. By attaching fluorophores to proteins of interest, the number of associated subunits in a complex can be deduced by imaging single molecules and counting fluorophore photobleaching steps. Because some bleaching steps might be unobserved, the ensemble of steps will be binomially distributed. In this work, it is shown that inferring the true composition of a complex from such data is nontrivial because binomially distributed observations present an ill-posed inference problem. That is, a unique and optimal estimate of the relevant parameters cannot be extracted from the observations. Because of this, a method has not been firmly established to quantify confidence when using this technique. This paper presents a general inference model for interpreting such data and provides methods for accurately estimating parameter confidence. The formalization and methods presented here provide a rigorous analytical basis for this pervasive experimental tool. PMID:23712552
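The ill-posedness is easy to reproduce. Here is a short sketch, assuming Python with NumPy/SciPy, that simulates bleach-step counts from a tetramer with imperfect detection and shows that several (subunit number, detection probability) pairs achieve nearly the same maximized likelihood; all parameter values are illustrative.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(1)
true_n, true_p = 4, 0.8                     # tetramer, 80% fluorophore detection
counts = rng.binomial(true_n, true_p, 300)  # observed bleaching steps per molecule

p_grid = np.linspace(0.30, 0.99, 200)
for n in range(3, 7):
    # Profile likelihood: maximize over the nuisance parameter p for each n.
    ll = max(binom.logpmf(counts, n, p).sum() for p in p_grid)
    print(f"n = {n}: max log-likelihood = {ll:9.1f}")
```

Candidate subunit numbers smaller than the largest observed count are rejected outright (likelihood zero), but the near-flat profile likelihood for larger n is the ill-posedness the paper formalizes, and is why principled confidence estimation is needed.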
Determination of fragrance content in perfume by Raman spectroscopy and multivariate calibration
NASA Astrophysics Data System (ADS)
Godinho, Robson B.; Santos, Mauricio C.; Poppi, Ronei J.
2016-03-01
An alternative methodology is herein proposed for determining fragrance content in perfumes and classifying them according to the guidelines established by fine perfume manufacturers. The methodology is based on Raman spectroscopy associated with multivariate calibration, allowing the fragrance content to be determined in a fast, nondestructive, and sustainable manner. The results were consistent with the conventional method, with standard error of prediction values below 1.0%. This indicates that the proposed technology is a feasible analytical tool for determining fragrance content in hydro-alcoholic solutions, for use in manufacturing, quality control, and regulatory agencies.
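A minimal sketch of the multivariate-calibration step, assuming scikit-learn's PLSRegression as a stand-in for the chemometric model (the abstract does not name the algorithm); the spectra and fragrance contents below are simulated placeholders.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_samples, n_points = 40, 500
fragrance = rng.uniform(5, 25, n_samples)               # % fragrance in hydro-alcoholic base
band = np.exp(-((np.arange(n_points) - 250) / 30) ** 2) # synthetic Raman band
X = fragrance[:, None] * band + 0.05 * rng.normal(size=(n_samples, n_points))

X_tr, X_te, y_tr, y_te = train_test_split(X, fragrance, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
sep = np.sqrt(np.mean((pls.predict(X_te).ravel() - y_te) ** 2))
print(f"standard error of prediction: {sep:.2f} %")     # the paper reports values below 1.0 %
```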
Total analysis systems with Thermochromic Etching Discs technology.
Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel
2014-12-16
A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED comprises a number of attractive features, such as track independence, selective irradiation, a high-power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities for designing new compact disc-based total analysis systems applicable in chemistry and the life sciences. In this paper, the analytical implementation of TED is described and discussed, and its analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are addressed in order to demonstrate the practical capacity of the system. The analytical usefulness of TED technology is thereby demonstrated, describing how to exploit this tool to develop truly integrated analytical systems that provide solutions within the point-of-care framework.
Muellner, Ulrich J; Vial, Flavie; Wohlfender, Franziska; Hadorn, Daniela; Reist, Martin; Muellner, Petra
2015-01-01
The reporting of outputs from health surveillance systems should be done in a near real-time and interactive manner in order to provide decision makers with powerful means to identify, assess, and manage health hazards as early and efficiently as possible. While this is currently rarely the case in veterinary public health surveillance, reporting tools do exist for the visual exploration and interactive interrogation of health data. In this work, we used tools freely available from the Google Maps and Charts libraries to develop a web application reporting health-related data derived from slaughterhouse surveillance and from a newly established web-based equine surveillance system in Switzerland. Both sets of tools allowed entry-level usage with minimal or no programming skills, while being flexible enough to cater for more complex scenarios for users with greater programming skills. In particular, interfaces linking statistical software and Google tools provide additional analytical functionality (such as algorithms for the detection of unusually high case occurrences) for inclusion in the reporting process. We show that such approaches could improve the timely dissemination and communication of technical information to decision makers and other stakeholders, and could foster the early-warning capacity of animal health surveillance systems.
Danoix, F; Grancher, G; Bostel, A; Blavette, D
2007-09-01
Atom probe is a very powerful instrument for measuring concentrations on a sub-nanometric scale [M.K. Miller, G.D.W. Smith, Atom Probe Microanalysis: Principles and Applications to Materials Problems, Materials Research Society, Pittsburgh, 1989]. It is therefore a unique tool to study and characterise finely decomposed metallic materials. Composition profiles or 3D mappings can be realised by gathering elemental composition measurements. As the detector efficiency is generally not equal to 1, the measured compositions are only estimates of the actual values. The variance of an estimate depends on which quantity is to be estimated, and it can be calculated when the detection process is known. These two papers give the complete analytical derivations and expressions of the variance of composition measurements in several situations encountered when using the atom probe. In this first paper, we concentrate on the analytical derivation of the variance of composition estimates obtained from a conventional one-dimensional (1D) atom probe. In particular, the existing expressions, and the basic hypotheses on which they rely, are reconsidered, and complete analytical demonstrations are established. In the second, companion paper, the case of the 3D atom probe is treated, highlighting how knowledge of the 3D positions of detected ions modifies the analytical derivation of the variance of local composition data.
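As background, the baseline result that such derivations refine is the binomial sampling variance of a composition estimate. In its simplest textbook form, for n detected ions in the sampled volume (n = εN for detector efficiency ε and N collected ions), the estimate of the concentration of species X and its variance read:

```latex
\hat{c}_X=\frac{n_X}{n},
\qquad
\operatorname{Var}(\hat{c}_X)\approx\frac{c_X\,(1-c_X)}{n},
\qquad
n=\varepsilon N .
```

The papers examine when this idealized expression actually holds, and how the hypotheses behind it must be revisited for 1D and 3D instruments.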
UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.
Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel
2013-09-01
In this short communication, UV-vis spectroscopy was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass, and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280, and 287 nm. UV-vis quantification of lignin was found suitable for different types of biomass, making this a timesaving analytical approach that could be used as a Process Analytical Tool (PAT) in biorefineries employing steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
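A minimal sketch of the underlying calibration, assuming the Beer-Lambert linear relation between absorbance and lignin concentration at a single wavelength (280 nm is used here); the standards are invented numbers, not the paper's data.

```python
import numpy as np

conc = np.array([0.0, 0.05, 0.10, 0.20, 0.40])       # g/L lignin standards
abs_280 = np.array([0.01, 0.12, 0.23, 0.46, 0.90])   # absorbance at 280 nm

slope, intercept = np.polyfit(conc, abs_280, 1)       # A = slope * c + intercept
unknown_abs = 0.35
print(f"estimated lignin: {(unknown_abs - intercept) / slope:.3f} g/L")
```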
NASA Astrophysics Data System (ADS)
Bekdemir, Ahmet; Stellacci, Francesco
2016-10-01
Nanomedicine requires in-depth knowledge of nanoparticle-protein interactions. These interactions are typically studied with methods limited to large or fluorescently labelled nanoparticles, as they rely on scattering or fluorescence-correlation signals. Here, we have developed a method based on analytical ultracentrifugation (AUC) as an absorbance-based, label-free tool to determine the dissociation constant (KD), stoichiometry (Nmax), and Hill coefficient (n) for the association of bovine serum albumin (BSA) with gold nanoparticles. Measuring absorption at 520 nm renders the AUC measurements insensitive to unbound and aggregated proteins. The measurements remain accurate, and no more challenging, for small (sub-10 nm) nanoparticles. In AUC, frictional ratio analysis also allows qualitative assessment of the shape of the analyte; the data suggest that small-nanoparticle/protein complexes deviate significantly from a spherical shape even at maximum coverage. We believe that this method could become one of the established approaches for characterizing the interaction of (small) nanoparticles with proteins.
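The binding parameters named above follow from fitting a Hill isotherm to the bound-protein-per-particle data that an AUC titration yields. Here is a hedged SciPy sketch with invented data points, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, n_max, kd, n):
    """Hill isotherm: bound proteins per nanoparticle vs free protein concentration."""
    return n_max * conc ** n / (kd ** n + conc ** n)

bsa_uM = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0, 100.0])   # free BSA (uM)
bound = np.array([2.1, 4.0, 7.2, 12.5, 16.8, 19.5, 21.2, 21.8])    # proteins per NP

popt, _ = curve_fit(hill, bsa_uM, bound, p0=[22.0, 5.0, 1.0])
print(f"Nmax = {popt[0]:.1f}, KD = {popt[1]:.1f} uM, n = {popt[2]:.2f}")
```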
NASA Astrophysics Data System (ADS)
Kerst, Stijn; Shyrokau, Barys; Holweg, Edward
2018-05-01
This paper proposes a novel semi-analytical bearing model that addresses the flexibility of the bearing outer-race structure, and presents the application of this model in a bearing load condition monitoring approach. The model was developed because current computationally low-cost bearing models, with their assumptions of rigidity, fail to provide an accurate description of the increasingly common flexible, size- and weight-optimized bearing designs. In the proposed bearing model, raceway flexibility is described by the use of static deformation shapes. The excitation of the deformation shapes is calculated from the modelled rolling-element loads and a Fourier-series-based compliance approximation. The resulting model is computationally inexpensive and provides an accurate description of the rolling-element loads for flexible outer-raceway structures; the latter is validated by a simulation-based comparison study with a well-established bearing simulation software tool. An experimental study finally shows the potential of the proposed model in a bearing load monitoring approach.
μ-PADs for detection of chemical warfare agents.
Pardasani, Deepak; Tak, Vijay; Purohit, Ajay K; Dubey, D K
2012-12-07
Conventional methods for the detection of chemical warfare agents (CWAs) based on chromogenic reactions are time- and solvent-intensive. The development of cost-, time- and solvent-effective microfluidic paper-based analytical devices (μ-PADs) for the detection of nerve and vesicant agents is described. Detection was based on the reactions of the analytes with rhodamine hydroxamate and para-nitrobenzyl pyridine, producing red and blue colours, respectively. Reactions were optimized on the μ-PADs to produce limits of detection (LODs) as low as 100 μM for sulfur mustard in aqueous samples. Results were quantified with the help of a simple desktop scanner and Photoshop software. Sarin gave a linear response in two concentration ranges, 20-100 mM and 100-500 mM, whereas the response of sulfur mustard was linear in the 10-75 mM range. The results were precise enough to establish μ-PADs as a valuable tool for security personnel fighting chemical terrorism.
Application of wavelet packet transform to compressing Raman spectra data
NASA Astrophysics Data System (ADS)
Chen, Chen; Peng, Fei; Cheng, Qinghua; Xu, Dahai
2008-12-01
The wavelet transform has become established, alongside the Fourier transform, as a data-processing method in analytical fields. Its main fields of application relate to de-noising, compression, variable reduction, and signal suppression. Raman spectroscopy (RS) is characterized by frequency excursions that carry information about the molecule. Every substance has its own characteristic Raman spectrum, from which the structure, components, concentrations, and other properties of a sample can readily be analyzed, making RS a powerful analytical tool for detection and identification. Many RS databases exist, but Raman spectral data require large storage space and long search times. In this paper, the wavelet packet transform is used to compress the Raman spectral data of several benzene-series compounds. The results show that the energy retained after compression is as high as 99.9%, while the percentage of zero coefficients is 87.50%. It is concluded that the wavelet packet transform is of significant value for compressing RS data.
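A short sketch of the compression procedure, assuming the PyWavelets library; the wavelet ('db4'), decomposition level, and threshold are illustrative choices rather than the authors' settings, and the "spectrum" is synthetic.

```python
import numpy as np
import pywt

x = np.linspace(0, 3500, 2048)                       # Raman shift axis (cm^-1), synthetic
spectrum = (np.exp(-((x - 1000) / 15) ** 2) +        # synthetic bands standing in for
            0.6 * np.exp(-((x - 3060) / 20) ** 2) +  # a benzene-series spectrum
            0.02 * np.random.default_rng(0).normal(size=x.size))

wp = pywt.WaveletPacket(data=spectrum, wavelet='db4', mode='symmetric', maxlevel=5)
coeffs = np.concatenate([node.data for node in wp.get_level(5, 'natural')])
thr = 0.05 * np.abs(coeffs).max()

n_zero = 0
for node in wp.get_level(5, 'natural'):
    node.data = pywt.threshold(node.data, thr, mode='hard')  # zero small coefficients
    n_zero += np.sum(node.data == 0)

rec = wp.reconstruct(update=False)[:spectrum.size]
energy_retained = 100 * np.sum(rec ** 2) / np.sum(spectrum ** 2)
print(f"zeros: {100 * n_zero / coeffs.size:.1f}%, energy retained: {energy_retained:.2f}%")
```

Storing only the non-zero packet coefficients (plus their positions) is what yields the compression; the retained-energy figure quantifies how little of the spectrum is lost.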
Monakhova, Yulia B; Mushtakova, Svetlana P
2017-05-01
A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from the UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and the ICA resolution of the spectral profiles of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries, generally between 95% and 105%, were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. It was tested on the analysis of vitamins and caffeine in energy drinks and of aromatic hydrocarbons in motor fuel, with errors within 10%. The results demonstrate that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in cases of spectral overlap and of the absence or inaccessibility of reference materials.
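To make the idea concrete, here is a hedged sketch, assuming scikit-learn's FastICA, of resolving two overlapped synthetic bands and mapping component scores to concentrations; the real work uses measured UV-vis and IR profiles of actual analytes.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
wl = np.linspace(200, 400, 300)
s1 = np.exp(-((wl - 270) / 12) ** 2)        # pure-component band A (hypothetical)
s2 = np.exp(-((wl - 300) / 18) ** 2)        # strongly overlapping band B
C = rng.uniform(0.1, 1.0, size=(12, 2))     # known calibration concentrations
X = C @ np.vstack([s1, s2]) + 0.002 * rng.normal(size=(12, wl.size))

ica = FastICA(n_components=2, random_state=0)
scores = ica.fit_transform(X)               # sample-wise contributions of each IC
reg = LinearRegression().fit(scores, C)     # map IC scores to concentrations

x_new = np.array([0.4, 0.7]) @ np.vstack([s1, s2])   # "unknown" mixture spectrum
print(reg.predict(ica.transform(x_new[None, :])))    # expect roughly [0.4, 0.7]
```

The regression step absorbs ICA's sign and scale indeterminacy, which is why absolute concentrations can be recovered without per-analyte reference solutions once the calibration set is in hand.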
NASA Technical Reports Server (NTRS)
Andreadis, Dean; Drake, Alan; Garrett, Joseph L.; Gettinger, Christopher D.; Hoxie, Stephen S.
2003-01-01
The development and ground test of a rocket-based combined cycle (RBCC) propulsion system is being conducted as part of the NASA Marshall Space Flight Center (MSFC) Integrated System Test of an Airbreathing Rocket (ISTAR) program. The eventual flight vehicle (X-43B) is designed to support an air-launched self-powered Mach 0.7 to 7.0 demonstration of an RBCC engine through all of its airbreathing propulsion modes - air augmented rocket (AAR), ramjet (RJ), and scramjet (SJ). Through the use of analytical tools, numerical simulations, and experimental tests the ISTAR program is developing and validating a hydrocarbon-fueled RBCC combustor design methodology. This methodology will then be used to design an integrated RBCC propulsion system that produces robust ignition and combustion stability characteristics while maximizing combustion efficiency and minimizing drag losses. First order analytical and numerical methods used to design hydrocarbon-fueled combustors are discussed with emphasis on the methods and determination of requirements necessary to establish engine operability and performance characteristics.
QFD-ANP Approach for the Conceptual Design of Research Vessels: A Case Study
NASA Astrophysics Data System (ADS)
Venkata Subbaiah, Kambagowni; Yeshwanth Sai, Koneru; Suresh, Challa
2016-10-01
Conceptual design is a subset of concept art in which a new product idea is created, rather than a visual representation to be used directly in a final product. The purpose is to understand the needs of conceptual design as used in engineering and to clarify current conceptual design practice. Quality function deployment (QFD) is a customer-oriented design approach for developing new or improved products and services that enhance customer satisfaction. The house of quality (HOQ) has traditionally been used as the planning tool of QFD, translating customer requirements (CRs) into design requirements (DRs). Factor analysis is carried out to reduce the CR portion of the HOQ, and the analytic hierarchy process is employed to obtain the priority ratings of the CRs used in constructing the HOQ. This paper discusses the conceptual design of an oceanographic research vessel using the analytic network process (ANP) technique. Finally, the integrated QFD-ANP methodology helps to establish the importance ratings of the DRs.
Zhang, Chun-Yun; Chai, Xin-Sheng
2015-03-13
A novel method for determining the diffusion coefficient (D) of methanol in water and in olive oil has been developed. Based on multiple headspace extraction gas chromatography (MHE-GC), the methanol released from a liquid sample of interest in a closed sample vial was determined in a stepwise fashion. A theoretical model was derived to relate the diffusion coefficient to the GC signals from the MHE-GC measurements. The results showed that the present method has excellent precision (RSD < 1%) in the linear fitting procedure and good accuracy for the diffusion coefficients of methanol in both water and olive oil when compared with data reported in the literature. The present method is simple and practical, and can be a valuable tool for determining the diffusion coefficient of volatile analytes into food simulants from food and beverage packaging materials, both in research studies and in actual applications. Copyright © 2015 Elsevier B.V. All rights reserved.
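A minimal sketch of the standard MHE-GC treatment that underlies methods like this one: successive headspace extractions give peak areas that decay exponentially, so ln(area) versus extraction number is linear and the total analyte content follows from the fitted decay. How the authors map the fitted decay onto D is specific to their model and not reproduced here; the areas below are invented.

```python
import numpy as np

areas = np.array([1520.0, 1110.0, 805.0, 590.0, 431.0])   # peak areas, extractions 1..5
i = np.arange(1, areas.size + 1)

slope, intercept = np.polyfit(i, np.log(areas), 1)         # MHE theory: ln(A_i) linear in i
k = -slope                                                 # per-extraction decay constant
total = areas[0] / (1 - np.exp(-k))                        # sum of the geometric series
print(f"k = {k:.3f} per extraction, total-analyte-equivalent area = {total:.0f}")
```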
Multi-scale simulations of droplets in generic time-dependent flows
NASA Astrophysics Data System (ADS)
Milan, Felix; Biferale, Luca; Sbragaglia, Mauro; Toschi, Federico
2017-11-01
We study the deformation and dynamics of droplets in time-dependent flows using a diffuse-interface model for two immiscible fluids. The numerical simulations are first benchmarked against analytical results for steady droplet deformation, and then extended to the more interesting case of time-dependent flows. The results of these time-dependent simulations are compared against analytical models available in the literature, which assume the droplet shape to be an ellipsoid at all times, with time-dependent major and minor axes. In particular, we investigate the time-dependent deformation of a confined droplet in an oscillating Couette flow over the entire capillary range up to droplet break-up. These multi-component simulations thus prove to be a useful tool for establishing from "first principles" the dynamics of droplets in complex flows involving multiple scales. This work was supported by the European Union's Horizon 2020 research and innovation programme under Marie Sklodowska-Curie Grant Agreement No. 642069, and by the European Research Council under the European Community's Seventh Framework Programme, ERC Grant Agreement No. 339032.
Piccirilli, Gisela N; Escandar, Graciela M
2006-09-01
This paper demonstrates for the first time the power of a second-order chemometric algorithm for predicting, in a simple way and using spectrofluorimetric data, the concentration of analytes in the presence of both the inner-filter effect and unsuspected species. The simultaneous determination of the systemic fungicides carbendazim and thiabendazole was achieved and used to discuss the scope of the applied second-order chemometric tools: parallel factor analysis (PARAFAC) and partial least-squares with residual bilinearization (PLS/RBL). The chemometric study was performed using fluorescence excitation-emission matrices obtained after extraction of the analytes onto a C18-membrane surface. The ability of PLS/RBL to recognize and overcome the significant changes produced by thiabendazole in both the excitation and emission spectra of carbendazim is demonstrated. The high performance of the selected PLS/RBL method was established by determining both pesticides in artificial and real samples.
Toward best practice: leveraging the electronic patient record as a clinical data warehouse.
Ledbetter, C S; Morgan, M W
2001-01-01
Automating clinical and administrative processes via an electronic patient record (EPR) gives clinicians the point-of-care tools they need to deliver better patient care. However, to improve clinical practice as a whole and then evaluate it, healthcare must go beyond basic automation and convert EPR data into aggregated, multidimensional information. Unfortunately, few EPR systems have the established, powerful analytical clinical data warehouses (CDWs) required for this conversion. This article describes how an organization can support best practice by leveraging a CDW that is fully integrated into its EPR and clinical decision support (CDS) system. The article (1) discusses the requirements for comprehensive CDS, including on-line analytical processing (OLAP) of data at both transactional and aggregate levels, (2) suggests that the transactional data acquired by an OLTP EPR system must be remodeled to support retrospective, population-based, aggregate analysis of those data, and (3) concludes that this aggregate analysis is best provided by a separate CDW system.
Principles and applications of Raman spectroscopy in pharmaceutical drug discovery and development.
Gala, Urvi; Chauhan, Harsh
2015-02-01
In recent years, Raman spectroscopy has become increasingly important as an analytical technique in various scientific areas of research and development. This is partly due to technological advancements in Raman instrumentation and partly due to the detailed fingerprinting that can be derived from Raman spectra. Its versatility of application, rapid data collection, and easy analysis have made Raman spectroscopy an attractive analytical tool. The following review describes Raman spectroscopy and its applications within the pharmaceutical industry. The authors explain the theory of Raman scattering and its variations in Raman spectroscopy, and highlight how Raman spectra are interpreted, providing examples. Raman spectroscopy has a number of potential applications within drug discovery and development. It can be used to estimate the molecular activity of drugs and to establish a drug's physicochemical properties, such as its partition coefficient. It can also be used in compatibility studies during the drug formulation process. Raman spectroscopy's immense potential should be further investigated in the future.
Hori, Katsuhito; Matsubara, Atsuki; Uchikata, Takato; Tsumura, Kazunobu; Fukusaki, Eiichiro; Bamba, Takeshi
2012-08-10
We have established a high-throughput and sensitive analytical method based on supercritical fluid chromatography (SFC) coupled with triple quadrupole mass spectrometry (QqQ MS) for 3-monochloropropane-1,2-diol (3-MCPD) fatty acid esters in edible oils. All analytes were successfully separated within 9 min without sample purification. The system was precise and sensitive, with a limit of detection of less than 0.063 mg/kg. The recovery rate of 3-MCPD fatty acid esters spiked into oil samples was in the range of 62.68-115.23%. Furthermore, several edible oils were analyzed for their 3-MCPD fatty acid ester profiles. This is the first report on the analysis of 3-MCPD fatty acid esters by SFC/QqQ MS. The developed method will be a powerful tool for investigating 3-MCPD fatty acid esters in edible oils. Copyright © 2012 Elsevier B.V. All rights reserved.
Lagrangian based methods for coherent structure detection
NASA Astrophysics Data System (ADS)
Allshouse, Michael R.; Peacock, Thomas
2015-09-01
There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.
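Because the double-gyre is the common test case here, a brief sketch of its velocity field and a simple tracer advection, assuming Python with NumPy, may be helpful; the parameters (A = 0.1, ε = 0.25, ω = 2π/10) are the values commonly used in the literature, and real coherent-structure calculations would integrate trajectories with a higher-order scheme than the forward Euler step used below.

```python
import numpy as np

A, eps, om = 0.1, 0.25, 2 * np.pi / 10

def velocity(x, y, t):
    """Time-dependent double-gyre velocity field on [0,2] x [0,1]."""
    a, b = eps * np.sin(om * t), 1 - 2 * eps * np.sin(om * t)
    f = a * x ** 2 + b * x
    dfdx = 2 * a * x + b
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return u, v

# Advect a grid of tracers; trajectory data like this is the input that
# geometric, probabilistic, cluster, and braid methods all operate on.
x, y = np.meshgrid(np.linspace(0, 2, 50), np.linspace(0, 1, 25))
dt, T = 0.01, 10.0
for t in np.arange(0, T, dt):
    u, v = velocity(x, y, t)
    x, y = x + dt * u, y + dt * v
print("final tracer cloud spread:", x.std(), y.std())
```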
Deriving Earth Science Data Analytics Requirements
NASA Technical Reports Server (NTRS)
Kempler, Steven J.
2015-01-01
Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, some of them applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and which require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of the ESDA use cases, as well as provide an early analysis of the data analytics tool and technique requirements that would support specific ESDA goal types. Representative existing data analytics tools and techniques relevant to ESDA will also be addressed.
Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring
Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia
2010-01-01
The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted a high interest by the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have demonstrated to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and applications carried out by the Spanish scientific community is presented. A brief explanation of theoretical aspects that highlight their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551
Code of Federal Regulations, 2014 CFR
2014-01-01
...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...
Code of Federal Regulations, 2013 CFR
2013-07-01
...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...
Code of Federal Regulations, 2012 CFR
2012-01-01
...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...
(Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research
ERIC Educational Resources Information Center
Quiñones, Sandra
2016-01-01
Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…
NASA Technical Reports Server (NTRS)
Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.
1973-01-01
Detailed drawings of the five-year tape transport are presented, and the analytical tools used in the various analyses are described. These include tape guidance, tape stress over crowned rollers, a tape pack stress program, a response (computer) program, and a description of the control system electronics.
Challenges and Opportunities in Analysing Students Modelling
ERIC Educational Resources Information Center
Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín
2017-01-01
Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them--the model of modelling diagram (MMD)--as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the…
ERIC Educational Resources Information Center
Reinholz, Daniel L.; Shah, Niral
2018-01-01
Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…
NASA Astrophysics Data System (ADS)
Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.
2012-12-01
The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well suited to estimating spatially varying parameter fields from limited observations using Bayesian methods. This presentation discusses the design, development, and testing of a free software implementation of the MAD technique using the open-source DotSpatial geographic information system (GIS) framework, the R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built on a modular architecture that supports the integration of external analytical tools and models for key computational processes, including a forward model (e.g., MODFLOW, HYDRUS) and geostatistical analysis (e.g., R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, identify observation and anchor points, perform the MAD analysis using a selected forward model, and view results. MAD-GIS uses the Managed Extensibility Framework (MEF) of the Microsoft .NET programming platform to support the integration of different modeling and analytical tools at run time through custom "drivers." Each driver establishes a connection with an external program through a programming interface, which provides the elements for communicating with the core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open-source nature of the project will engender the development of additional model drivers by third-party scientists.
Visual programming for next-generation sequencing data analytics.
Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia
2016-01-01
High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal developing environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the run of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.
Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon
2015-01-01
Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze the aminoacylation of tRNAs during protein synthesis. Despite their housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, there is a lack of data resources and analytical tools that can be used to examine disease associations of ARS/AIMPs. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARS/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation, and phosphorylation data of ARS/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploring disease associations of ARS/AIMPs, identifying disease-associated ARS/AIMP interactors, and reconstructing ARS-dependent disease-perturbed network models. IDA therefore provides both comprehensive data resources and analytical tools for understanding the potential roles of ARS/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651
The challenge of big data in public health: an opportunity for visual analytics.
Ola, Oluwakemi; Sedig, Kamran
2014-01-01
Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.
Quality Indicators for Learning Analytics
ERIC Educational Resources Information Center
Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus
2014-01-01
This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a means of capturing evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…
Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration
NASA Technical Reports Server (NTRS)
Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.
1993-01-01
Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions are examined.
Challa, Shruthi; Potumarthi, Ravichandra
2013-01-01
Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destruction of the sample. However, to successfully adopt PAT tools in pharmaceutical and biopharmaceutical environments, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods are discussed, along with their advantages and working principles. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-06-08
T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities, so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data. It also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability for extracting relevant subsets of a larger data source for further analysis in other analytic tools.
The Geochemical Databases GEOROC and GeoReM - What's New?
NASA Astrophysics Data System (ADS)
Sarbas, B.; Jochum, K. P.; Nohl, U.; Weis, U.
2017-12-01
The geochemical databases GEOROC (http://georoc.mpch-mainz.gwdg.de) and GeoReM (http://georem.mpch-mainz.gwdg.de) are maintained by the Max Planck Institute for Chemistry in Mainz, Germany. Both online databases have become crucial tools for geoscientists from different research areas. They are regularly upgraded with new tools and new data from recent publications obtained from a wide range of international journals. GEOROC is a collection of published analyses of volcanic rocks and mantle xenoliths. Recently, data for plutonic rocks have been added. The analyses include major and trace element concentrations, radiogenic and non-radiogenic isotope ratios as well as analytical ages for whole rocks, glasses, minerals and inclusions. Samples come from eleven geological settings and span the whole geological age scale from the Archean to Recent. Metadata include, among others, geographic location, rock class and rock type, geological age, degree of alteration, analytical method, laboratory, and reference. The GEOROC web page allows selection of samples by geological setting, geography, chemical criteria, rock or sample name, and bibliographic criteria. In addition, it provides a large number of precompiled files for individual locations, minerals and rock classes. GeoReM is a database collecting information about reference materials of geological and environmental interest, such as rock powders, synthetic and natural glasses as well as mineral, isotopic, biological, river water and seawater reference materials. It contains published data and compilation values (major and trace element concentrations and mass fractions, radiogenic and stable isotope ratios). Metadata comprise, among others, uncertainty, analytical method and laboratory. Reference materials are important for calibration, method validation, quality control and for establishing metrological traceability. GeoReM offers six different search strategies: samples or materials (published values), samples (GeoReM preferred values), chemical criteria, chemical criteria based on bibliography, bibliography, as well as methods and institutions.
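Because GEOROC distributes precompiled files as downloads, simple scripted queries become possible once a file is saved locally. A minimal sketch, assuming a downloaded precompiled CSV; the file name and column headers below are assumptions and should be checked against the actual download:

    # Sketch: filtering a downloaded GEOROC precompiled file by chemistry.
    # Column names ("SIO2(WT%)", "MGO(WT%)", "TECTONIC SETTING") are assumed;
    # verify the headers in the file actually downloaded.
    import pandas as pd

    df = pd.read_csv("georoc_arc_volcanics.csv", low_memory=False)
    basalts = df[(df["SIO2(WT%)"] >= 45) & (df["SIO2(WT%)"] <= 52)]
    primitive = basalts[basalts["MGO(WT%)"] > 8]       # MgO-rich basalts
    print(primitive.groupby("TECTONIC SETTING").size())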
ERIC Educational Resources Information Center
Davis, Gary Alan; Woratschek, Charles R.
2015-01-01
Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…
Application of Raman microscopy to biodegradable double-walled microspheres.
Widjaja, Effendi; Lee, Wei Li; Loo, Say Chye Joachim
2010-02-15
Raman mapping measurements were performed on the cross section of a ternary-phase biodegradable double-walled microsphere (DWMS) of poly(D,L-lactide-co-glycolide) (50:50) (PLGA), poly(L-lactide) (PLLA), and poly(epsilon-caprolactone) (PCL), which was fabricated by a one-step solvent evaporation method. The collected Raman spectra were subjected to a band-target entropy minimization (BTEM) algorithm in order to reconstruct the pure component spectra of the species observed in this sample. Seven pure component spectral estimates were recovered, and their spatial distributions within the DWMS were determined. The first three spectral estimates were identified as PLLA, PLGA 50:50, and PCL, the main components of the DWMS. The last four spectral estimates were identified as semicrystalline polyglycolic acid (PGA), dichloromethane (DCM), copper-phthalocyanine blue, and calcite, which were minor components: PGA was a decomposition product of PLGA, DCM was the solvent used in DWMS fabrication, and copper-phthalocyanine blue and calcite were unexpected contaminants. The current results show that combined Raman microscopy and BTEM analysis can provide a sensitive characterization tool for DWMS, as it gives more specific information on the chemical species present as well as their spatial distributions. This novel analytical method for microsphere characterization can serve as a complementary tool to other more established analytical techniques, such as scanning electron microscopy and optical microscopy.
100-B/C Target Analyte List Development for Soil
DOE Office of Scientific and Technical Information (OSTI.GOV)
R.W. Ovink
2010-03-18
This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.
Cespi, Marco; Perinelli, Diego R; Casettari, Luca; Bonacucina, Giulia; Caporicci, Giuseppe; Rendina, Filippo; Palmieri, Giovanni F
2014-12-30
The use of process analytical technologies (PAT) to ensure final product quality is by now a well-established practice in the pharmaceutical industry. To date, most of the efforts in this field have focused on the development of analytical methods using spectroscopic techniques (i.e., NIR, Raman, etc.). This work evaluated the possibility of using parameters derived from the processing of in-line raw compaction data (the forces and displacements of the punches) as a PAT tool for controlling the tableting process. To reach this goal, two commercially available formulations were used, changing their quantitative composition and compressing them on a fully instrumented rotary press. The Heckel yield pressure and the compaction energies, together with tablet hardness and compaction pressure, were selected and evaluated as discriminating parameters in all the prepared formulations. The apparent yield pressure, as shown by the results, has the sensitivity necessary to be effectively included in a PAT strategy for monitoring the tableting process. Additional investigations were performed to understand the critical aspects and mechanisms behind this performance parameter and the associated implications. Specifically, it was found that the efficiency of the apparent yield pressure depends on the nominal drug content, the drug densification mechanism and the error in pycnometric density. In this study, the potential of using parameters derived from raw compaction data has been demonstrated to be an attractive alternative and complementary method to the well-established spectroscopic techniques for monitoring and controlling the tableting process. The compaction data monitoring method is also easy to set up and very cost effective. Copyright © 2014 Elsevier B.V. All rights reserved.
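The apparent yield pressure referred to here comes from Heckel analysis, which fits ln(1/(1-D)) against compaction pressure P over the linear portion of the in-die densification curve and takes the reciprocal of the slope as the yield pressure. A minimal sketch with synthetic data; the 50-150 MPa linear-region window is an assumption, since the actual window is formulation-dependent:

    import numpy as np

    def heckel_yield_pressure(pressure_mpa, rel_density, p_min=50, p_max=150):
        """Fit ln(1/(1-D)) = k*P + A over the linear region; return 1/k in MPa."""
        p = np.asarray(pressure_mpa, dtype=float)
        d = np.asarray(rel_density, dtype=float)
        mask = (p >= p_min) & (p <= p_max)        # assumed linear region
        y = np.log(1.0 / (1.0 - d[mask]))
        k, a = np.polyfit(p[mask], y, 1)
        return 1.0 / k

    # Illustrative in-die data (not from the paper)
    P = np.linspace(20, 200, 10)
    D = 1 - 0.45 * np.exp(-P / 120.0)             # synthetic densification curve
    print(f"Apparent yield pressure: {heckel_yield_pressure(P, D):.1f} MPa")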
Sciacovelli, Laura; Panteghini, Mauro; Lippi, Giuseppe; Sumarac, Zorica; Cadamuro, Janne; Galoro, César Alex De Olivera; Pino Castro, Isabel Garcia Del; Shcolnik, Wilson; Plebani, Mario
2017-08-28
Improving the quality of laboratory testing requires a deep understanding of the many vulnerable steps involved in the total examination process (TEP), along with the identification of a hierarchy of risks and challenges that need to be addressed. From this perspective, the Working Group "Laboratory Errors and Patient Safety" (WG-LEPS) of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) is focusing its activity on the implementation of an efficient tool for obtaining meaningful information on the risk of errors developing throughout the TEP, and for establishing reliable information about error frequencies and their distribution. More recently, the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) has created the Task and Finish Group "Performance specifications for the extra-analytical phases" (TFG-PSEP) to define performance specifications for the extra-analytical phases. Both the IFCC and EFLM groups are working to provide laboratories with a system to evaluate their performance and recognize the critical aspects where improvement actions are needed. A Consensus Conference was organized in Padova, Italy, in 2016 in order to bring together experts and interested parties to achieve a consensus for effective harmonization of quality indicators (QIs). A general agreement was achieved, and the main outcomes have been the release of a new version of the model of quality indicators (MQI), the approval of a criterion for establishing performance specifications, and the definition of the type of information that should be provided within reports to the clinical laboratories participating in the QIs project.
IBM's Health Analytics and Clinical Decision Support.
Kohn, M S; Sun, J; Knoop, S; Shabo, A; Carmeli, B; Sow, D; Syed-Mahmood, T; Rapp, W
2014-08-15
This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described, and examples of the role of each resource are given. There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources themselves do not drive, but support, healthcare transformation.
Yasukawa, Keiko; Shimosawa, Tatsuo; Okubo, Shigeo; Yatomi, Yutaka
2018-01-01
Background: Human mercaptalbumin and human non-mercaptalbumin have been reported as markers for various pathological conditions, such as kidney and liver diseases, and play important roles in redox regulation throughout the body. Despite the relevance of these markers in various pathophysiologic conditions, measurements of human mercaptalbumin and non-mercaptalbumin have not become widespread because of the technical complexity and long measurement times of conventional methods. Methods: Based on previous reports, we explored the optimal analytical conditions for a high-performance liquid chromatography method using an anion-exchange column packed with a hydrophilic polyvinyl alcohol gel. The method was then validated through performance tests as well as measurements of various patients' serum samples. Results: We successfully established a reliable high-performance liquid chromatography method with an analytical time of only 12 min per test. The repeatability (within-day variability) and reproducibility (day-to-day variability) were 0.30% and 0.27% (CV), respectively. A very good correlation was obtained with the results of the conventional method. Conclusions: A practical method for the clinical measurement of human mercaptalbumin and non-mercaptalbumin was established. This high-performance liquid chromatography method is expected to be a powerful tool for expanding the clinical usefulness of these markers and elucidating the roles of albumin in redox reactions throughout the human body.
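Assays of this kind typically report the mercaptalbumin fraction as a ratio of chromatographic peak areas. A minimal sketch under that common assumption; the peak areas and the two-peak split of non-mercaptalbumin below are illustrative, not values from this study:

    def albumin_fractions(area_hma, area_hna1, area_hna2):
        """Percent fractions from HPLC peak areas (assumed area-normalized)."""
        total = area_hma + area_hna1 + area_hna2
        return (100 * area_hma / total,
                100 * (area_hna1 + area_hna2) / total)

    hma_pct, hna_pct = albumin_fractions(72.5, 24.1, 3.4)  # hypothetical areas
    print(f"HMA: {hma_pct:.1f}%  HNA: {hna_pct:.1f}%")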
Constructing the collective unconscious.
Gullatz, Stefan
2010-11-01
Innovative attempts at collating Jungian analytical psychology with a range of 'post-modern' theories have yielded significant results. This paper adopts an alternative strategy: a Lacanian vantage point on Jungian theory that eschews any attempt at reconciling Jung with post-structuralism. A focused Lacanian gaze on Jung establishes an irreducible tension between Jung's view of archetypes as factors immanent to the psyche and a Lacanian critique that lays bare the contingent structures and mechanisms of their constitution, unveiling the supposed archetypes' a posteriori production through the efficacy of a discursive field. Theories of ideology developed in the wake of Lacan provide a powerful methodological tool for bringing this distinction into focus. An assembly of Lacan's fragmentary accounts of Jung is supplemented with an approach to Jungian theory via Žižek's Lacan-oriented theory of the signifying mechanism underpinning 'ideology'. Accordingly, the Jungian archetype of the self, which is considered in some depth, can begin to be seen in a new light, namely as a 'master signifier', not only of Jung's academic edifice but also - and initially - of the discursive strategies that establish his own subjectivity. A discussion of Jung's approach to mythology reveals how the 'quilting point' of his discourse comes to be coupled with a correlate in the Real, a non-discursive 'sublime object' conferring upon archetypes their fascinating aura. © 2010, The Society of Analytical Psychology.
Immunoelectron microscopy in embryos.
Sierralta, W D
2001-05-01
Immunogold labeling of proteins in sections of embryos embedded in acrylate media provides an important analytical tool when the resolving power of the electron microscope is required to define sites of protein function. The protocol presented here was established to analyze the role and dynamics of the activated protein kinase C/Rack1 regulatory system in the patterning and outgrowth of limb bud mesenchyme. With minor changes, especially in the composition of the fixative solution, the protocol should be easily adaptable for the postembedding immunogold labeling of any other antigen in tissues of embryos of diverse species. Quantification of the labeling can be achieved by using electron microscope systems capable of supporting digital image analysis. Copyright 2001 Academic Press.
Proteoglycomics: Recent Progress and Future Challenges
Ly, Mellisa; Laremore, Tatiana N.
2010-01-01
Proteoglycomics is a systematic study of the structure, expression, and function of proteoglycans, a posttranslationally modified subset of the proteome. Although relying on the established technologies of proteomics and glycomics, proteoglycomics research requires unique approaches for elucidating structure-function relationships of both proteoglycan components, the glycosaminoglycan chain and the core protein. This review discusses our current understanding of the structure and function of proteoglycans, major players in development, normal physiology, and disease. A brief outline of proteoglycomic sample preparation and analysis is provided, along with examples of several recent proteoglycomic studies. Unique challenges in the characterization of the glycosaminoglycan component of proteoglycans are discussed, with emphasis on the many analytical tools used and the types of information they provide. PMID:20450439
Determination of fragrance content in perfume by Raman spectroscopy and multivariate calibration.
Godinho, Robson B; Santos, Mauricio C; Poppi, Ronei J
2016-03-15
An alternative methodology is herein proposed for the determination of fragrance content in perfumes and their classification according to the guidelines established by fine perfume manufacturers. The methodology is based on Raman spectroscopy associated with multivariate calibration, allowing the determination of fragrance content in a fast, nondestructive, and sustainable manner. The results were consistent with the conventional method, with standard error of prediction values lower than 1.0%. This indicates that the proposed technology is a feasible analytical tool for determining the fragrance content of a hydro-alcoholic solution for use in manufacturing, quality control and regulatory agencies. Copyright © 2015 Elsevier B.V. All rights reserved.
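Multivariate calibration of spectra against reference contents is commonly done with partial least squares regression. A minimal sketch, assuming PLS (the paper does not state its exact calibration algorithm) and using synthetic stand-in spectra; in practice X would hold preprocessed Raman spectra and y the reference fragrance contents:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    y = rng.uniform(5, 25, size=40)                  # fragrance content, %
    band = np.exp(-0.5 * ((np.arange(600) - 300) / 40.0) ** 2)
    X = np.outer(y, band) + rng.normal(0, 0.05, (40, 600))  # synthetic spectra

    pls = PLSRegression(n_components=3)
    y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
    print(f"RMSECV: {rmsecv:.2f} %  (the paper reports SEP < 1.0%)")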
Postcolonial nursing scholarship: from epistemology to method.
Kirkham, Sheryl Reiner; Anderson, Joan M
2002-09-01
Postcolonial theory, with its interpretations of race, racialization, and culture, offers nursing scholarship a set of powerful analytic tools unlike those offered by other nursing and social theories. Building on the foundation established by those who first pointed to the importance of incorporating cultural aspects into nursing care, nursing scholarship is in a position to move forward. Critical perspectives such as postcolonialism equip us to meet the epistemological imperative of giving voice to subjugated knowledges and the social mandates of uncovering existing inequities and addressing the social aspects of health and illness. This article makes a case for the integration of postcolonial perspectives into theorizing and sketches out a research methodology based on the postcolonial tradition.
Sun, Jared H; Twomey, Michele; Tran, Jeffrey; Wallis, Lee A
2012-11-01
Ninety percent of emergency incidents occur in developing countries, and this is only expected to worsen as these nations develop. As a result, governments in developing countries are establishing emergency care systems. However, there is currently no widely usable, objective method to monitor or research the rapid growth of emergency care in the developing world. This study analyses current quantitative methods for assessing emergency care in developing countries and proposes a more appropriate method. Currently accepted methods to quantitatively assess the efficacy of emergency care systems cannot be applied in most developing countries due to weak record-keeping infrastructure and the inappropriateness of applying Western-derived coefficients to developing-country conditions. As a result, although emergency care in the developing world is rapidly growing, researchers and clinicians are unable to objectively measure its progress or determine which policies work best in their respective countries. We propose the TEWS methodology, a simple analytical tool that can be handled by low-resource, developing countries. By relying on the most basic universal parameters, the simplest calculations and a straightforward protocol, the TEWS methodology allows for widespread analysis of emergency care in the developing world. This could become essential in the establishment and growth of new emergency care systems worldwide.
ERIC Educational Resources Information Center
Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin
2016-01-01
In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…
Adequacy of surface analytical tools for studying the tribology of ceramics
NASA Technical Reports Server (NTRS)
Sliney, H. E.
1986-01-01
Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less known techniques that may also prove useful.
ERIC Educational Resources Information Center
Kuby, Candace R.
2014-01-01
An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…
Analytical Tools for Affordability Analysis
2015-05-01
David Tate; Cost Analysis and Research Division, Institute for Defense Analyses, Alexandria, VA. The report covers cost-model building blocks such as unit cost as a function of learning and production rate (Womer) and learning with forgetting, in which learning depreciates over time (Benkard).
Operational Analysis of Time-Optimal Maneuvering for Imaging Spacecraft
2013-03-01
An operational analysis of time-optimal maneuvering is presented for the Singapore-developed X-SAT imaging spacecraft. The analysis is facilitated through the use of AGI's Systems Tool Kit (STK) software and an Analytic Hierarchy Process (AHP)-based approach.
ERIC Educational Resources Information Center
Kilpatrick, Sue; Field, John; Falk, Ian
The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…
ERIC Educational Resources Information Center
Paranosic, Nikola; Riveros, Augusto
2017-01-01
This paper reports the results of a study that examined the ways a group of department heads in Ontario, Canada, describe their role. Despite their ubiquity and importance, department heads have been seldom investigated in the educational leadership literature. The study uses the metaphor as an analytic tool to examine the ways participants talked…
Analytical Tools for Behavioral Influences Operations
2003-12-01
This project is envisioned as a foundation for future work by NASIC analysts, who will use the tools identified in this study. Though the study took all three categories into account, most (90%) of the SRA team's effort focused on identifying and analyzing…
NASA Technical Reports Server (NTRS)
Baaklini, George Y.; Smith, Kevin; Raulerson, David; Gyekenyesi, Andrew L.; Sawicki, Jerzy T.; Brasche, Lisa
2003-01-01
Tools for Engine Diagnostics is a major task in the Propulsion System Health Management area of the Single Aircraft Accident Prevention project under NASA's Aviation Safety Program. The major goal of the Aviation Safety Program is to reduce fatal aircraft accidents by 80 percent within 10 years and by 90 percent within 25 years. The goal of the Propulsion System Health Management area is to eliminate propulsion system malfunctions as a primary or contributing factor to the cause of aircraft accidents. The purpose of Tools for Engine Diagnostics, a 2-yr-old task, is to establish and improve tools for engine diagnostics and prognostics that measure the deformation and damage of rotating engine components at the ground level and that perform intermittent or continuous on-wing monitoring. In this work, nondestructive-evaluation- (NDE-) based technology is combined with model-dependent disk spin experimental simulation systems, like finite element modeling (FEM) and modal norms, to monitor and predict rotor damage in real time. Fracture-mechanics-based time-dependent fatigue crack growth and damage-mechanics-based life estimation are being developed, and their potential use investigated. In addition, wireless eddy current and advanced acoustics are being developed for on-wing and just-in-time NDE engine inspection to provide deeper access and higher sensitivity, to extend on-wing capabilities and improve inspection readiness. In the long run, these methods could establish a base for prognostic sensing while an engine is running, without any overt actions, like inspections. This damage-detection strategy includes experimentally acquired vibration-, eddy-current- and capacitance-based displacement measurements and analytically computed FEM-, modal-norms-, and conventional-rotordynamics-based models of well-defined damage and critical mass imbalances in rotating disks and rotors.
Metabolomics studies in brain tissue: A review.
Gonzalez-Riano, Carolina; Garcia, Antonia; Barbas, Coral
2016-10-25
The brain is still an organ whose composition remains to be discovered but, beyond that, mental disorders and especially all diseases that course with dementia are devastating for the patient, the family and society. Metabolomics can offer an alternative tool for unveiling new insights into the discovery of new treatments and biomarkers of mental disorders. Until now, most metabolomic studies have been based on biofluids (serum/plasma or urine), because brain tissue accessibility is limited to animal models or post mortem studies; even so, brain tissue is crucial for understanding the pathological processes. Metabolomics studies of brain tissue imply several challenges due to sample extraction, along with brain heterogeneity, sample storage, and sample treatment for a wide coverage of metabolites with a wide range of concentrations of many lipophilic and some polar compounds. In this review, the current analytical practices for targeted and non-targeted metabolomics are described and discussed with emphasis on critical aspects: sample treatment (quenching, homogenization, filtration, centrifugation and extraction) and analytical methods, as well as findings in light of the strategies used. Besides that, the altered analytes in the different brain regions have been associated with their corresponding pathways to obtain a global overview of their dysregulation, trying to establish the link between altered biological pathways and pathophysiological conditions. Copyright © 2016 Elsevier B.V. All rights reserved.
Safdari, Reza; Ghazisaeedi, Marjan; Mirzaee, Mahboobeh; Farzi, Jebrail; Goodini, Azadeh
2014-01-01
Dynamic reporting tools, such as dashboards, should be developed to measure emergency department (ED) performance. However, choosing an effective, balanced set of performance measures and key performance indicators (KPIs) is a main challenge in accomplishing this. The aim of this study was to develop a balanced set of KPIs for use in ED strategic dashboards following an analytic hierarchy process (AHP). The study was carried out in two phases: constructing ED performance measures based on balanced scorecard perspectives, and incorporating them into an analytic hierarchy process framework to select the final KPIs. The respondents placed most importance on the ED internal processes perspective, especially on measures related to timeliness and accessibility of care in the ED. Some measures from the financial, customer, and learning and growth perspectives were also selected as top KPIs, and measures of care effectiveness and care safety were placed as the next priorities. The respondents placed least importance on disease-/condition-specific "time to" measures. The methodology can be presented as a reference model for the development of KPIs in various performance-related areas based on a consistent and fair approach. Dashboards designed on such a balanced set of KPIs will help to establish comprehensive performance measurements and fair benchmarks and comparisons.
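The AHP step referred to here derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks judgment consistency. A minimal sketch; the 3x3 matrix below compares three illustrative perspectives and is not the study's data:

    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],   # internal processes vs financial vs customer
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                              # priority weights, sum to 1

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)      # consistency index
    cr = ci / 0.58                            # random index RI = 0.58 for n = 3
    print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))

A consistency ratio below about 0.1 is conventionally taken to mean the pairwise judgments are acceptably coherent.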
Determination of Ignitable Liquids in Fire Debris: Direct Analysis by Electronic Nose
Ferreiro-González, Marta; Barbero, Gerardo F.; Palma, Miguel; Ayuso, Jesús; Álvarez, José A.; Barroso, Carmelo G.
2016-01-01
Arsonists usually use an accelerant in order to start or accelerate a fire. The most widely used analytical method to determine the presence of such accelerants consists of a pre-concentration step for the ignitable liquid residues followed by chromatographic analysis. A rapid analytical method based on headspace-mass spectrometry electronic nose (E-Nose) has been developed for the analysis of ignitable liquid residues (ILRs). The working conditions for the E-Nose analytical procedure were optimized by studying different fire debris samples. The optimized experimental variables were related to headspace generation, specifically incubation temperature and incubation time; the optimal conditions were 115 °C and 10 min. Chemometric tools such as hierarchical cluster analysis (HCA) and linear discriminant analysis (LDA) were applied to the MS data (45–200 m/z) to establish the most suitable spectroscopic signals for the discrimination of several ignitable liquids. The optimized method was applied to a set of fire debris samples. In order to simulate post-burn samples, several ignitable liquids (gasoline, diesel, citronella, kerosene, paraffin) were used to ignite different substrates (wood, cotton, cork, paper and paperboard). Full discrimination was obtained using discriminant analysis. The method reported here can be considered a green technique for fire debris analysis. PMID:27187407
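The LDA step treats each sample's m/z 45-200 intensity vector as a 156-dimensional fingerprint. A minimal sketch with synthetic stand-in fingerprints of the same shape (the real data would be the HS-MS abundances per accelerant class):

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    classes = ["gasoline", "diesel", "kerosene", "citronella", "paraffin"]
    # 12 synthetic fingerprints per class, 156 features (m/z 45-200)
    X = np.vstack([rng.normal(loc=i, scale=0.8, size=(12, 156))
                   for i in range(len(classes))])
    y = np.repeat(classes, 12)

    lda = LinearDiscriminantAnalysis()
    print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean())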
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
ERIC Educational Resources Information Center
Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie
2016-01-01
Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…
Tools for studying dry-cured ham processing by using computed tomography.
Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena
2012-01-11
Accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt content, water content and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), covering the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w), in terms of both content and distribution, throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
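The line-profile and ROI tools amount to simple array operations once the calibration models have been applied voxel-wise to the CT scan. A minimal sketch on a synthetic 2D slice of predicted salt content (the array and the gradient shape are illustrative only):

    import numpy as np

    # Synthetic slice: salt decreasing from the surface (column 0) inwards
    salt = np.fromfunction(lambda i, j: 2.0 + 3.0 * np.exp(-j / 40.0), (100, 200))

    profile = salt[50, :]          # line profile across the ham section
    roi = salt[40:60, 20:60]       # rectangular region of interest
    print(f"surface-to-core gradient: {profile[0]:.2f} -> {profile[-1]:.2f} %")
    print(f"ROI mean salt content: {roi.mean():.2f} %")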
Development of Multi-slice Analytical Tool to Support BIM-based Design Process
NASA Astrophysics Data System (ADS)
Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.
2017-03-01
This paper describes the ongoing development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space are experienced as a dynamic entity whose spatial properties may vary from one part of the space to another; the representation of space through standard architectural drawings is therefore sometimes not sufficient. The representation of space as a series of slices, each with certain properties, becomes important, so that the different characteristics in each part of the space can inform the design process. The analytical tool is developed as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool would be useful for assisting design development processes that apply BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. It allows the identification of how spatial properties change dynamically throughout the space and the prediction of potential design problems. Integrating the multi-slice analytical tool into a BIM-based design process could thereby assist architects in generating better designs and avoiding unnecessary costs that are often caused by failure to identify problems during design development stages.
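The multi-slice idea can be illustrated as cutting a point-sampled space model into slices along one axis and reporting a property per slice. A minimal sketch; the points, the property, and the slice count are illustrative, whereas in the described tool the geometry would come from the BIM export:

    import numpy as np

    rng = np.random.default_rng(2)
    pts = rng.uniform(0, 10, size=(5000, 3))           # sample points, metres
    daylight = np.clip(1.0 - pts[:, 0] / 10.0, 0, 1)   # synthetic spatial property

    edges = np.linspace(0, 10, 11)                     # 10 slices along x
    for a, b in zip(edges[:-1], edges[1:]):
        m = (pts[:, 0] >= a) & (pts[:, 0] < b)
        print(f"slice {a:4.1f}-{b:4.1f} m: mean daylight = {daylight[m].mean():.2f}")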
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
Sustainability Tools Inventory Initial Gap Analysis
This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...
Wang, Xiaoli; Knapp, Peter; Vaynman, S; Graham, M E; Cao, Jian; Ulmer, M P
2014-09-20
The desire to continuously gain new knowledge in astronomy has pushed the frontier of engineering methods to deliver lighter, thinner, higher-quality mirrors at an affordable cost for use in an x-ray observatory. To address these needs, we have been investigating the application of magnetic smart materials (MSMs) deposited as a thin film on mirror substrates. MSMs have some interesting properties that make their application to mirror substrates a promising solution for making the next generation of x-ray telescopes. Due to their ability to hold a shape under an impressed permanent magnetic field, MSMs have the potential to be the method used to make lightweight, affordable x-ray telescope mirrors. This paper presents the experimental setup for measuring the deformation of magnetostrictive bimorph specimens under an applied magnetic field, and the analytical and numerical analysis of the deformation. As a first step in the development of tools to predict deflections, we deposited Terfenol-D on glass substrates and compared measurements with the results of the analytical and numerical analysis. The surface profiles of thin-film specimens were measured under an external magnetic field with white light interferometry (WLI). The analytical model provides good predictions of film deformation behavior under various magnetic field strengths. This work establishes a solid foundation for further research analyzing the full three-dimensional deformation behavior of magnetostrictive thin films.
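The abstract does not reproduce the authors' analytical model; the standard first-order starting point for film-stress-driven bimorph curvature is Stoney's relation, quoted here only as the conventional reference formula for such analyses:

    \[
    \kappa \;=\; \frac{6\,\sigma_f\, t_f\,(1-\nu_s)}{E_s\, t_s^{2}},
    \qquad
    \delta \;\approx\; \frac{\kappa L^{2}}{2},
    \]

where sigma_f is the magnetostriction-induced film stress, t_f and t_s are the film and substrate thicknesses, E_s and nu_s are the substrate Young's modulus and Poisson ratio, and delta is the end deflection of a strip of length L.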
New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous-energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems: simple_ace.pl and simple_ace_mg.pl.
Marek, Lukáš; Tuček, Pavel; Pászto, Vít
2015-01-28
Visual analytics aims to connect the processing power of information technologies with the user's capacity for logical thinking and reasoning through complex visual interaction. Moreover, most data contain a spatial component, so the need for geovisual tools and methods arises. One can either develop one's own system, in which case the dissemination of findings and usability might be problematic, or utilize a widespread and well-known platform. The aim of this paper is to prove the applicability of Google Earth™ software as a tool for geovisual analytics that helps in understanding the spatio-temporal patterns of disease distribution. We combined complex joint spatio-temporal analysis with comprehensive visualisation, analysing the spatio-temporal distribution of campylobacteriosis in the Czech Republic between 2008 and 2012. We applied three main approaches in the study: (1) geovisual analytics of the surveillance data, visualised in the form of a bubble chart; (2) geovisual analytics of the disease's weekly incidence surfaces computed by spatio-temporal kriging; and (3) spatio-temporal scan statistics, employed in order to identify clusters of affected municipalities with high or low rates. The final data are stored in Keyhole Markup Language files and visualised in Google Earth™ in order to apply geovisual analytics. Using geovisual analytics we were able to display and retrieve information from a complex dataset efficiently. Instead of searching for patterns in a series of static maps or using numerical statistics, we created a set of interactive visualisations in order to explore and communicate the results of the analyses to a wider audience. The geovisual analytics identified periodic patterns in the behaviour of the disease as well as fourteen spatio-temporal clusters of increased relative risk. We show that Google Earth™ software is a usable tool for geovisual analysis of disease distribution. Google Earth™ has many indisputable advantages (it is widespread, freely available and intuitive, with space-time visualisation capabilities, animations, and easy communication of results); nevertheless, it still needs to be combined with pre-processing tools that prepare the data in a form suitable for geovisual analytics itself.
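The Keyhole Markup Language files mentioned here are plain XML, so pre-processing can be scripted with the standard library alone. A minimal sketch writing incidence values as placemarks for Google Earth; the coordinates, week labels and values are illustrative, not the study's data:

    # Sketch: municipality-level weekly incidence as KML placemarks (stdlib only)
    cases = [("Olomouc", 17.251, 49.594, "2010-W14", 12.4),
             ("Brno",    16.608, 49.195, "2010-W14",  8.9)]

    pm = ('<Placemark><name>{n} {w}</name>'
          '<description>incidence {v} per 100k</description>'
          '<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>')
    body = "".join(pm.format(n=n, w=w, v=v, lon=lon, lat=lat)
                   for n, lon, lat, w, v in cases)

    with open("campylobacteriosis.kml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>'
                '<kml xmlns="http://www.opengis.net/kml/2.2">'
                f'<Document>{body}</Document></kml>')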
ERIC Educational Resources Information Center
Pei-Ling Tan, Jennifer; Koh, Elizabeth; Jonathan, Christin; Yang, Simon
2017-01-01
The affordances of learning analytics (LA) tools and solutions are being increasingly harnessed for enhancing 21st century pedagogical and learning strategies and outcomes. However, use cases and empirical understandings of students' experiences with LA tools and environments aimed at fostering 21st century literacies, especially in the K-12…
Visual Information for the Desktop, version 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2006-03-29
VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.
ERIC Educational Resources Information Center
Teplovs, Chris
2015-01-01
This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…
Search Analytics: Automated Learning, Analysis, and Search with Open Source
NASA Astrophysics Data System (ADS)
Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.
2016-12-01
The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts, often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics; it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extraction and visualization of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g., Landsat 8), domain-specific measurements (e.g., spectral coverage) and subjects (e.g., invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper, we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82-84%, which could have tangible, direct downstream implications for crop protection. Automatically assimilating this information expedites and supplements human analysis and, ultimately, Search Analytics and its foundation of open source tools will result in more efficient scientific investment and research.
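The proximity-pairing step can be illustrated on text that has already been extracted from a document (e.g., by Apache Tika): scan for a spatial-resolution figure and an accuracy figure occurring together. A minimal sketch; the regexes and the sample sentence below are illustrative, not the project's actual extraction rules:

    import re

    text = ("In this study, hyperspectral images with high spatial resolution "
            "(1 m) were analyzed to detect cutleaf teasel in two areas. "
            "Classification of cutleaf teasel reached a users accuracy of 82 to 84%.")

    res = re.findall(r"(\d+(?:\.\d+)?)\s*m\b", text)              # resolution, metres
    acc = re.findall(r"accuracy of (\d+)\s*to\s*(\d+)\s*%", text)  # accuracy range
    if res and acc:
        print(f"resolution {res[0]} m -> user's accuracy {acc[0][0]}-{acc[0][1]}%")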
Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm
NASA Technical Reports Server (NTRS)
Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will
2016-01-01
With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.
Development of a Boundary Layer Property Interpolation Tool in Support of Orbiter Return To Flight
NASA Technical Reports Server (NTRS)
Greene, Francis A.; Hamilton, H. Harris
2006-01-01
A new tool was developed to predict the boundary layer quantities required by several physics-based predictive/analytic methods that assess damaged Orbiter tile. This new tool, the Boundary Layer Property Prediction (BLPROP) tool, supplies boundary layer values used in correlations that determine boundary layer transition onset and surface heating-rate augmentation/attenuation factors inside tile gouges (i.e., cavities). BLPROP interpolates through a database of computed solutions and provides boundary layer and wall data (delta, theta, Re_theta/M_e, Re_theta, P_w, and q_w) based on user-input surface location and free-stream conditions. Surface locations are limited to the Orbiter's windward surface. Constructed using predictions from an inviscid-with-boundary-layer method and benchmark viscous CFD, the computed database covers the hypersonic continuum flight regime based on two reference flight trajectories. First-order one-dimensional Lagrange interpolation accounts for Mach number and angle-of-attack variations, whereas non-dimensional normalization accounts for differences between the reference and input Reynolds numbers. Employing the same computational methods used to construct the database, solutions at other trajectory points taken from previous STS flights were computed; these results validate the BLPROP algorithm. Percentage differences between interpolated and computed values are presented and are used to establish the level of uncertainty of the new tool.
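The first-order Lagrange interpolation described here reduces, on a structured grid, to bilinear interpolation in Mach number and angle of attack. A minimal sketch of the lookup idea for one boundary-layer quantity; the grid and table values are illustrative, not from the Orbiter database:

    import numpy as np

    mach = np.array([10.0, 15.0, 20.0, 25.0])
    alpha = np.array([35.0, 40.0, 45.0])
    theta = np.arange(12, dtype=float).reshape(4, 3)   # toy momentum-thickness table

    def interp2(m, a):
        """Two-point Lagrange (bilinear) interpolation in Mach and alpha."""
        i = np.clip(np.searchsorted(mach, m) - 1, 0, len(mach) - 2)
        j = np.clip(np.searchsorted(alpha, a) - 1, 0, len(alpha) - 2)
        tm = (m - mach[i]) / (mach[i + 1] - mach[i])
        ta = (a - alpha[j]) / (alpha[j + 1] - alpha[j])
        return ((1 - tm) * (1 - ta) * theta[i, j] + tm * (1 - ta) * theta[i + 1, j]
                + (1 - tm) * ta * theta[i, j + 1] + tm * ta * theta[i + 1, j + 1])

    print(interp2(18.0, 41.0))   # query between grid points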
Analysis of laparoscopy in trauma.
Villavicencio, R T; Aucar, J A
1999-07-01
The optimum roles for laparoscopy in trauma have yet to be established. To date, reviews of laparoscopy in trauma have been primarily descriptive rather than analytic. This article analyzes the results of laparoscopy in trauma. Outcome analysis was done by reviewing 37 studies with more than 1,900 trauma patients, and laparoscopy was analyzed as a screening, diagnostic, or therapeutic tool. Laparoscopy was regarded as a screening tool if it was used to detect or exclude a positive finding (eg, hemoperitoneum, organ injury, gastrointestinal spillage, peritoneal penetration) that required operative exploration or repair. Laparoscopy was regarded as a diagnostic tool when it was used to identify all injuries, rather than as a screening tool to identify the first indication for a laparotomy. It was regarded as a diagnostic tool only in studies that mandated a laparotomy (gold standard) after laparoscopy to confirm the diagnostic accuracy of laparoscopic findings. Costs and charges for using laparoscopy in trauma were analyzed when feasible. As a screening tool, laparoscopy missed 1% of injuries and helped prevent 63% of patients from having a trauma laparotomy. When used as a diagnostic tool, laparoscopy had a 41% to 77% missed injury rate per patient. Overall, laparoscopy carried a 1% procedure-related complication rate. Cost-effectiveness has not been uniformly proved in studies comparing laparoscopy and laparotomy. Laparoscopy has been applied safely and effectively as a screening tool in stable patients with acute trauma. Because of the large number of missed injuries when used as a diagnostic tool, its value in this context is limited. Laparoscopy has been reported infrequently as a therapeutic tool in selected patients, and its use in this context requires further study.
NASA Technical Reports Server (NTRS)
Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.
1992-01-01
A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer-aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model, providing the designer with a database and a deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for the development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
ERIC Educational Resources Information Center
Chen, Bodong
2015-01-01
In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…
Microgenetic Learning Analytics Methods: Workshop Report
ERIC Educational Resources Information Center
Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin
2016-01-01
Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…
IBM’s Health Analytics and Clinical Decision Support
Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.
2014-01-01
Summary Objectives This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidenced-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation. PMID:25123736
NASA Astrophysics Data System (ADS)
Lin, Xiangyue; Peng, Minli; Lei, Fengming; Tan, Jiangxian; Shi, Huacheng
2017-12-01
Based on the assumptions of uniform corrosion and linear elastic expansion, an analytical model of cracking due to rebar corrosion expansion in concrete was established that is able to account for the internal forces of the structure. Then, by means of the complex variable function theory and series expansion techniques established by Muskhelishvili, the corresponding stress component functions of the concrete around the reinforcement were obtained. A comparative analysis was also conducted between a numerical simulation model and the model presented in this paper. The results show that the calculation results of the two methods are consistent with each other, with a numerical deviation of less than 10%, indicating that the analytical model established in this paper is reliable.
Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit
2016-03-01
Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards--the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.
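As background on how gene set enrichment tools of this kind score an overlap, the sketch below applies a standard hypergeometric over-representation test. The gene names, set sizes, and background size are hypothetical, and this is generic methodology rather than GeneAnalytics' proprietary scoring algorithm.

```python
# Generic gene set over-representation scoring (illustrative only; not
# GeneAnalytics' evidence-based algorithm): hypergeometric tail test.
from scipy.stats import hypergeom

def enrichment_p(query_genes, gene_set, background_size):
    query, members = set(query_genes), set(gene_set)
    overlap = len(query & members)
    # P(X >= overlap) when drawing len(query) genes from a background
    # containing len(members) set members.
    return hypergeom.sf(overlap - 1, background_size,
                        len(members), len(query))

# Hypothetical example: a 40-gene signature vs. a 200-gene pathway
# in a 20,000-gene background (10 genes overlap).
query = [f"GENE{i}" for i in range(40)]
pathway = [f"GENE{i}" for i in range(30, 230)]
print(enrichment_p(query, pathway, 20000))
```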
NASA Technical Reports Server (NTRS)
Marsik, S. J.; Morea, S. F.
1985-01-01
A research and technology program for advanced high pressure, oxygen-hydrogen rocket propulsion technology is presently being pursued by the National Aeronautics and Space Administration (NASA) to establish the basic discipline technologies, develop the analytical tools, and establish the data base necessary for an orderly evolution of the staged combustion reusable rocket engine. The need for the program is based on the premise that the USA will depend on the Shuttle and its derivative versions as its principal Earth-to-orbit transportation system for the next 20 to 30 yr. The program is focused in three principal areas of enhancement: (1) life extension, (2) performance, and (3) operations and diagnosis. Within the technological disciplines the efforts include: rotordynamics, structural dynamics, fluid and gas dynamics, materials fatigue/fracture/life, turbomachinery fluid mechanics, ignition/combustion processes, manufacturing/producibility/nondestructive evaluation methods and materials development/evaluation. An overview of the Advanced High Pressure Oxygen-Hydrogen Rocket Propulsion Technology Program Structure and Working Groups objectives are presented with highlights of several significant achievements.
Data Intensive Computing on Amazon Web Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Magana-Zook, S. A.
The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as "big data" tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and many more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster ("Cluster A") was set up as a collaboration between GMP and Livermore Computing (LC).
Lagrangian based methods for coherent structure detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allshouse, Michael R., E-mail: mallshouse@chaos.utexas.edu; Peacock, Thomas, E-mail: tomp@mit.edu
There has been a proliferation in the development of Lagrangian analytical methods for detecting coherent structures in fluid flow transport, yielding a variety of qualitatively different approaches. We present a review of four approaches and demonstrate the utility of these methods via their application to the same sample analytic model, the canonical double-gyre flow, highlighting the pros and cons of each approach. Two of the methods, the geometric and probabilistic approaches, are well established and require velocity field data over the time interval of interest to identify particularly important material lines and surfaces, and influential regions, respectively. The other two approaches, implementing tools from cluster and braid theory, seek coherent structures based on limited trajectory data, attempting to partition the flow transport into distinct regions. All four of these approaches share the common trait that they are objective methods, meaning that their results do not depend on the frame of reference used. For each method, we also present a number of example applications ranging from blood flow and chemical reactions to ocean and atmospheric flows.
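For readers who want to reproduce the benchmark, the double-gyre velocity field has a standard closed form in the Lagrangian coherent structure literature. The sketch below implements it with deliberately minimal forward-Euler particle advection; the parameter values are the commonly used ones, and the integrator choice is an assumption for illustration.

```python
# The canonical double-gyre flow on [0,2]x[0,1] (standard form from the
# LCS literature), with simple particle advection to generate trajectories.
import numpy as np

A, eps, omega = 0.1, 0.25, 2*np.pi/10  # common parameter choice

def velocity(x, y, t):
    f    = eps*np.sin(omega*t)*x**2 + (1 - 2*eps*np.sin(omega*t))*x
    dfdx = 2*eps*np.sin(omega*t)*x + (1 - 2*eps*np.sin(omega*t))
    u = -np.pi*A*np.sin(np.pi*f)*np.cos(np.pi*y)
    v =  np.pi*A*np.cos(np.pi*f)*np.sin(np.pi*y)*dfdx
    return u, v

def advect(x, y, t0, t1, steps=500):
    """Forward-Euler trajectory integration (minimal, for illustration)."""
    dt = (t1 - t0)/steps
    for k in range(steps):
        u, v = velocity(x, y, t0 + k*dt)
        x, y = x + dt*u, y + dt*v
    return x, y

# Trajectory data of this kind is the input the cluster- and braid-based
# methods operate on.
print(advect(1.0, 0.5, 0.0, 10.0))
```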
DNA barcode-based delineation of putative species: efficient start for taxonomic workflows
Kekkonen, Mari; Hebert, Paul D N
2014-01-01
The analysis of DNA barcode sequences with varying techniques for cluster recognition provides an efficient approach for recognizing putative species (operational taxonomic units, OTUs). This approach accelerates and improves taxonomic workflows by exposing cryptic species and decreasing the risk of synonymy. This study tested the congruence of OTUs resulting from the application of three analytical methods (ABGD, BIN, GMYC) to sequence data for Australian hypertrophine moths. OTUs supported by all three approaches were viewed as robust, but 20% of the OTUs were only recognized by one or two of the methods. These OTUs were examined for three criteria to clarify their status. Monophyly and diagnostic nucleotides were both uninformative, but information on ranges was useful as sympatric sister OTUs were viewed as distinct, while allopatric OTUs were merged. This approach revealed 124 OTUs of Hypertrophinae, a more than twofold increase from the currently recognized 51 species. Because this analytical protocol is both fast and repeatable, it provides a valuable tool for establishing a basic understanding of species boundaries that can be validated with subsequent studies. PMID:24479435
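The congruence test at the heart of this protocol reduces to comparing partitions of the same specimens. The sketch below shows one minimal way to flag specimens whose grouping differs between methods; the toy partitions and specimen names are hypothetical, not the ABGD/BIN/GMYC outputs themselves.

```python
# Illustrative congruence check across OTU delimitation methods
# (toy data; real inputs would be ABGD/BIN/GMYC partitions).
def partition_key(partition):
    """Map each specimen to the frozenset of its cluster co-members,
    so partitions compare correctly regardless of cluster label names."""
    return {s: frozenset(members)
            for members in partition for s in members}

def robust_specimens(partitions):
    keys = [partition_key(p) for p in partitions]
    return [s for s in keys[0]
            if all(k[s] == keys[0][s] for k in keys[1:])]

abgd = [{"m1", "m2"}, {"m3"}]   # three moth specimens, three methods
bin_ = [{"m1", "m2"}, {"m3"}]
gmyc = [{"m1"}, {"m2"}, {"m3"}]
print(robust_specimens([abgd, bin_, gmyc]))  # only 'm3' is congruent
```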
Flat-plate solar array project. Volume 8: Project analysis and integration
NASA Technical Reports Server (NTRS)
Mcguire, P.; Henry, P.
1986-01-01
Project Analysis and Integration (PA&I) performed planning and integration activities to support management of the various Flat-Plate Solar Array (FSA) Project R&D activities. Technical and economic goals were established by PA&I for each R&D task within the project to coordinate the thrust toward the National Photovoltaic Program goals. A sophisticated computer modeling capability was developed to assess technical progress toward meeting the economic goals. These models included a manufacturing facility simulation, a photovoltaic power station simulation, and a decision-aid model incorporating uncertainty. This family of analysis tools was used to track the progress of the technology and to explore the effects of alternative technical paths. Numerous studies conducted by PA&I signaled the achievement of milestones or formed the foundation of major FSA project and national program decisions. The most important PA&I activities during the project history are summarized. The PA&I planning function and its relation to project direction are discussed, and the important analytical models developed by PA&I for its analysis and assessment activities are reviewed.
NASA Astrophysics Data System (ADS)
Ilg, Patrick; Evangelopoulos, Apostolos E. A. S.
2018-03-01
While magnetic nanoparticles suspended in Newtonian solvents (ferrofluids) have been intensively studied in recent years, the effects of viscoelasticity of the surrounding medium on the nanoparticle dynamics are much less understood. Here we investigate a mesoscopic model for the orientational dynamics of isolated magnetic nanoparticles subject to external fields, viscous and viscoelastic friction, as well as the corresponding random torques. We solve the model analytically in the overdamped limit for weak viscoelasticity. By comparison to Brownian dynamics simulations we establish the limits of validity of the analytical solution. We find that viscoelasticity not only slows down the magnetization relaxation, shifts the peak of the imaginary magnetic susceptibility χ″ to lower frequencies, and increases the magnetoviscosity but also leads to nonexponential relaxation and a broadening of χ″. The model we study also allows us to test a recent proposal for using magnetic susceptibility measurements as a nanorheological tool using a variant of the Gemant-DiMarzio-Bishop relation. We find for the present model and certain parameter ranges that the relation of the magnetic susceptibility to the shear modulus is satisfied to a good approximation.
Commented review of the Colombian legislation regarding the ethics of health research.
Lopera, Mónica María
2017-12-01
The scope of ethics in health research transcends its legal framework and the regulations established in Resolution 8430 of 1993. These norms represent a fundamental tool for determining the minimum protection standards for research subjects, and, therefore, they should be known, applied properly, and reflected upon by all researchers in the field. Here I present and discuss from an analytical point of view the regulations that guide research in health. In this framework, health is understood as a multidimensional process, and research in health as a multidisciplinary exercise involving basic, clinical and public health research, collective health, and other related sciences. The main analytical categories relate to the principles and actors involved in research (regulatory authorities, ethical committees, and special or vulnerable subjects and populations) and to professional ethics codes, in addition to informed consent and data management. Despite the contribution of this legislation to the qualification of health research, my conclusion is that the national legislation on ethics for health research requires updating with regard to technological and scientific developments, as well as specifications for the multiple types of health studies.
Golubović, Jelena; Protić, Ana; Otašević, Biljana; Zečević, Mira
2016-04-01
QSRRs are mathematically derived relationships between the chromatographic parameters determined for a representative series of analytes in given separation systems and the molecular descriptors accounting for the structural differences among the investigated analytes. An artificial neural network (ANN) is a data analysis technique that sets out to emulate the human brain's way of working. The aim of the present work was to optimize the separation of six angiotensin receptor antagonists, the so-called sartans: losartan, valsartan, irbesartan, telmisartan, candesartan cilexetil and eprosartan, in a gradient-elution HPLC method. For this purpose, an ANN was used as the mathematical tool for establishing a QSRR model based on the molecular descriptors of the sartans and varied instrumental conditions. The optimized model can be further used for prediction of an external congener of the sartans and for analysis of the influence of analyte structure, represented through molecular descriptors, on retention behaviour. The molecular descriptors included in the modelling were electrostatic, geometrical and quantum-chemical: Connolly solvent-excluded volume, non-1,4 van der Waals energy, octanol/water distribution coefficient, polarizability, number of proton-donor sites and number of proton-acceptor sites. The varied instrumental conditions were gradient time, buffer pH and buffer molarity. The high prediction ability of the optimized network enabled complete separation of the analytes within a run time of 15.5 min under the following conditions: gradient time of 12.5 min, buffer pH of 3.95 and buffer molarity of 25 mM. The applied methodology showed the potential to predict the retention behaviour of an external analyte with properties within the training space. Connolly solvent-excluded volume, polarizability and number of proton-acceptor sites appeared to be the most influential parameters for the retention behaviour of the sartans. Copyright © 2015 Elsevier B.V. All rights reserved.
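A minimal sketch of this kind of QSRR model is given below, assuming synthetic descriptor data and a small multilayer perceptron. The descriptor columns mirror those listed above, but the values, network size, and response surface are illustrative only, not the study's actual data or architecture.

```python
# Hedged sketch of a QSRR-style model: an MLP maps molecular descriptors
# plus instrumental conditions to retention time (synthetic data only).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# columns: volume, logD, polarizability, H-donors, H-acceptors,
#          gradient time, buffer pH, buffer molarity
X = rng.uniform(size=(60, 8))
t_r = 5 + 8*X[:, 1] + 3*X[:, 5] - 2*X[:, 6] + rng.normal(0, 0.2, 60)  # toy retention

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                     random_state=0).fit(scaler.transform(X), t_r)
# Predict retention for an "external" analyte within the training space.
print(model.predict(scaler.transform(X[:1])))
```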
Gu, Liqiang; Hou, Pengyi; Zhang, Ruowen; Liu, Ziying; Bi, Kaishun; Chen, Xiaohui
2016-10-15
A previous metabolomics study demonstrated that tyrosine metabolism might be disrupted by treatment with Semen Strychni in a cell nephrotoxicity model. To investigate the relationship between Semen Strychni alkaloids (SAs) and endogenous tyrosine and tyramine under nephrotoxic conditions, an HILIC-ESI-MS/MS-based analytical strategy was applied in this study. Based on the established Semen Strychni nephrotoxicity cell model, strychnine and brucine were identified and screened as the main SAs by an HPLC-Q Exactive hybrid quadrupole Orbitrap mass system. Then, a sensitive HILIC-ESI-MS/MS method was developed to simultaneously monitor strychnine, brucine, tyrosine and tyramine in cell lysate. The analytes were separated on a Shiseido CAPCELL CORE PC (150 mm × 2.1 mm, 2.7 μm) HILIC column in an acetonitrile/0.1% formic acid gradient system. All the calibration curves were linear, with regression coefficients above 0.9924. The absolute recoveries were more than 80.5%, and the matrix effects were between 91.6% and 107.0%. With the developed method, the analytes were successfully determined in cell lysates. Decreased levels of tyrosine and tyramine were observed only in combination with increased levels of SAs, indicating that the disturbance of tyrosine metabolism might be induced by the accumulation of SAs in kidney cells after exposure to Semen Strychni. The HILIC-ESI-MS/MS-based analytical strategy is a useful tool to reveal the relationships between toxic herb components and endogenous metabolite profiles in the toxicity investigation of herbal medicines. Copyright © 2016 Elsevier B.V. All rights reserved.
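The validation figures quoted above (linearity, absolute recovery, matrix effect) follow standard bioanalytical definitions. The sketch below shows the arithmetic on hypothetical responses, taking recovery as pre-extraction spike over post-extraction spike and matrix effect as post-extraction spike over neat standard, which is one common convention; the numbers are invented for illustration.

```python
# Illustrative bioanalytical validation arithmetic (hypothetical numbers).
import numpy as np

conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0])        # standard concentrations
peak = np.array([0.21, 1.02, 2.05, 10.3, 20.1])    # instrument responses

slope, intercept = np.polyfit(conc, peak, 1)       # linear calibration
pred = slope*conc + intercept
r2 = 1 - np.sum((peak - pred)**2) / np.sum((peak - peak.mean())**2)

pre_spike, post_spike, neat = 1.71, 1.95, 2.05     # same nominal level
recovery = 100 * pre_spike / post_spike            # extraction efficiency
matrix_effect = 100 * post_spike / neat            # suppression/enhancement
print(f"r2={r2:.4f}, recovery={recovery:.1f}%, "
      f"matrix effect={matrix_effect:.1f}%")
```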
Haze Gray Paint and the U.S. Navy: A Procurement Process Review
2017-12-01
The research encompasses both qualitative and quantitative analytical tools, utilizing historical demand data for Silicone Alkyd paint and the inventory level of 1K Polysiloxane in support of the fleet. As discussed in the Summary section, the research used a qualitative and a quantitative approach to analyze the Polysiloxane procurement process.
Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools
2014-01-14
Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools. Final Technical Report SERC-2014-TR-041-1, January 14, 2014. Sponsored by the U.S. Department of Defense through the Systems Engineering Research Center (SERC) under Contract H98230-08-D-0171 (Task Order 0026, RT 51). SERC is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology.
Visualization and Analytics Software Tools for Peregrine System
Learn about the visualization and analytics software tools available for the Peregrine system. R is a language and environment for statistical computing and graphics; go to the R web site for more information. FastX provides remote visualization for OpenGL-based applications; for more information, please go to the FastX page. ParaView is an open-source data analysis and visualization application.
2006-07-27
The goal of this project was to develop analytical and computational tools to make vision a viable sensor for ... sensors. We have proposed the framework of stereoscopic segmentation, in which multiple images of the same objects are jointly processed to extract geometry.
Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel
2011-03-28
Study Team Working Paper 3: Research Methods Discussion for the Study Team. The paper discusses methods for generating empirical materials in grounded theory research, drawing on research the author has conducted using these methods. Works cited include a survey and a case study (Kjeller, Norway: FFI) and Glaser, B. G. & Strauss, A. L. (1967), The Discovery of Grounded Theory.
Demonstrating Success: Web Analytics and Continuous Improvement
ERIC Educational Resources Information Center
Loftus, Wayne
2012-01-01
As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…
Understanding Education Involving Geovisual Analytics
ERIC Educational Resources Information Center
Stenliden, Linnea
2013-01-01
Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barber, F.H.; Borek, T.T.; Christopher, J.Z.
1997-12-01
Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine-tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).
NASA Astrophysics Data System (ADS)
Gloster, Jonathan; Diep, Michael; Dredden, David; Mix, Matthew; Olsen, Mark; Price, Brian; Steil, Betty
2014-06-01
Small-to-medium sized businesses lack the resources to deploy and manage high-end advanced solutions to deter sophisticated threats from well-funded adversaries, yet evidence shows that these types of businesses are becoming key targets. As malicious code and network attacks become more sophisticated, classic signature-based virus and malware detection methods are less effective. To augment current malware detection methods, we developed a proactive approach to detect emerging malware threats using open source tools and intelligence to discover patterns and behaviors of malicious attacks and adversaries. Technical and analytical skills were combined to track adversarial behavior, methods and techniques. We established a controlled (separated domain) network to identify, monitor, and track malware behavior and to increase understanding of the methods and techniques used by cyber adversaries. We created a suite of tools that observe network and system performance, looking for anomalies that may be caused by malware. The toolset collects information from open-source tools and provides meaningful indicators that the system is under or has been attacked. When malware was discovered, we analyzed and reverse engineered it to determine how it could be detected and prevented. Results have shown that with minimal resources, cost-effective capabilities can be developed to detect abnormal behavior that may indicate malicious software.
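As one concrete instance of "looking for anomalies" in performance metrics, the sketch below flags values that deviate strongly from a trailing baseline. The metric, window size, and threshold are assumptions for illustration, not the toolset described in the paper.

```python
# Minimal anomaly-flagging sketch (assumed form): z-scores of a metric
# against a trailing-window baseline, e.g. outbound connections/minute.
import numpy as np

def flag_anomalies(samples, window=60, threshold=4.0):
    """Return indices whose value deviates more than `threshold` sigma
    from the mean of the preceding `window` samples."""
    samples = np.asarray(samples, dtype=float)
    flags = []
    for i in range(window, len(samples)):
        base = samples[i-window:i]
        mu, sigma = base.mean(), base.std() + 1e-9  # avoid divide-by-zero
        if abs(samples[i] - mu) / sigma > threshold:
            flags.append(i)
    return flags

traffic = list(20 + np.random.default_rng(1).normal(0, 2, 200))
traffic[150] = 90          # injected beaconing burst
print(flag_anomalies(traffic))   # -> [150]
```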
NASA Technical Reports Server (NTRS)
Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola
2005-01-01
Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.
Monitoring of antisolvent crystallization of sodium scutellarein by combined FBRM-PVM-NIR.
Liu, Xuesong; Sun, Di; Wang, Feng; Wu, Yongjiang; Chen, Yong; Wang, Longhu
2011-06-01
Antisolvent crystallization can be used as an alternative to cooling or evaporation for the separation and purification of solid product in the pharmaceutical industry. To improve process understanding of antisolvent crystallization, the use of in-line tools is vital. In this study, process analytical technology (PAT) tools including focused beam reflectance measurement (FBRM), particle video microscopy (PVM), and near-infrared spectroscopy (NIRS) were utilized to monitor the antisolvent crystallization of sodium scutellarein. FBRM was used to monitor the chord count and chord length distribution of sodium scutellarein particles in the crystallizer, and PVM, as an in-line video camera, provided pictures imaging particle shape and dimension. In addition, a quantitative PLS model was established by in-line NIRS to detect the concentration of sodium scutellarein in the solvent, and good calibration statistics were obtained (r² = 0.976) with a residual predictive deviation value of 11.3. The discussion of the sensitivities, strengths, and weaknesses of the PAT tools may be helpful in the selection of suitable PAT techniques. These in-line techniques eliminate the need for sample preparation and offer a time-saving approach to understanding and monitoring the antisolvent crystallization process. Copyright © 2011 Wiley-Liss, Inc.
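The PLS calibration step and its figures of merit can be sketched generically, with synthetic spectra standing in for the NIR data; r² and RPD (standard deviation of the reference values divided by the standard error of prediction) follow their usual definitions. This is a generic illustration, not the study's actual model.

```python
# Hedged sketch of an NIRS quantitation step: a PLS model relating
# spectra to concentration, scored by r^2 and RPD (synthetic data only).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
spectra = rng.normal(size=(40, 200))                  # toy NIR spectra
conc = spectra[:, 10]*2 + spectra[:, 50] + rng.normal(0, 0.05, 40)

pls = PLSRegression(n_components=5).fit(spectra, conc)
pred = pls.predict(spectra).ravel()
sep = np.sqrt(np.mean((pred - conc)**2))              # standard error
r2 = 1 - np.sum((conc - pred)**2)/np.sum((conc - conc.mean())**2)
print(f"r2={r2:.3f}, RPD={conc.std()/sep:.1f}")
```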
Distributed Generation Interconnection Collaborative | NREL
Many utilities and jurisdictions are seeking the right screening and analytical methods and tools to meet their reliability needs, reduce paperwork, and improve customer service.
Tobiszewski, Marek; Orłowski, Aleksander
2015-03-27
The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. One type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives for the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economical and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing analytical procedures with respect to their greenness. The rankings for all three scenarios placed solid phase microextraction- and liquid phase microextraction-based procedures high, while liquid-liquid extraction, solid phase extraction and stir bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with the more widely accepted green analytical chemistry tools NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. In contrast, the results of the Eco-Scale assessment were well correlated, as both methodologies involve similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
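PROMETHEE II itself is compact enough to sketch: pairwise preferences per criterion are weighted and aggregated into net outranking flows. The toy score matrix, weights, and "usual" (step) preference function below are illustrative assumptions, not the study's actual criteria or expert-derived weights.

```python
# Compact PROMETHEE II sketch with the "usual" preference function.
import numpy as np

def promethee_ii(scores, weights, maximize):
    """scores: alternatives x criteria; maximize: bool per criterion.
    Returns net outranking flows (higher = preferred)."""
    n, _ = scores.shape
    X = np.where(maximize, scores, -scores)       # unify criterion direction
    phi = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            pref_ab = weights @ (X[a] > X[b])     # pi(a,b)
            pref_ba = weights @ (X[b] > X[a])     # pi(b,a)
            phi[a] += (pref_ab - pref_ba) / (n - 1)
    return phi

scores = np.array([[0.90, 3, 120.0],   # e.g., recovery, waste, cost (toy)
                   [0.80, 1,  80.0],
                   [0.95, 5, 200.0]])
weights = np.array([0.5, 0.3, 0.2])
print(promethee_ii(scores, weights, np.array([True, False, False])))
```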
Medical cost analysis: application to colorectal cancer data from the SEER Medicare database.
Bang, Heejung
2005-10-01
Incompleteness is a key feature of most survival data. Numerous well-established statistical methodologies and algorithms exist for analyzing life or failure time data. However, induced censoring invalidates the use of those standard analytic tools for some survival-type data such as medical costs. In this paper, some valid methods currently available for analyzing censored medical cost data are reviewed. Some cautionary findings under different assumptions are illustrated through application to medical costs from colorectal cancer patients. Cost analysis should be suitably planned and carefully interpreted under various meaningful scenarios, even with judiciously selected statistical methods. This approach would be greatly helpful to policy makers who seek to prioritize health care expenditures and to assess the elements of resource use.
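One of the standard valid methods in this setting is the simple inverse-probability-weighted estimator of Bang and Tsiatis (2000), in which complete cases are up-weighted by the Kaplan-Meier survival of the censoring distribution. A minimal sketch under that reading, with toy data and no handling of ties, follows.

```python
# Hedged sketch of the inverse-probability-weighting idea for censored
# costs: complete cases weighted by 1/K(T_i), K = censoring survival.
import numpy as np

def km_censoring_survival(times, event):
    """Kaplan-Meier estimate of the censoring distribution K(t),
    evaluated at each subject's own follow-up time (ties ignored)."""
    order = np.argsort(times)
    d = 1 - event[order]                 # d=1 marks a censoring event
    n = len(times)
    at_risk = n - np.arange(n)
    factors = np.where(d == 1, 1 - 1/at_risk, 1.0)
    K = np.cumprod(factors)
    back = np.empty(n); back[order] = K  # restore original ordering
    return back

def mean_cost(costs, times, event):
    K = np.clip(km_censoring_survival(times, event), 1e-6, None)
    return np.mean(event * costs / K)    # only complete cases contribute

costs = np.array([10., 25., 40., 5., 60.])
times = np.array([2., 5., 7., 1., 9.])
event = np.array([1, 0, 1, 1, 1])        # 1 = cost history complete
print(mean_cost(costs, times, event))
```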
The finite element method in low speed aerodynamics
NASA Technical Reports Server (NTRS)
Baker, A. J.; Manhardt, P. D.
1975-01-01
The finite element procedure is shown to be of significant impact in design of the 'computational wind tunnel' for low speed aerodynamics. The uniformity of the mathematical differential equation description, for viscous and/or inviscid, multi-dimensional subsonic flows about practical aerodynamic system configurations, is utilized to establish the general form of the finite element algorithm. Numerical results for inviscid flow analysis, as well as viscous boundary layer, parabolic, and full Navier Stokes flow descriptions verify the capabilities and overall versatility of the fundamental algorithm for aerodynamics. The proven mathematical basis, coupled with the distinct user-orientation features of the computer program embodiment, indicate near-term evolution of a highly useful analytical design tool to support computational configuration studies in low speed aerodynamics.
Evans function computation for the stability of travelling waves
NASA Astrophysics Data System (ADS)
Barker, B.; Humpherys, J.; Lyng, G.; Lytle, J.
2018-04-01
In recent years, the Evans function has become an important tool for the determination of stability of travelling waves. This function, a Wronskian of decaying solutions of the eigenvalue equation, is useful both analytically and computationally for the spectral analysis of the linearized operator about the wave. In particular, Evans-function computation allows one to locate any unstable eigenvalues of the linear operator (if they exist); this allows one to establish spectral stability of a given wave and identify bifurcation points (loss of stability) as model parameters vary. In this paper, we review computational aspects of the Evans function and apply it to multidimensional detonation waves. This article is part of the theme issue `Stability of nonlinear waves and patterns and related topics'.
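For reference, the construction behind "a Wronskian of decaying solutions" can be stated in one common convention; this is a standard textbook form, not specific to this paper.

```latex
% Write the eigenvalue problem for the linearized operator as a
% first-order system
\[
  \mathbf{Y}' = \mathbf{A}(x;\lambda)\,\mathbf{Y}, \qquad \mathbf{Y}\in\mathbb{C}^n,
\]
% with solutions Y_1^-, ..., Y_k^- decaying as x -> -infinity and
% Y_{k+1}^+, ..., Y_n^+ decaying as x -> +infinity. The Evans function is
% the Wronskian-type determinant
\[
  E(\lambda) \;=\; \det\!\bigl[\,\mathbf{Y}_1^-\;\cdots\;\mathbf{Y}_k^-\;\;
  \mathbf{Y}_{k+1}^+\;\cdots\;\mathbf{Y}_n^+\,\bigr]\Big|_{x=0},
\]
% so E(lambda) = 0 exactly when the decaying subspaces intersect,
% i.e. when lambda is an eigenvalue of the linearized operator.
```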
Strategy for continuous improvement in IC manufacturability, yield, and reliability
NASA Astrophysics Data System (ADS)
Dreier, Dean J.; Berry, Mark; Schani, Phil; Phillips, Michael; Steinberg, Joe; DePinto, Gary
1993-01-01
Continual improvements in yield, reliability and manufacturability are the measure of a fab and ultimately result in Total Customer Satisfaction. A new organizational and technical methodology for continuous defect reduction has been established in a formal feedback loop, which relies on yield and reliability data, failed bit map analysis, analytical tools, inline monitoring, cross-functional teams and a defect engineering group. The strategy requires the fastest possible detection, identification and implementation of corrective actions. Feedback cycle time is minimized at all points to improve yield and reliability and reduce costs, essential for competitiveness in the memory business. The payoff was a 9.4X reduction in defectivity and a 6.2X improvement in reliability of 256 K fast SRAMs over 20 months.
A quantitative witness for Greenberger-Horne-Zeilinger entanglement.
Eltschka, Christopher; Siewert, Jens
2012-01-01
Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger-type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties.
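For context, Wootters' two-qubit concurrence, the result the three-qubit procedure reduces to, has the following standard closed form (quoted here as general background):

```latex
% Wootters' concurrence for a two-qubit state rho:
\[
  C(\rho) \;=\; \max\{0,\ \lambda_1 - \lambda_2 - \lambda_3 - \lambda_4\},
\]
% where the lambda_i are, in decreasing order, the square roots of the
% eigenvalues of
% \rho\,(\sigma_y \otimes \sigma_y)\,\rho^{*}\,(\sigma_y \otimes \sigma_y),
% with rho^* the complex conjugate in the computational basis.
```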
Bifurcation theory for finitely smooth planar autonomous differential systems
NASA Astrophysics Data System (ADS)
Han, Maoan; Sheng, Lijuan; Zhang, Xiang
2018-03-01
In this paper we establish a bifurcation theory of limit cycles for planar Ck smooth autonomous differential systems, with k ∈ N. The key point is to study the smoothness of bifurcation functions, which are a basic and important tool in the study of Hopf bifurcation at a fine focus or a center, and of Poincaré bifurcation in a period annulus. We especially study the smoothness of the first-order Melnikov function in degenerate Hopf bifurcation at an elementary center. The smoothness problem was solved for analytic and C∞ differential systems, but it had not been tackled for finitely smooth differential systems. Here, we present the optimal regularity of these bifurcation functions and their asymptotic expressions in the finitely smooth case.
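As a pointer for readers, the first-order Melnikov function has the following well-known form in one common convention; this is general background, not the paper's Ck-specific statement.

```latex
% For the near-Hamiltonian system
\[
  \dot{x} = H_y(x,y) + \varepsilon P(x,y), \qquad
  \dot{y} = -H_x(x,y) + \varepsilon Q(x,y),
\]
% with a period annulus of closed orbits L_h contained in {H(x,y) = h},
% the first-order Melnikov function is
\[
  M(h) \;=\; \oint_{L_h} Q\,\mathrm{d}x - P\,\mathrm{d}y,
\]
% whose simple zeros correspond, to first order in epsilon, to limit
% cycles bifurcating from the period annulus.
```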
2014-10-20
A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery (AFRL-RH-WP-TR-2014-0131; Distribution A, approved for public release). From the report's interface description: if the attribute Strain contains three possibilities (AKR, B6, and BALB_B) and MUP Protein contains two (Intact and Denatured), then the user can view a plot of the Strain tags.
Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery
NASA Astrophysics Data System (ADS)
Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.
2017-12-01
Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.
Non-traditional isotopes in analytical ecogeochemistry assessed by MC-ICP-MS
NASA Astrophysics Data System (ADS)
Prohaska, Thomas; Irrgeher, Johanna; Horsky, Monika; Hanousek, Ondřej; Zitek, Andreas
2014-05-01
Analytical ecogeochemistry deals with the development and application of tools of analytical chemistry to study dynamic biological and ecological processes within ecosystems and across ecosystem boundaries in time. It can best be described as a linkage between modern analytical chemistry and a holistic understanding of ecosystems ('the total human ecosystem') within the frame of transdisciplinary research. One focus of analytical ecogeochemistry is the advanced analysis of elements and isotopes in abiotic and biotic matrices and the application of the results to basic questions in different research fields like ecology, environmental science, climatology, anthropology, forensics, archaeometry and provenancing. With continuous instrumental developments, new isotopic systems have been recognized for their potential to study natural processes, and well-established systems can be analyzed with improved techniques, especially using multi collector inductively coupled plasma mass spectrometry (MC-ICP-MS). For example, in the case of S, isotope ratio measurements at high mass resolution could be achieved at much lower S concentrations with ICP-MS as compared to IRMS, while still keeping suitable uncertainty. Almost 50 different isotope systems have been investigated by ICP-MS so far, with - besides Sr, Pb and U - Ca, Mg, Cd, Li, Hg, Si, Ge and B being the most prominent, considerably pushing the limits of plasma-based mass spectrometry, also by applying high mass resolution. The use of laser ablation in combination with MC-ICP-MS offers the possibility to achieve isotopic information at high spatial (µm-range) and temporal scale (in the case of incrementally growing structures). The information gained with these analytical techniques can be linked between different hierarchical scales in ecosystems, offering means to better understand ecosystem processes. The presentation will highlight the use of different isotopic systems in ecosystem studies accomplished by ICP-MS. Selected examples of combining isotopic systems for the study of ecosystem processes on different spatial scales will underpin the great opportunities substantiated by the field of analytical ecogeochemistry. Moreover, recent developments in plasma mass spectrometry and the application of new isotopic systems require sound metrological approaches in order to prevent scientific conclusions being drawn from analytical artifacts.
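For orientation, isotope-ratio results in such studies are commonly reported in the standard delta notation (a general convention, not specific to this abstract):

```latex
% Delta notation for an isotope-amount ratio R = n(iE)/n(jE), in per mil:
\[
  \delta \;(\text{‰}) \;=\;
  \left( \frac{R_{\text{sample}}}{R_{\text{reference}}} - 1 \right) \times 1000,
\]
% e.g., delta-34S reports a sample's 34S/32S ratio relative to the
% V-CDT reference material.
```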
Modeling of the Global Water Cycle - Analytical Models
Yongqiang Liu; Roni Avissar
2005-01-01
Both numerical and analytical models of coupled atmosphere and its underlying ground components (land, ocean, ice) are useful tools for modeling the global and regional water cycle. Unlike complex three-dimensional climate models, which need very large computing resources and involve a large number of complicated interactions often difficult to interpret, analytical...
NASA Astrophysics Data System (ADS)
Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan
2016-02-01
Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.
He, Qili; Su, Guoming; Liu, Keliang; Zhang, Fangcheng; Jiang, Yong; Gao, Jun; Liu, Lida; Jiang, Zhongren; Jin, Minwu; Xie, Huiping
2017-01-01
Hematologic and biochemical analytes of Sprague-Dawley rats are commonly used to determine treatment-induced effects and to evaluate organ dysfunction in toxicological safety assessments, but reference intervals have not been well established for these analytes. Reference intervals as presently defined for these analytes in Sprague-Dawley rats have neither used the internationally recommended statistical method nor been stratified by sex. Thus, we aimed to establish sex-specific reference intervals for hematologic and biochemical parameters in Sprague-Dawley rats according to the Clinical and Laboratory Standards Institute C28-A3 and American Society for Veterinary Clinical Pathology guidelines. Hematology and biochemistry blood samples were collected from 500 healthy Sprague-Dawley rats (250 males and 250 females) in the control groups. We measured 24 hematologic analytes with the Sysmex XT-2100i analyzer and 9 biochemical analytes with the Olympus AU400 analyzer. We then determined statistically relevant sex partitions and calculated reference intervals, including corresponding 90% confidence intervals, using the nonparametric rank percentile method. We observed that most hematologic and biochemical analytes of Sprague-Dawley rats were significantly influenced by sex. Males had higher hemoglobin, hematocrit, red blood cell count, red cell distribution width, mean corpuscular volume, mean corpuscular hemoglobin, white blood cell count, neutrophils, lymphocytes, monocytes, percentage of neutrophils, percentage of monocytes, alanine aminotransferase, aspartate aminotransferase, and triglycerides compared to females. Females had higher mean corpuscular hemoglobin concentration, plateletcrit, platelet count, eosinophils, percentage of lymphocytes, percentage of eosinophils, creatinine, glucose, total cholesterol and urea compared to males. Sex partitioning was required for most hematologic and biochemical analytes in Sprague-Dawley rats. We established sex-specific reference intervals, including corresponding 90% confidence intervals, for Sprague-Dawley rats. Understanding the significant discrepancies in hematologic and biochemical analytes between male and female Sprague-Dawley rats provides important insight into physiological effects in test rats. Establishment of locally derived, sex-specific reference intervals allows a more precise evaluation of animal quality and of experimental results for Sprague-Dawley rats in our toxicology safety assessments.
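The nonparametric rank-percentile method used here is straightforward to sketch: the reference interval is the 2.5th to 97.5th percentile of the partitioned sample, with confidence intervals around each limit. The sketch below uses a bootstrap for the 90% CIs and invented hemoglobin-like values; the CLSI guideline's exact rank-based CI procedure differs in detail.

```python
# Sketch of a nonparametric reference interval (CLSI C28-A3 style):
# 2.5th/97.5th percentiles with bootstrap 90% CIs, per sex partition.
import numpy as np

def reference_interval(values, n_boot=2000, seed=0):
    values = np.asarray(values, dtype=float)
    lo, hi = np.percentile(values, [2.5, 97.5])
    rng = np.random.default_rng(seed)
    boots = np.array([np.percentile(rng.choice(values, values.size),
                                    [2.5, 97.5]) for _ in range(n_boot)])
    ci_lo = np.percentile(boots[:, 0], [5, 95])   # 90% CI, lower limit
    ci_hi = np.percentile(boots[:, 1], [5, 95])   # 90% CI, upper limit
    return (lo, hi), ci_lo, ci_hi

# Hypothetical hemoglobin values (g/dL) for the 250 male rats:
males = np.random.default_rng(42).normal(15.2, 0.8, 250)
print(reference_interval(males))
```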
Plastid: nucleotide-resolution analysis of next-generation sequencing and genomics data.
Dunn, Joshua G; Weissman, Jonathan S
2016-11-22
Next-generation sequencing (NGS) informs many biological questions with unprecedented depth and nucleotide resolution. These assays have created a need for analytical tools that enable users to manipulate data nucleotide-by-nucleotide robustly and easily. Furthermore, because many NGS assays encode information jointly within multiple properties of read alignments - for example, in ribosome profiling, the locations of ribosomes are jointly encoded in alignment coordinates and length - analytical tools are often required to extract the biological meaning from the alignments before analysis. Many assay-specific pipelines exist for this purpose, but there remains a need for user-friendly, generalized, nucleotide-resolution tools that are not limited to specific experimental regimes or analytical workflows. Plastid is a Python library designed specifically for nucleotide-resolution analysis of genomics and NGS data. As such, Plastid is designed to extract assay-specific information from read alignments while retaining generality and extensibility to novel NGS assays. Plastid represents NGS and other biological data as arrays of values associated with genomic or transcriptomic positions, and contains configurable tools to convert data from a variety of sources to such arrays. Plastid also includes numerous tools to manipulate even discontinuous genomic features, such as spliced transcripts, with nucleotide precision. Plastid automatically handles conversion between genomic and feature-centric coordinates, accounting for splicing and strand, freeing users of burdensome accounting. Finally, Plastid's data models use consistent and familiar biological idioms, enabling even beginners to develop sophisticated analytical workflows with minimal effort. Plastid is a versatile toolkit that has been used to analyze data from multiple NGS assays, including RNA-seq, ribosome profiling, and DMS-seq. It forms the genomic engine of our ORF annotation tool, ORF-RATER, and is readily adapted to novel NGS assays. Examples, tutorials, and extensive documentation can be found at https://plastid.readthedocs.io .
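To illustrate the position-array idiom described above in the most generic terms, consider per-nucleotide counts for a spliced transcript assembled from exon slices. The sketch below uses plain numpy, not Plastid's actual API, and the coordinates and counts are hypothetical.

```python
# Generic illustration of the position-array idiom (plain numpy; this is
# NOT Plastid's API): per-base counts sliced into a spliced transcript.
import numpy as np

chrom_counts = np.zeros(10000)        # toy per-base read counts, one chrom
chrom_counts[105] = 7                 # e.g., ribosome P-site counts
chrom_counts[2003] = 12

exons = [(100, 200), (2000, 2100)]    # hypothetical exon coordinates
# Transcript-centric view: concatenate exon slices, accounting for the
# splice; a minus-strand transcript would additionally reverse the array.
tx_counts = np.concatenate([chrom_counts[s:e] for s, e in exons])
print(tx_counts.sum(), np.argmax(tx_counts))  # total reads, peak position
```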
Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin
2011-03-16
The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.
Google Analytics – Index of Resources
Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.
Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review
NASA Astrophysics Data System (ADS)
Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal
2017-08-01
Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning by cubic boron nitride (CBN) tools have been reviewed, in an attempt to utilize appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces due to oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear are depicted in this review paper. An effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that the appropriate model can be used according to user requirements in hard turning.
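For reference, Usui's wear-rate model mentioned above is commonly cited in the following form; variable names follow their usual definitions in the machining literature, and this is the generic statement rather than this review's specific calibration.

```latex
% Usui's wear-rate model (commonly cited form):
\[
  \frac{\mathrm{d}W}{\mathrm{d}t} \;=\;
  A\,\sigma_n\,V_s\,\exp\!\left(-\frac{B}{\theta}\right),
\]
% where sigma_n is the normal stress on the tool face, V_s the sliding
% velocity at the tool-chip interface, theta the absolute interface
% temperature, and A, B material constants calibrated from cutting tests.
```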
A Tool Supporting Collaborative Data Analytics Workflow Design and Management
NASA Astrophysics Data System (ADS)
Zhang, J.; Bao, Q.; Lee, T. J.
2016-12-01
Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: to name a few gaps, supporting real-time co-design; tracking how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and capturing and retrieving collaboration knowledge on workflow design (the discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool supporting an elastic number of groups of Earth scientists in collaboratively designing and composing data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.
Role of Knowledge Management and Analytical CRM in Business: Data Mining Based Framework
ERIC Educational Resources Information Center
Ranjan, Jayanthi; Bhatnagar, Vishal
2011-01-01
Purpose: The purpose of the paper is to provide a thorough analysis of the concepts of business intelligence (BI), knowledge management (KM) and analytical CRM (aCRM) and to establish a framework for integrating all the three to each other. The paper also seeks to establish a KM and aCRM based framework using data mining (DM) techniques, which…
What Is Trust? Ethics and Risk Governance in Precision Medicine and Predictive Analytics.
Adjekum, Afua; Ienca, Marcello; Vayena, Effy
2017-12-01
Trust is a ubiquitous term used in emerging technology (e.g., Big Data, precision medicine), innovation policy, and governance literatures in particular. But what exactly is trust? Even though trust is considered a critical requirement for the successful deployment of precision medicine initiatives, nonetheless, there is a need for further conceptualization with regard to what qualifies as trust, and what factors might establish and sustain trust in precision medicine, predictive analytics, and large-scale biology. These new fields of 21st century medicine and health often deal with the "futures" and hence, trust gains a temporal and ever-present quality for both the present and the futures anticipated by new technologies and predictive analytics. We address these conceptual gaps that have important practical implications in the way we govern risk and unknowns associated with emerging technologies in biology, medicine, and health broadly. We provide an in-depth conceptual analysis and an operative definition of trust dynamics in precision medicine. In addition, we identify three main types of "trust facilitators": (1) technical, (2) ethical, and (3) institutional. This three-dimensional framework on trust is necessary to building and maintaining trust in 21st century knowledge-based innovations that governments and publics invest for progressive societal change, development, and sustainable prosperity. Importantly, we analyze, identify, and deliberate on the dimensions of precision medicine and large-scale biology that have carved out trust as a pertinent tool to its success. Moving forward, we propose a "points to consider" on how best to enhance trust in precision medicine and predictive analytics.
Sato, Tatsuhiko
2015-01-01
By extending our previously established model, here we present a new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0," which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth's atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R2 values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research.
Review: visual analytics of climate networks
NASA Astrophysics Data System (ADS)
Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.
2015-09-01
Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.
Applying Pragmatics Principles for Interaction with Visual Analytics.
Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac
2018-01-01
Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.
Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs
ERIC Educational Resources Information Center
Veregin, Howard
2015-01-01
Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…
This presentation will provide a conceptual preview of an Area of Review (AoR) tool being developed by EPA’s Office of Research and Development that applies analytic and semi-analytical mathematical solutions to elucidate potential risks associated with geologic sequestration of ...
The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate
ERIC Educational Resources Information Center
Cárdenas-Navia, Isabel; Fitzgerald, Brian K.
2015-01-01
New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…
An Analytical Hierarchy Process Model for the Evaluation of College Experimental Teaching Quality
ERIC Educational Resources Information Center
Yin, Qingli
2013-01-01
Taking into account the characteristics of college experimental teaching, through investigation and analysis, evaluation indices and an Analytical Hierarchy Process (AHP) model of experimental teaching quality have been established following the analytical hierarchy process method, and the evaluation indices have been given reasonable weights. An…
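For readers unfamiliar with the AHP mechanics behind such weighting schemes, this hedged Python sketch derives criteria weights from a pairwise-comparison matrix via the principal eigenvector and checks Saaty's consistency ratio; the matrix values are hypothetical, not the paper's survey data.

    import numpy as np

    # Hypothetical 3x3 pairwise-comparison matrix on Saaty's 1-9 scale
    # (criterion i vs criterion j); values are illustrative only.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)            # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                           # normalised criteria weights

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random index
    cr = ci / ri                           # consistency ratio; < 0.1 is acceptable

    print(w, cr)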
Kumar, B Vinodh; Mohan, Thuthi
2018-01-01
Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of a process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metrics for individual parameters, and to follow the Westgard guidelines for the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and the data required for the study were extracted between July 2015 and June 2016 from a secondary care government hospital, Chennai. The data obtained for the study were the IQC coefficient of variation percentage and the External Quality Assurance Scheme (EQAS) bias percentage for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein cholesterol) showed an ideal performance of ≥6 sigma level, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level; for the level 2 IQC, the same four analytes as in level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes at <6 sigma level, the quality goal index (QGI) was <0.8, indicating that the area requiring improvement was imprecision, except for cholesterol, whose QGI >1.2 indicated inaccuracy. This study shows that the sigma metric is a good quality tool for assessing the analytical performance of a clinical chemistry laboratory. Sigma metric analysis thus provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.
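A minimal sketch of the arithmetic used in such sigma-metric studies, with illustrative numbers rather than the study's data: sigma = (TEa - |bias|) / CV, and QGI = bias / (1.5 * CV), where QGI < 0.8 is read as imprecision and QGI > 1.2 as inaccuracy.

    def sigma_metric(tea_pct, bias_pct, cv_pct):
        # Sigma = (TEa - |Bias|) / CV, all in percent
        return (tea_pct - abs(bias_pct)) / cv_pct

    def quality_goal_index(bias_pct, cv_pct):
        # QGI = Bias / (1.5 * CV); <0.8 -> imprecision, >1.2 -> inaccuracy
        return bias_pct / (1.5 * cv_pct)

    # Illustrative (not the study's) numbers for one analyte:
    tea, bias, cv = 10.0, 2.0, 1.2   # percent
    print(sigma_metric(tea, bias, cv))      # ~6.7, i.e. >= 6 sigma
    print(quality_goal_index(bias, cv))     # ~1.1, between the two cutoffs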
Hackl, Matthias; Heilmeier, Ursula; Weilner, Sylvia; Grillari, Johannes
2016-09-05
Biomarkers are essential tools in clinical research and practice. Useful biomarkers must combine good measurability, validated association with biological processes or outcomes, and should support clinical decision making if used in clinical practice. Several types of validated biomarkers have been reported in the context of bone diseases. However, because these biomarkers face certain limitations, there is an interest in the identification of novel biomarkers for bone diseases, specifically in those that are tightly linked to the disease pathology leading to increased fracture-risk. MicroRNAs (miRNAs) are the most abundant RNA species to be found in cell-free blood. Encapsulated within microvesicles or bound to proteins, circulating miRNAs are remarkably stable analytes that can be measured using gold-standard technologies such as quantitative polymerase-chain-reaction (qPCR). Nevertheless, the analysis of circulating miRNAs faces several pre-analytical as well as analytical challenges. From a biological view, there is accumulating evidence that miRNAs play essential roles in the regulation of various biological processes including bone homeostasis. Moreover, specific changes in miRNA transcription levels or miRNA secretory levels have been linked to the development and progression of certain bone diseases. Only recently, results from circulating miRNAs analysis in patients with osteopenia, osteoporosis and fragility fractures have been reported. By comparing these findings to studies on circulating miRNAs in cellular senescence and aging or muscle physiology and sarcopenia, several overlaps were observed. This suggests that signatures observed during osteoporosis might not be specific to the pathophysiology in bone, but rather integrate information from several tissue types. Despite these promising first data, more work remains to be done until circulating miRNAs can serve as established and robust diagnostic tools for bone diseases in clinical research, clinical routine and in personalized medicine. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Integrated Data & Analysis in Support of Informed and Transparent Decision Making
NASA Astrophysics Data System (ADS)
Guivetchi, K.
2012-12-01
The California Water Plan includes a framework for improving water reliability, environmental stewardship, and economic stability through two initiatives - integrated regional water management to make better use of local water sources by integrating multiple aspects of managing water and related resources; and maintaining and improving statewide water management systems. The Water Plan promotes ways to develop a common approach for data standards and for understanding, evaluating, and improving regional and statewide water management systems, and for common ways to evaluate and select from alternative management strategies and projects. The California Water Plan acknowledges that planning for the future is uncertain and that change will continue to occur. It is not possible to know for certain how population growth, land use decisions, water demand patterns, environmental conditions, the climate, and many other factors that affect water use and supply may change by 2050. To anticipate change, our approach to water management and planning for the future needs to consider and quantify uncertainty, risk, and sustainability. There is a critical need for information sharing and information management to support over-arching and long-term water policy decisions that cross-cut multiple programs across many organizations and provide a common and transparent understanding of water problems and solutions. Achieving integrated water management with multiple benefits requires a transparent description of dynamic linkages between water supply, flood management, water quality, land use, environmental water, and many other factors. Water Plan Update 2013 will include an analytical roadmap for improving data, analytical tools, and decision-support to advance integrated water management at statewide and regional scales. It will include recommendations for linking collaborative processes with technical enhancements, providing effective analytical tools, and improving and sharing data and information. Specifically, this includes achieving better integration and consistency with other planning activities; obtaining consensus on quantitative deliverables; building a common conceptual understanding of the water management system; developing common schematics of the water management system; establishing modeling protocols and standards; and improving transparency and exchange of Water Plan information.
Accelerated bridge construction (ABC) decision making and economic modeling tool.
DOT National Transportation Integrated Search
2011-12-01
In this FHWA-sponsored pool funded study, a set of decision making tools, based on the Analytic Hierarchy Process (AHP) was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...
17 CFR 49.17 - Access to SDR data.
Code of Federal Regulations, 2013 CFR
2013-04-01
... legal and statutory responsibilities under the Act and related regulations. (2) Monitoring tools. A registered swap data repository is required to provide the Commission with proper tools for the monitoring... data structure and content. These monitoring tools shall be substantially similar in analytical...
Assessment Tools' Indicators for Sustainability in Universities: An Analytical Overview
ERIC Educational Resources Information Center
Alghamdi, Naif; den Heijer, Alexandra; de Jonge, Hans
2017-01-01
Purpose: The purpose of this paper is to analyse 12 assessment tools of sustainability in universities and to develop the structure and the contents of these tools to be more intelligible. The configuration of the tools reviewed highlights indicators that clearly communicate only the essential information. This paper explores how the theoretical…
MRMPlus: an open source quality control and assessment tool for SRM/MRM assay development.
Aiyetan, Paul; Thomas, Stefani N; Zhang, Zhen; Zhang, Hui
2015-12-12
Selected and multiple reaction monitoring involves monitoring a multiplexed assay of proteotypic peptides and associated transitions in mass spectrometry runs. To describe peptides and associated transitions as stable, quantifiable, and reproducible representatives of proteins of interest, experimental and analytical validation is required. However, inadequate and disparate analytical tools and validation methods predispose assay performance measures to errors and inconsistencies. Implemented as a freely available, open-source tool in the platform-independent Java programming language, MRMPlus computes analytical measures as recommended recently by the Clinical Proteomics Tumor Analysis Consortium Assay Development Working Group for "Tier 2" assays, that is, non-clinical assays sufficient to measure changes due to both biological and experimental perturbations. Computed measures include limit of detection, lower limit of quantification, linearity, carry-over, partial validation of specificity, and upper limit of quantification. MRMPlus streamlines the assay development analytical workflow and therefore minimizes error predisposition. MRMPlus may also be used for performance estimation for targeted assays not described by the Assay Development Working Group. MRMPlus' source code and compiled binaries can be freely downloaded from https://bitbucket.org/paiyetan/mrmplusgui and https://bitbucket.org/paiyetan/mrmplusgui/downloads respectively.
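The Tier 2 measures MRMPlus reports can be approximated with standard formulas; the following Python sketch (not MRMPlus's actual Java code or API) computes LOD, LLOQ, and a linearity estimate from hypothetical blank replicates and a calibration series.

    import numpy as np

    # Blank replicates and a calibration series (illustrative numbers only)
    blanks = np.array([0.8, 1.1, 0.9, 1.0, 1.2])
    conc   = np.array([1, 2, 5, 10, 20, 50], dtype=float)
    signal = np.array([3.1, 5.9, 14.8, 30.2, 59.7, 150.4])

    slope, intercept = np.polyfit(conc, signal, 1)

    lod  = 3.3 * blanks.std(ddof=1) / slope    # limit of detection
    lloq = 10.0 * blanks.std(ddof=1) / slope   # lower limit of quantification

    pred = slope * conc + intercept
    r2 = 1 - ((signal - pred) ** 2).sum() / ((signal - signal.mean()) ** 2).sum()

    print(lod, lloq, r2)   # linearity judged by r2 of the calibration fit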
Analytical challenges for conducting rapid metabolism characterization for QIVIVE.
Tolonen, Ari; Pelkonen, Olavi
2015-06-05
For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is the tool of choice for the majority of organic, relatively lipophilic molecules, coupled with an LC separation tool and simultaneous UV detection. However, additional techniques, such as gas chromatography, radiometric measurements and NMR, are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are two partially opposing needs: detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and a throughput capacity of the in vitro set-up and the analytical tool that is as high as possible. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, while attempting to pinpoint factors associated with sample preparation and testing conditions, and the strengths and weaknesses of particular techniques for particular tasks. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Big data analytics in immunology: a knowledge-based approach.
Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir
2014-01-01
With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of normally time-consuming processes of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflow.
Update on SLD Engineering Tools Development
NASA Technical Reports Server (NTRS)
Miller, Dean R.; Potapczuk, Mark G.; Bond, Thomas H.
2004-01-01
The airworthiness authorities (FAA, JAA, Transport Canada) will be releasing a draft rule in the 2006 timeframe concerning the operation of aircraft in a Supercooled Large Droplet (SLD) environment aloft. The draft rule will require aircraft manufacturers to demonstrate that their aircraft can operate safely in an SLD environment for a period of time to facilitate a safe exit from the condition. It is anticipated that aircraft manufacturers will require a capability to demonstrate compliance with this rule via experimental means (icing tunnels or tankers) and by analytical means (ice prediction codes). Since existing icing research facilities and analytical codes were not developed to account for SLD conditions, current engineering tools are not adequate to support compliance activities in SLD conditions. Therefore, existing capabilities need to be augmented to include SLD conditions. In response to this need, NASA and its partners conceived a strategy or Roadmap for developing experimental and analytical SLD simulation tools. Following review and refinement by the airworthiness authorities and other international research partners, this technical strategy has been crystallized into a project plan to guide the SLD Engineering Tool Development effort. This paper will provide a brief overview of the latest version of the project plan and technical rationale, and provide a status of selected SLD Engineering Tool Development research tasks which are currently underway.
76 FR 70517 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-14
... requested. These systems generally also provide analytics, spreadsheets, and other tools designed to enable funds to analyze the data presented, as well as communication tools to process fund instructions...
Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.
Scott, Bradley; Wilcock, Anne
2006-01-01
Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex, and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.
NASA Astrophysics Data System (ADS)
Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.
2016-11-01
The modelling of a curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, has as its goal the determination of the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the geometrical generating error as a component of the total error. Modelling the generation process allows highlighting the potential errors of the generating tool, in order to correct its profile before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of “relative generating trajectories”. The analytical foundation is presented, together with applications for known models of rack-gear-type tools used on Maag toothing machines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kroniger, K; Herzog, M; Landry, G
2015-06-15
Purpose: We describe and demonstrate a fast analytical tool for prompt-gamma emission prediction based on filter functions applied on the depth dose profile. We present the implementation in a treatment planning system (TPS) of the same algorithm for positron emitter distributions. Methods: The prediction of the desired observable is based on the convolution of filter functions with the depth dose profile. For both prompt-gammas and positron emitters, the results of Monte Carlo simulations (MC) are compared with those of the analytical tool. For prompt-gamma emission from inelastic proton-induced reactions, homogeneous and inhomogeneous phantoms alongside patient data are used as irradiation targets of mono-energetic proton pencil beams. The accuracy of the tool is assessed in terms of the shape of the analytically calculated depth profiles and their absolute yields, compared to MC. For the positron emitters, the method is implemented in a research RayStation TPS and compared to MC predictions. Digital phantoms and patient data are used and positron emitter spatial density distributions are analyzed. Results: Calculated prompt-gamma profiles agree with MC within 3 % in terms of absolute yield and reproduce the correct shape. Based on an arbitrary reference material and by means of 6 filter functions (one per chemical element), profiles in any other material composed of those elements can be predicted. The TPS implemented algorithm is accurate enough to enable, via the analytically calculated positron emitter profiles, detection of range differences between the TPS and MC with errors of the order of 1–2 mm. Conclusion: The proposed analytical method predicts prompt-gamma and positron emitter profiles which generally agree with the distributions obtained by a full MC. The implementation of the tool in a TPS shows that reliable profiles can be obtained directly from the dose calculated by the TPS, without the need of full MC simulation.
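A rough sketch of the filter-function idea in NumPy: the predicted prompt-gamma depth profile is obtained by convolving a kernel with the depth dose. Both the toy dose profile and the exponential kernel below are assumed placeholder shapes, not the published per-element filter functions.

    import numpy as np

    z = np.linspace(0, 150, 301)                   # depth grid (mm)
    dose = np.exp(-((z - 100) / 4.0) ** 2)         # toy Bragg-peak-like profile

    # Hypothetical filter function (the paper fits one per chemical element);
    # the exponential kernel here is a placeholder, not a published filter.
    kernel = np.exp(-np.abs(np.linspace(-20, 20, 81)) / 5.0)
    kernel /= kernel.sum()

    prompt_gamma = np.convolve(dose, kernel, mode="same")  # predicted PG profile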
Quantifying risks with exact analytical solutions of derivative pricing distribution
NASA Astrophysics Data System (ADS)
Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin
2017-04-01
Derivative (i.e., option) pricing is essential for modern financial instruments. Despite previous efforts, exact analytical forms of the derivative pricing distributions are still challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain exact analytical solutions of the statistical distribution for bond and bond option pricing under the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option pricing, characterized by the distribution tail, and their association with value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.
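For the Vasicek model referenced above, the textbook zero-coupon bond price has a closed form; this Python sketch evaluates it for illustrative, uncalibrated parameters (the paper's path-integral distributions go beyond this expectation value).

    import math

    def vasicek_bond_price(r0, a, b, sigma, tau):
        """Zero-coupon bond price P(0, tau) under dr = a(b - r) dt + sigma dW."""
        B = (1.0 - math.exp(-a * tau)) / a
        lnA = (B - tau) * (a * a * b - 0.5 * sigma * sigma) / (a * a) \
              - (sigma * sigma * B * B) / (4.0 * a)
        return math.exp(lnA - B * r0)

    # Illustrative parameters, not calibrated to any market data
    print(vasicek_bond_price(r0=0.03, a=0.5, b=0.04, sigma=0.01, tau=5.0))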
Markert, Sven; Joeris, Klaus
2017-01-01
We developed an automated microtiter plate (MTP)-based system for suspension cell culture to meet the increased demands for miniaturized high-throughput applications in biopharmaceutical process development. The generic system is based on off-the-shelf commercial laboratory automation equipment and is able to utilize MTPs of different configurations (6-24 wells per plate) in orbital shaken mode. The shaking conditions were optimized by Computational Fluid Dynamics simulations. The fully automated system handles plate transport, seeding and feeding of cells, daily sampling, and preparation of analytical assays. The integration of all required analytical instrumentation into the system enables hands-off operation, which prevents bottlenecks in sample processing. The modular set-up makes the system flexible and adaptable for a continuous extension of analytical parameters and add-on components. The system proved suitable as a screening tool for process development by verifying the comparability of results between the MTP-based system and bioreactors regarding profiles of viable cell density, lactate, and product concentration of CHO cell lines. These studies confirmed that 6-well MTPs as well as 24-deepwell MTPs were predictive for a scale-up to a 1000 L stirred tank reactor (scale factor 1:200,000). Applying the established cell culture system to automated media blend screening in late-stage development, a 22% increase in product yield was achieved in comparison to the reference process. The predicted product increase was subsequently confirmed in 2 L bioreactors. Thus, we demonstrated the feasibility of the automated MTP-based cell culture system for enhanced screening and optimization applications in process development and identified further application areas such as process robustness. The system offers great potential to accelerate time-to-market for new biopharmaceuticals. Biotechnol. Bioeng. 2017;114: 113-121. © 2016 Wiley Periodicals, Inc.
Synthesis of Feedback Controller for Chaotic Systems by Means of Evolutionary Techniques
NASA Astrophysics Data System (ADS)
Senkerik, Roman; Oplatkova, Zuzana; Zelinka, Ivan; Davendra, Donald; Jasek, Roman
2011-06-01
This research deals with the synthesis of a control law for three selected discrete chaotic systems by means of analytic programming. The novelty of the approach is that a tool for symbolic regression, analytic programming, is used for this kind of difficult problem. The paper describes analytic programming as well as the chaotic systems and the cost function used. For the experiments, the Self-Organizing Migrating Algorithm (SOMA) was used together with analytic programming.
Whalen, Kimberly; Bavuso, Karen; Bouyer-Ferullo, Sharon; Goldsmith, Denise; Fairbanks, Amanda; Gesner, Emily; Lagor, Charles; Collins, Sarah
2016-01-01
To understand requests for nursing Clinical Decision Support (CDS) interventions at a large integrated health system undergoing vendor-based EHR implementation, and to establish a process to guide both short-term implementation and long-term strategic goals to meet nursing CDS needs. We conducted an environmental scan to understand the current state of nursing CDS over three months. The environmental scan consisted of a literature review and an analysis of CDS requests received from across our health system. We identified existing high-priority CDS and paper-based tools used in nursing practice at our health system that guide decision-making. A total of 46 nursing CDS requests were received. Fifty-six percent (n=26) were specific to a clinical specialty; 22 percent (n=10) were focused on facilitating clinical consults in the inpatient setting. "Risk Assessments/Risk Reduction/Promotion of Healthy Habits" (n=23) was the most requested High Priority Category for nursing CDS. A continuum of types of nursing CDS needs emerged using the Data-Information-Knowledge-Wisdom Conceptual Framework: 1) facilitating data capture, 2) meeting information needs, 3) guiding knowledge-based decision making, and 4) exposing analytics for wisdom-based clinical interpretation by the nurse. Identifying and prioritizing paper-based tools that can be modified into electronic CDS is a challenge. CDS strategy is an evolving process that relies on close collaboration and engagement with clinical sites for short-term implementation and should be incorporated into a long-term strategic plan that can be optimized and achieved over time. The Data-Information-Knowledge-Wisdom Conceptual Framework, in conjunction with the High Priority Categories established, may be a useful tool to guide a strategic approach for meeting short-term nursing CDS needs and aligning with the organizational strategic plan.
SMARTe is being developed to give stakeholders information resources, analytical tools, communication strategies, and a decision analysis approach to be able to make better decisions regarding future uses of property. The development of the communication tools and decision analys...
An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application
ERIC Educational Resources Information Center
Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth
2016-01-01
Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
Advances in solid-state NMR of cellulose.
Foston, Marcus
2014-06-01
Nuclear magnetic resonance (NMR) spectroscopy is a well-established analytical and enabling technology in biofuel research. Over the past few decades, lignocellulosic biomass and its conversion to supplement or displace non-renewable feedstocks has attracted increasing interest. The application of solid-state NMR spectroscopy has long been seen as an important tool in the study of cellulose and lignocellulose structure, biosynthesis, and deconstruction, especially considering the limited number of effective solvent systems and the significance of plant cell wall three-dimensional microstructure and component interaction to conversion yield and rate profiles. This article reviews common and recent applications of solid-state NMR spectroscopy methods that provide insight into the structural and dynamic processes of cellulose that control bulk properties and biofuel conversion. Copyright © 2014 Elsevier Ltd. All rights reserved.
Dynamic Impact Testing and Model Development in Support of NASA's Advanced Composites Program
NASA Technical Reports Server (NTRS)
Melis, Matthew E.; Pereira, J. Michael; Goldberg, Robert; Rassaian, Mostafa
2018-01-01
The purpose of this paper is to provide an executive overview of the HEDI effort for NASA's Advanced Composites Program and establish the foundation for the remaining papers to follow in the 2018 SciTech special session NASA ACC High Energy Dynamic Impact. The paper summarizes the work done for the Advanced Composites Program to advance our understanding of the behavior of composite materials during high energy impact events and to advance the ability of analytical tools to provide predictive simulations. The experimental program carried out at GRC is summarized and a status on the current development state for MAT213 will be provided. Future work will be discussed as the HEDI effort transitions from fundamental analysis and testing to investigating sub-component structural concept response to impact events.
Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays
Hsieh, Helen V.; Dantzler, Jeffrey L.; Weigl, Bernhard H.
2017-01-01
Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy to use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low resource settings. The simplicity in the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, the manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness. PMID:28555034
DOE Office of Scientific and Technical Information (OSTI.GOV)
Femec, D.A.
This report discusses the sample tracking database in use at the Idaho National Engineering Laboratory (INEL) by the Radiation Measurements Laboratory (RML) and Analytical Radiochemistry. The database was designed in-house to meet the specific needs of the RML and Analytical Radiochemistry. The report consists of two parts, a user's guide and a reference guide. The user's guide presents some of the fundamentals needed by anyone who will be using the database via its user interface. The reference guide describes the design of both the database and the user interface. Briefly mentioned in the reference guide are the code-generating tools, CREATE-SCHEMA and BUILD-SCREEN, written to automatically generate code for the database and its user interface. The appendices contain the input files used by these tools to create code for the sample tracking database. The output files generated by these tools are also included in the appendices.
Hosu, Anamaria; Cristea, Vasile-Mircea; Cimpoiu, Claudia
2014-05-01
Wine is one of the most consumed beverages in the world and contains large quantities of polyphenolic compounds. These compounds are responsible for the quality of red wines, influencing the antioxidant activity, astringency, bitterness and colour; their composition in wine is influenced by the variety, the vintage and the winery. The aim of the present work is to build software instruments intended to work as data-mining tools for predicting valuable properties of wine and for revealing distinct wine classes. The developed ANNs are able to reveal the relationships between the concentrations of total phenolics, flavonoids, anthocyanins, and tannins, associated with the antioxidant activity, and the distinctive wine classes determined by the wine variety, harvesting year or winery. The presented ANNs proved to be reliable software tools for assessment or validation of the essential characteristics and authenticity of wines and may be further used to establish a database of analytical characteristics of wines. Copyright © 2013 Elsevier Ltd. All rights reserved.
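As a hedged illustration of the kind of ANN set-up described (synthetic feature values and hypothetical variety labels, not the paper's measurements), a small scikit-learn classifier can map phenolic-profile features to wine classes:

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler

    # Columns: total phenolics, flavonoids, anthocyanins, tannins,
    # antioxidant activity. All numbers and labels are invented placeholders.
    X = np.array([[1800, 950, 320, 1400, 12.1],
                  [2100, 1100, 410, 1650, 14.3],
                  [1500, 800, 250, 1200, 10.2],
                  [2300, 1250, 480, 1800, 15.8]])
    y = ["Merlot", "Cabernet", "Merlot", "Cabernet"]  # hypothetical varieties

    Xs = StandardScaler().fit_transform(X)
    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                        random_state=0).fit(Xs, y)
    print(clf.predict(Xs))   # a real study would hold out a test set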
Biomarkers as drug development tools: discovery, validation, qualification and use.
Kraus, Virginia B
2018-06-01
The 21st Century Cures Act, approved in the USA in December 2016, has encouraged the establishment of the national Precision Medicine Initiative and the augmentation of efforts to address disease prevention, diagnosis and treatment on the basis of a molecular understanding of disease. The Act adopts into law the formal process, developed by the FDA, of qualification of drug development tools, including biomarkers and clinical outcome assessments, to increase the efficiency of clinical trials and encourage an era of molecular medicine. The FDA and European Medicines Agency (EMA) have developed similar processes for the qualification of biomarkers intended for use as companion diagnostics or for development and regulatory approval of a drug or therapeutic. Biomarkers that are used exclusively for the diagnosis, monitoring or stratification of patients in clinical trials are not subject to regulatory approval, although their qualification can facilitate the conduct of a trial. In this Review, the salient features of biomarker discovery, analytical validation, clinical qualification and utilization are described in order to provide an understanding of the process of biomarker development and, through this understanding, convey an appreciation of their potential advantages and limitations.
NASA Technical Reports Server (NTRS)
Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.
1992-01-01
The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
Software Tools on the Peregrine System | High-Performance Computing | NREL
NREL provides a variety of software tools on the Peregrine system, including debugging and performance-analysis tools for understanding the behavior of MPI applications (e.g., Intel VTune), an environment for statistical computing and graphics, and VirtualGL/TurboVNC for remote visualization and analytics.
Mass spectrometry as a quantitative tool in plant metabolomics
Jorge, Tiago F.; Mata, Ana T.
2016-01-01
Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization and quantification of this vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967
NASTRAN as an analytical research tool for composite mechanics and composite structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.
1976-01-01
Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.
Lee, Choong H; Flint, Jeremy J; Hansen, Brian; Blackband, Stephen J
2015-06-10
Magnetic resonance microscopy (MRM) is a non-invasive diagnostic tool which is well-suited to directly resolve cellular structures in ex vivo and in vitro tissues without use of exogenous contrast agents. Recent advances in its capability to visualize mammalian cellular structure in intact tissues have reinvigorated analytical interest in aquatic cell models whose previous findings warrant up-to-date validation of subcellular components. Even if the sensitivity of MRM is less than other microscopic technologies, its strength lies in that it relies on the same image contrast mechanisms as clinical MRI which make it a unique tool for improving our ability to interpret human diagnostic imaging through high resolution studies of well-controlled biological model systems. Here, we investigate the subcellular MR signal characteristics of isolated cells of Aplysia californica at an in-plane resolution of 7.8 μm. In addition, direct correlation and positive identification of subcellular architecture in the cells is achieved through well-established histology. We hope this methodology will serve as the groundwork for studying pathophysiological changes through perturbation studies and allow for development of disease-specific cellular modeling tools. Such an approach promises to reveal the MR contrast changes underlying cellular mechanisms in various human diseases, for example in ischemic stroke.
Mid-frequency Band Dynamics of Large Space Structures
NASA Technical Reports Server (NTRS)
Coppolino, Robert N.; Adams, Douglas S.
2004-01-01
High and low intensity dynamic environments experienced by a spacecraft during launch and on-orbit operations, respectively, induce structural loads and motions, which are difficult to reliably predict. Structural dynamics in low- and mid-frequency bands are sensitive to component interface uncertainty and non-linearity as evidenced in laboratory testing and flight operations. Analytical tools for prediction of linear system response are not necessarily adequate for reliable prediction of mid-frequency band dynamics and analysis of measured laboratory and flight data. A new MATLAB toolbox, designed to address the key challenges of mid-frequency band dynamics, is introduced in this paper. Finite-element models of major subassemblies are defined following rational frequency-wavelength guidelines. For computational efficiency, these subassemblies are described as linear, component mode models. The complete structural system model is composed of component mode subassemblies and linear or non-linear joint descriptions. Computation and display of structural dynamic responses are accomplished employing well-established, stable numerical methods, modern signal processing procedures and descriptive graphical tools. Parametric sensitivity and Monte-Carlo based system identification tools are used to reconcile models with experimental data and investigate the effects of uncertainties. Models and dynamic responses are exported for employment in applications, such as detailed structural integrity and mechanical-optical-control performance analyses.
Durham, Erin-Elizabeth A; Yu, Xiaxia; Harrison, Robert W
2014-12-01
Effective machine-learning handles large datasets efficiently. One key feature of handling large data is the use of databases such as MySQL. The freeware fuzzy decision tree induction tool, FDT, is a scalable supervised-classification software tool implementing fuzzy decision trees. It is based on an optimized fuzzy ID3 (FID3) algorithm. FDT 2.0 improves upon FDT 1.0 by bridging the gap between data science and data engineering: it combines a robust decisioning tool with data retention for future decisions, so that the tool does not need to be recalibrated from scratch every time a new decision is required. In this paper we briefly review the analytical capabilities of the freeware FDT tool and its major features and functionalities; examples of large biological datasets from HIV, microRNAs and sRNAs are included. This work shows how to integrate fuzzy decision algorithms with modern database technology. In addition, we show that integrating the fuzzy decision tree induction tool with database storage allows for optimal user satisfaction in today's Data Analytics world.
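A minimal sketch of the database-integration idea, using Python with the standard-library sqlite3 module as a stand-in for FDT's MySQL backend and a generic triangular fuzzy membership of the kind used in fuzzy ID3 variants; none of this is FDT's actual code or API.

    import sqlite3

    def tri_membership(x, left, peak, right):
        """Triangular fuzzy membership degree of x in [left, right]."""
        if x <= left or x >= right:
            return 0.0
        return (x - left) / (peak - left) if x <= peak else (right - x) / (right - peak)

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE samples (expr REAL, label TEXT)")
    conn.executemany("INSERT INTO samples VALUES (?, ?)",
                     [(0.2, "low"), (0.5, "mid"), (0.9, "high")])

    # Retained data can be re-queried for later decisions instead of
    # recalibrating the classifier from scratch each time.
    for expr, label in conn.execute("SELECT expr, label FROM samples"):
        print(label, tri_membership(expr, 0.0, 0.5, 1.0))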
Modeling Biodegradation and Reactive Transport: Analytical and Numerical Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Y; Glascoe, L
The computational modeling of the biodegradation of contaminated groundwater systems, accounting for biochemical reactions coupled to contaminant transport, is a valuable tool both for the field engineer/planner with limited computational resources and for the expert computational researcher less constrained by time and computer power. Several analytical and numerical computer models have been and are being developed to cover the practical needs put forth by users across this spectrum of computational demands. Generally, analytical models provide rapid and convenient screening tools running on very limited computational power, while numerical models can provide more detailed information with consequent requirements of greater computational time and effort. While these analytical and numerical computer models can provide accurate and adequate information to produce defensible remediation strategies, decisions based on inadequate modeling output or on over-analysis can have costly and risky consequences. In this chapter we consider both analytical and numerical modeling approaches to biodegradation and reactive transport. Both approaches are discussed and analyzed in terms of achieving bioremediation goals, recognizing that there is always a tradeoff between computational cost and the resolution of simulated systems.
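As an example of the screening-level analytical solutions discussed here, the classic Ogata-Banks solution for 1-D advection-dispersion with a continuous source can be evaluated directly; treating first-order biodegradation as a separate exp(-kt) factor, as below, is a common screening approximation rather than the exact coupled solution.

    import math

    def ogata_banks(x, t, v, D, c0=1.0):
        """1-D advection-dispersion, continuous source at x=0 (Ogata-Banks)."""
        s = 2.0 * math.sqrt(D * t)
        term1 = math.erfc((x - v * t) / s)
        term2 = math.exp(v * x / D) * math.erfc((x + v * t) / s)
        return 0.5 * c0 * (term1 + term2)

    # Illustrative parameters: x in m, t in days, v in m/day, D in m^2/day
    k = 0.01   # first-order decay constant (1/day), placeholder value
    c = ogata_banks(x=50.0, t=200.0, v=0.3, D=1.0) * math.exp(-k * 200.0)
    print(c)   # relative concentration C/C0 at the observation point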
Fan Du; Shneiderman, Ben; Plaisant, Catherine; Malik, Sana; Perer, Adam
2017-06-01
The growing volume and variety of data presents both opportunities and challenges for visual analytics. Addressing these challenges is needed for big data to provide valuable insights and novel solutions for business, security, social media, and healthcare. In the case of temporal event sequence analytics it is the number of events in the data and variety of temporal sequence patterns that challenges users of visual analytic tools. This paper describes 15 strategies for sharpening analytic focus that analysts can use to reduce the data volume and pattern variety. Four groups of strategies are proposed: (1) extraction strategies, (2) temporal folding, (3) pattern simplification strategies, and (4) iterative strategies. For each strategy, we provide examples of the use and impact of this strategy on volume and/or variety. Examples are selected from 20 case studies gathered from either our own work, the literature, or based on email interviews with individuals who conducted the analyses and developers who observed analysts using the tools. Finally, we discuss how these strategies might be combined and report on the feedback from 10 senior event sequence analysts.
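A toy sketch of two of the strategy groups named above (extraction and pattern simplification) applied to a hypothetical event log; the event types and values are invented for illustration only.

    from collections import Counter

    # Toy event log: (record_id, day, event). Values are illustrative.
    events = [(1, 0, "admit"), (1, 1, "lab"), (1, 1, "lab"), (1, 2, "discharge"),
              (2, 0, "admit"), (2, 3, "icu"), (2, 5, "discharge")]

    # Extraction strategy: keep only milestone event types to cut volume.
    milestones = {"admit", "icu", "discharge"}
    extracted = [e for e in events if e[2] in milestones]

    # Pattern simplification: collapse each record's events into an ordered
    # sequence, reducing pattern variety to a countable set.
    patterns = Counter()
    for rid in {e[0] for e in extracted}:
        seq = tuple(ev for _, _, ev in sorted(e for e in extracted if e[0] == rid))
        patterns[seq] += 1
    print(patterns)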
Diving deeper into Zebrafish development of social behavior: analyzing high resolution data.
Buske, Christine; Gerlai, Robert
2014-08-30
Vertebrate model organisms have been utilized in high throughput screening but only with substantial cost and human capital investment. The zebrafish is a vertebrate model species that is a promising and cost effective candidate for efficient high throughput screening. Larval zebrafish have already been successfully employed in this regard (Lessman, 2011), but adult zebrafish also show great promise. High throughput screening requires the use of a large number of subjects and collection of substantial amount of data. Collection of data is only one of the demanding aspects of screening. However, in most screening approaches that involve behavioral data the main bottleneck that slows throughput is the time consuming aspect of analysis of the collected data. Some automated analytical tools do exist, but often they only work for one subject at a time, eliminating the possibility of fully utilizing zebrafish as a screening tool. This is a particularly important limitation for such complex phenotypes as social behavior. Testing multiple fish at a time can reveal complex social interactions but it may also allow the identification of outliers from a group of mutagenized or pharmacologically treated fish. Here, we describe a novel method using a custom software tool developed within our laboratory, which enables tracking multiple fish, in combination with a sophisticated analytical approach for summarizing and analyzing high resolution behavioral data. This paper focuses on the latter, the analytic tool, which we have developed using the R programming language and environment for statistical computing. We argue that combining sophisticated data collection methods with appropriate analytical tools will propel zebrafish into the future of neurobehavioral genetic research. Copyright © 2014. Published by Elsevier B.V.
ANALYTICAL TOOLS FOR GROUNDWATER POLLUTION ASSESSMENT
This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of groundwater buffer strips. The indices describe the leaching of solutes below the root zone (mass f...
Structuring modeling and simulation analysis for evacuation planning and operations.
DOT National Transportation Integrated Search
2009-06-01
This document is intended to provide guidance to decision-makers at agencies and jurisdictions considering the role of analytical tools in evacuation planning and operations. It is often unclear what kind of analytical approach may be of most value, ...
NASA Astrophysics Data System (ADS)
James, C. M.; Gildfind, D. E.; Lewis, S. W.; Morgan, R. G.; Zander, F.
2018-03-01
Expansion tubes are an important type of test facility for the study of planetary entry flow-fields, being the only type of impulse facility capable of simulating the aerothermodynamics of superorbital planetary entry conditions from 10 to 20 km/s. However, the complex flow processes involved in expansion tube operation make it difficult to fully characterise flow conditions, with two-dimensional full facility computational fluid dynamics simulations often requiring tens or hundreds of thousands of computational hours to complete. In an attempt to simplify this problem and provide a rapid flow condition prediction tool, this paper presents a validated and comprehensive analytical framework for the simulation of an expansion tube facility. It identifies central flow processes and models them from state to state through the facility using established compressible and isentropic flow relations, and equilibrium and frozen chemistry. How the model simulates each section of an expansion tube is discussed, as well as how the model can be used to simulate situations where flow conditions diverge from ideal theory. The model is then validated against experimental data from the X2 expansion tube at the University of Queensland.
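The framework described chains established compressible-flow relations from state to state through the facility. A minimal sketch of one such building block, the normal-shock jump relations for a calorically perfect gas, is shown below; the full model also handles unsteady expansions and equilibrium chemistry, which are omitted here, and the Mach number and ratio of specific heats are illustrative.

```python
# Normal-shock jump relations for a calorically perfect gas: one of the
# standard state-to-state building blocks of an analytical facility model.
def normal_shock(M1, gamma=1.4):
    """Return post-shock Mach number and p2/p1, T2/T1, rho2/rho1 ratios."""
    p_ratio = 1.0 + 2.0 * gamma / (gamma + 1.0) * (M1**2 - 1.0)
    rho_ratio = (gamma + 1.0) * M1**2 / ((gamma - 1.0) * M1**2 + 2.0)
    T_ratio = p_ratio / rho_ratio
    M2 = (((gamma - 1.0) * M1**2 + 2.0)
          / (2.0 * gamma * M1**2 - (gamma - 1.0))) ** 0.5
    return M2, p_ratio, T_ratio, rho_ratio

# e.g. a strong primary shock (illustrative Mach number, not X2 data)
print(normal_shock(M1=7.0))
```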
Scaled Particle Theory for Multicomponent Hard Sphere Fluids Confined in Random Porous Media.
Chen, W; Zhao, S L; Holovko, M; Chen, X S; Dong, W
2016-06-23
The formulation of scaled particle theory (SPT) is presented for a quite general model of fluids confined in a random porous medium, i.e., a multicomponent hard sphere (HS) fluid in a multicomponent hard sphere or a multicomponent overlapping hard sphere (OHS) matrix. The analytical expressions for pressure, Helmholtz free energy, and chemical potential are derived. The thermodynamic consistency of the proposed theory is established. Moreover, we show that there is an isomorphism between the SPT for a multicomponent system and that for a one-component system. Results from grand canonical ensemble Monte Carlo simulations are also presented for a binary HS mixture in a one-component HS or a one-component OHS matrix. The accuracy of various variants derived from the basic SPT formulation is appraised against the simulation results. Scaled particle theory, initially formulated for a bulk HS fluid, has not only provided an analytical tool for calculating the thermodynamic properties of HS fluids but has also yielded very useful insight for elaborating other theoretical approaches such as the fundamental measure theory (FMT). We expect that the general SPT for multicomponent systems developed in this work can contribute to the study of confined fluids in a similar way.
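In the one-component bulk limit, SPT reduces to a simple closed-form compressibility factor, Z = (1 + η + η²)/(1 − η)³, which coincides with the Percus-Yevick compressibility equation of state; the multicomponent confined expressions in the paper generalize this form. A minimal sketch:

```python
# SPT compressibility factor Z = beta*P/rho for a one-component bulk
# hard-sphere fluid at packing fraction eta (identical to the
# Percus-Yevick compressibility EOS).
def spt_compressibility_factor(eta):
    return (1.0 + eta + eta**2) / (1.0 - eta) ** 3

for eta in (0.1, 0.3, 0.45):
    print(eta, spt_compressibility_factor(eta))
```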
NASA Astrophysics Data System (ADS)
Kerlin, Steven C.; Carlsen, William S.; Kelly, Gregory J.; Goehring, Elizabeth
2013-08-01
Global Learning Communities (GLCs) were studied to identify potential benefits of online technologies that facilitate communication and scientific data sharing outside the normal classroom setting. 1,419 students in 635 student groups began the instructional unit. Students represented the classrooms of 33 teachers from the USA, 6 from Thailand, 7 from Australia, and 4 from Germany. Data from an international environmental education project were analyzed to describe grades 7-9 student scientific writing in domestic US versus international-US classroom online partnerships. An argument analytic was developed, and a research model of exploratory data analysis followed by statistical testing was used, to discover and highlight different ways students used evidence to support their scientific claims about temperature variation at school sites and deep-sea hydrothermal vents. Findings show modest gains in the use of some evidentiary discourse components by US students in international online class partnerships compared to their US counterparts in domestic US partnerships. The analytic, research model, and online collaborative learning tools may be used in other large-scale studies and learning communities. Results provide insights about the benefits of using online technologies and promote the establishment of GLCs.
Vial, Jérôme; Pezous, Benoît; Thiébaut, Didier; Sassiat, Patrick; Teillet, Béatrice; Cahours, Xavier; Rivals, Isabelle
2011-01-30
GCxGC is now recognized as the analytical technique best suited to the characterization of complex mixtures of volatile compounds; it is implemented worldwide in academic and industrial laboratories. However, for comprehensive analysis of non-target analytes, going beyond visual examination of the color plots remains challenging for most users. We propose a strategy that aims at classifying chromatograms according to the chemical composition of the samples while determining the origin of the discrimination between different classes of samples: the discriminant pixel approach. After data pre-processing and time alignment, the discriminatory power of each chromatogram pixel for a given class was defined as its correlation with membership to this class. Using a peak finding algorithm, the most discriminant pixels were then linked to chromatographic peaks. Finally, crosschecking with mass spectrometry data made it possible to establish relationships with compounds that could consequently be considered candidate class markers. This strategy was applied to a large experimental data set of 145 GCxGC-MS chromatograms of tobacco extracts corresponding to three distinct classes of tobacco. Copyright © 2010 Elsevier B.V. All rights reserved.
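The core scoring step as described, correlating each pixel with class membership, can be sketched as below. This omits the pre-processing, time alignment, peak finding and MS crosschecking of the full pipeline, and the data shapes are illustrative.

```python
# Discriminatory power of each chromatogram pixel for a class, defined as
# its Pearson correlation with binary class membership across samples.
import numpy as np

def discriminant_pixels(chromatograms, labels, target_class):
    """chromatograms: (n_samples, n_pixels) flattened, aligned GCxGC images.
    labels: (n_samples,) class labels.
    Returns per-pixel correlation with membership to target_class."""
    y = (labels == target_class).astype(float)
    y = (y - y.mean()) / y.std()
    X = chromatograms - chromatograms.mean(axis=0)
    X /= X.std(axis=0) + 1e-12             # guard against flat pixels
    return (X * y[:, None]).mean(axis=0)   # Pearson r per pixel

rng = np.random.default_rng(1)
X = rng.random((145, 5000))                # 145 chromatograms, flattened
labels = rng.integers(0, 3, size=145)      # three tobacco classes
r = discriminant_pixels(X, labels, target_class=0)
print(r.argmax(), r.max())
```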
Wine biotechnology in South Africa: towards a systems approach to wine science.
Moore, John P; Divol, Benoit; Young, Philip R; Nieuwoudt, Hélène H; Ramburan, Viresh; du Toit, Maret; Bauer, Florian F; Vivier, Melané A
2008-11-01
The wine industry in South Africa is over three centuries old and over the last decade has reemerged as a significant competitor in world wine markets. The Institute for Wine Biotechnology (IWBT) was established in partnership with the Department of Viticulture and Oenology at Stellenbosch University to foster fundamental research in the wine sciences leading to applications in the broader wine and grapevine industries. This review focuses on the different research programmes of the Institute (grapevine, yeast and bacteria biotechnology programmes, and chemical-analytical research), commercialisation activities (SunBio) and new initiatives to integrate the various research disciplines. An important focus of future research is the Wine Science Research Niche Area programme, which connects the different research thrusts of the IWBT and of several research partners in viticulture, oenology, food science and chemistry. This 'Functional Wine-omics' programme uses a systems biology approach to wine-related organisms. The data generated within the programme will be integrated with other data sets from viticulture, oenology, analytical chemistry and the sensory sciences through chemometrics and other statistical tools. The aim of the programme is to model aspects of the winemaking process, from the vineyard to the finished product.
Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.
Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà
2017-10-01
Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.
Analytical Tools Interface for Landscape Assessments
Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to implemen...
SolarPILOT | Concentrating Solar Power | NREL
Unlike exclusively ray-tracing tools, SolarPILOT runs an analytical simulation engine alongside a ray-tracing core for more detailed simulations; the ray-tracing core is based on the SolTrace simulation engine.
The sweet and sour of serological glycoprotein tumor biomarker quantification
2013-01-01
Aberrant and dysregulated protein glycosylation is a well-established event in the process of oncogenesis and cancer progression. Years of study on the glycobiology of cancer have been focused on the development of clinically viable diagnostic applications of this knowledge. However, for a number of reasons, there has been only sparse and varied success. The causes of this range from technical to biological issues that arise when studying protein glycosylation and attempting to apply it to practical applications. This review focuses on the pitfalls, advances, and future directions to be taken in the development of clinically applicable quantitative assays using glycan moieties from serum-based proteins as analytes. Topics covered include the development and progress of applications of lectins, mass spectrometry, and other technologies towards this purpose. Slowly but surely, novel applications of established and development of new technologies will eventually provide us with the tools to reach the ultimate goal of quantification of the full scope of heterogeneity associated with the glycosylation of biomarker candidate glycoproteins in a clinically applicable fashion. PMID:23390961
Solar dynamic power for the Space Station
NASA Technical Reports Server (NTRS)
Archer, J. S.; Diamant, E. S.
1986-01-01
This paper describes a computer code which provides a significant advance in the systems analysis capabilities for solar dynamic power modules. While the code can be used to advantage in the preliminary analysis of terrestrial solar dynamic modules, its real value lies in the adaptations which make it particularly useful for the conceptualization of optimized power modules for space applications. In particular, as illustrated in the paper, the code can be used to establish optimum values of concentrator diameter, concentrator surface roughness, concentrator rim angle and receiver aperture corresponding to the main heat cycle options - Organic Rankine and Brayton - and for certain receiver design options. The code can also be used to establish system sizing margins to account for the loss of reflectivity in orbit or the seasonal variation of insolation. By simulating the interactions among the major components of a solar dynamic module, and through simplified formulations of the major thermal-optic-thermodynamic interactions, the code adds a powerful, efficient and economical analytical tool to the repertory of techniques available for the design of advanced space power systems.
Pseudotemporal Ordering of Single Cells Reveals Metabolic Control of Postnatal β Cell Proliferation.
Zeng, Chun; Mulas, Francesca; Sui, Yinghui; Guan, Tiffany; Miller, Nathanael; Tan, Yuliang; Liu, Fenfen; Jin, Wen; Carrano, Andrea C; Huising, Mark O; Shirihai, Orian S; Yeo, Gene W; Sander, Maike
2017-05-02
Pancreatic β cell mass for appropriate blood glucose control is established during early postnatal life. β cell proliferative capacity declines postnatally, but the extrinsic cues and intracellular signals that cause this decline remain unknown. To obtain a high-resolution map of β cell transcriptome dynamics after birth, we generated single-cell RNA-seq data of β cells from multiple postnatal time points and ordered cells based on transcriptional similarity using a new analytical tool. This analysis captured signatures of immature, proliferative β cells and established high expression of amino acid metabolic, mitochondrial, and Srf/Jun/Fos transcription factor genes as their hallmark feature. Experimental validation revealed high metabolic activity in immature β cells and a role for reactive oxygen species and Srf/Jun/Fos transcription factors in driving postnatal β cell proliferation and mass expansion. Our work provides the first high-resolution molecular characterization of state changes in postnatal β cells and paves the way for the identification of novel therapeutic targets to stimulate β cell regeneration. Copyright © 2017 Elsevier Inc. All rights reserved.
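The paper's ordering tool is more sophisticated than can be reproduced here; as a deliberately minimal stand-in for the idea of ordering cells by transcriptional similarity, the sketch below projects a cell-by-gene matrix onto its first principal component and sorts cells along it, on invented data.

```python
# Minimal pseudo-ordering sketch (not the paper's tool): order cells along
# the first principal component of the expression matrix.
import numpy as np

def pseudotime_order(expr):
    """expr: (n_cells, n_genes) log-normalized expression matrix."""
    X = expr - expr.mean(axis=0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    pc1 = X @ Vt[0]               # score of each cell on PC1
    return np.argsort(pc1)        # cells ordered along the main axis

rng = np.random.default_rng(0)
expr = rng.poisson(2.0, size=(500, 2000)).astype(float)  # toy data
print(pseudotime_order(expr)[:10])
```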
2014-01-01
Biomarker research is continuously expanding in the field of clinical proteomics. A combination of different proteomic-based methodologies can be applied depending on the specific clinical context of use. Moreover, current advancements in proteomic analytical platforms are leading to an expansion of the biomarker candidates that can be identified. Specifically, mass spectrometric techniques could provide highly valuable tools for biomarker research. Ideally, these advances could provide biomarkers that are clinically applicable for disease diagnosis and/or prognosis. Unfortunately, biomarker candidates generally fail to be implemented in clinical decision making. To improve on this situation, a well-defined study design driven by a clear clinical need has to be established, and several checkpoints between the different phases of discovery, verification and validation have to be passed in order to increase the probability of establishing valid biomarkers. In this review, we summarize the technical proteomic platforms that are available along the different stages of the biomarker discovery pipeline, exemplified by clinical applications in the field of bladder cancer biomarker research. PMID:24679154
Schermeyer, Marie-Therese; Wöll, Anna K.; Eppink, Michel; Hubbuch, Jürgen
2017-01-01
High protein titers are gaining importance in the biopharmaceutical industry. A major challenge in the development of highly concentrated mAb solutions is their long-term stability and often incalculable viscosity. The complexity of the molecule itself, as well as the various molecular interactions, makes it difficult to describe their solution behavior. To study formulation stability, long- and short-range interactions and the formation of complex network structures have to be taken into account. For a better understanding of highly concentrated solutions, we combined established and novel analytical tools to characterize the effect of solution properties on the stability of highly concentrated mAb formulations. In this study, monoclonal antibody solutions in a concentration range of 50–200 mg/ml at pH 5–9, with and without glycine, PEG4000, and Na2SO4, were analyzed. To determine the monomer content, analytical size-exclusion chromatography runs were performed. ζ-potential measurements were conducted to analyze the electrophoretic properties in different solutions. The melting and aggregation temperatures were determined with the help of fluorescence and static light scattering measurements. Additionally, rheological measurements were conducted to study the solution viscosity and viscoelastic behavior of the mAb solutions. The analytical parameters so determined were scored and merged in an analytical toolbox. The resulting scoring was then successfully correlated with long-term storage (40 d of incubation) experiments. Our results indicate that the sensitivity of complex rheological measurements, in combination with the applied techniques, allows reliable statements to be made with respect to the effect of solution properties, such as protein concentration, ionic strength, and pH shift, on the strength of protein-protein interaction and solution colloidal stability. PMID:28617076
Pinder, Nadine; Brenner, Thorsten; Swoboda, Stefanie; Weigand, Markus A; Hoppe-Tichy, Torsten
2017-09-05
Therapeutic drug monitoring (TDM) is a useful tool to optimize antibiotic therapy. Increasing interest in alternative dosing strategies for beta-lactam antibiotics, e.g. continuous or prolonged infusion, requires a feasible analytical method for quantification of these antimicrobial agents. However, pre-analytical issues including sample handling and stability must be considered to provide valuable analytical results. For the simultaneous determination of piperacillin, meropenem, ceftazidime and flucloxacillin, a high performance liquid chromatography (HPLC) method including protein precipitation was established, utilizing ertapenem as the internal standard. Long-term stability of stock solutions and plasma samples was monitored. Furthermore, whole blood stability of the analytes in heparinized blood tubes was investigated, comparing storage under ambient conditions and at 2-8°C. A calibration range of 5-200 μg/ml (piperacillin, ceftazidime, flucloxacillin) and 2-200 μg/ml (meropenem) was linear with r² > 0.999; precision and inaccuracy were <9% and <11%, respectively. The successfully validated HPLC assay was applied to clinical samples and stability investigations. At -80°C, plasma samples were stable for 9 months (piperacillin, meropenem) or 13 months (ceftazidime, flucloxacillin). Concentrations of the four beta-lactam antibiotics in whole blood tubes were found to remain within specifications for 8 h when stored at 2-8°C, but not at room temperature. The presented method is a rapid and simple option for routine TDM of piperacillin, meropenem, ceftazidime and flucloxacillin. Whereas long-term storage of beta-lactam samples at -80°C is possible for at least 9 months, whole blood tubes should be kept refrigerated until analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
Marom, Ziv; Shtein, Ilana; Bar-On, Benny
2017-01-01
Stomata are pores on the leaf surface, which are formed by a pair of curved, tubular guard cells; an increase in turgor pressure deforms the guard cells, resulting in the opening of the stomata. Recent studies employed numerical simulations, based on experimental data, to analyze the effects of various structural, chemical, and mechanical features of the guard cells on the stomatal opening characteristics; these studies all support the well-known qualitative observation that the mechanical anisotropy of the guard cells plays a critical role in stomatal opening. Here, we propose a computationally based analytical model that quantitatively establishes the relations between the degree of anisotropy of the guard cell, the bio-composite constituents of the cell wall, and the aperture and area of stomatal opening. The model introduces two non-dimensional key parameters that dominate the guard cell deformations—the inflation driving force and the anisotropy ratio—and it serves as a generic framework that is not limited to specific plant species. The modeling predictions are in line with a wide range of previous experimental studies, and its analytical formulation sheds new light on the relations between the structure, mechanics, and function of stomata. Moreover, the model provides an analytical tool to back-calculate the elastic characteristics of the matrix that composes the guard cell walls, which, to the best of our knowledge, cannot be probed by direct nano-mechanical experiments; indeed, the estimations of our model are in good agreement with recently published results of independent numerical optimization schemes. The emerging insights from the stomatal structure-mechanics “design guidelines” may promote the development of miniature, yet complex, multiscale composite actuation mechanisms for future engineering platforms. PMID:29312365
A results-based process for evaluation of diverse visual analytics tools
NASA Astrophysics Data System (ADS)
Rubin, Gary; Berger, David H.
2013-05-01
With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.
CancerLectinDB: a database of lectins relevant to cancer.
Damodaran, Deepa; Jeyakani, Justin; Chauhan, Alok; Kumar, Nirmal; Chandra, Nagasuma R; Surolia, Avadhesha
2008-04-01
The role of lectins in mediating cancer metastasis, apoptosis as well as various other signaling events has been well established in the past few years. Data on various aspects of the role of lectins in cancer is being accumulated at a rapid pace. The data on lectins available in the literature is so diverse that it becomes difficult and time-consuming, if not impossible, to comprehend the advances in various areas and obtain the maximum benefit. Not only do the lectins vary significantly in their individual functional roles, but they are also diverse in their sequences, structures, binding site architectures, quaternary structures, carbohydrate affinities and specificities as well as their potential applications. An organization of these seemingly independent data into a common framework is essential in order to achieve effective use of all the data towards understanding the roles of different lectins in different aspects of cancer and any resulting applications. An integrated knowledge base (CancerLectinDB) together with appropriate analytical tools has therefore been developed for lectins relevant to any aspect of cancer, by collating and integrating diverse data. This database is unique in terms of providing sequence, structural, and functional annotations for lectins from all known sources in cancer and is expected to be a useful addition to the number of glycan related resources now available to the community. The database has been implemented using MySQL on a Linux platform and web-enabled using Perl-CGI and Java tools. Data for individual lectins pertain to taxonomic, biochemical, domain architecture, molecular sequence and structural details as well as carbohydrate specificities. Extensive links have also been provided for relevant bioinformatics resources and analytical tools. Availability of diverse data integrated into a common framework is expected to be of high value for various studies on lectin cancer biology. CancerLectinDB can be accessed through http://proline.physics.iisc.ernet.in/cancerdb .
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hayden, D. W.
This project will develop an analytical tool to calculate the performance of HMX-based PBXs in the skid test. The skid test is used as a means to measure sensitivity for large charges in handling situations. Each series of skid tests requires dozens of drops of large billets. It is proposed that the reaction (or lack of one) of PBXs in the skid test is governed by the mechanical properties of the binder. If true, one might be able to develop an analytical tool to estimate skid-test behavior for new PBX formulations. Others over the past 50 years have tried to develop similar models. This project will research and summarize the works of others and couple the work of three of them into an analytical tool that can be run on a PC to calculate the drop height of HMX-based PBXs. Detonation due to dropping a billet is argued to be a dynamic thermal event. To avoid detonation, the heat created due to friction at impact must be conducted into the charge or the target faster than the chemical kinetics can create additional energy. The methodology will involve numerically solving the Frank-Kamenetskii equation in one dimension. The analytical problem needs to be bounded in terms of how much heat is introduced to the billet and for how long. Assuming an inelastic collision with no rebound, the billet will be in contact with the target for a short duration determined by the equations of motion. For the purposes of the calculations, it will be assumed that if a detonation is to occur, it will transpire within that time. The surface temperature will be raised according to the friction created, using the equations of motion of dropping the billet on a rigid surface. The study will connect the works of Charles Anderson, Alan Randolph, Larry Hatler, Alfonse Popolato, and Charles Mader into a single PC-based analytic tool. Anderson's equations of motion will be used to calculate the temperature rise upon impact, the time this temperature is maintained (contact time) will be obtained from the work of Hatler et al., and the reactive temperature rise will be obtained from Mader's work. Finally, the assessment of when a detonation occurs will be derived from Bowden and Yoffe's thermal explosion theory (hot spot).
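A minimal sketch of the numerical core described above, 1-D transient heat conduction with an Arrhenius source (a Frank-Kamenetskii-type problem) solved by explicit finite differences, is given below. All material constants are illustrative placeholders, not HMX data, and the friction heating is idealized as a fixed surface temperature held for the contact time.

```python
# 1-D reactive heat conduction: dT/dt = alpha * d2T/dx2 + Qk * exp(-E_R/T).
# The surface is held at a friction-heated temperature during the impact
# contact time, then treated as insulated. Constants are illustrative only.
import numpy as np

alpha = 1e-7        # thermal diffusivity [m^2/s]
Qk    = 1e12        # (Q*A)/(rho*c): reaction heating pre-factor [K/s]
E_R   = 2.0e4       # activation energy over gas constant [K]
L, nx = 0.5e-3, 101 # slab depth [m], grid points
dx    = L / (nx - 1)
dt    = 0.4 * dx**2 / alpha            # within explicit stability limit
T     = np.full(nx, 300.0)             # initial temperature [K]
T_surface, t_contact = 700.0, 1e-3     # friction heating, contact time [s]

t = 0.0
while t < 10e-3:
    T[0] = T_surface if t < t_contact else T[1]   # heated, then insulated
    source = Qk * np.exp(-E_R / T)                # Arrhenius self-heating
    T[1:-1] += dt * (alpha * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
                     + source[1:-1])
    T[-1] = T[-2]                                 # insulated far boundary
    if T.max() > 1500.0:                          # crude runaway criterion
        print(f"thermal runaway at t = {t:.2e} s")
        break
    t += dt
else:
    print(f"no runaway; peak T = {T.max():.0f} K")
```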
Analytical fingerprint for tantalum ores from African deposits
NASA Astrophysics Data System (ADS)
Melcher, F.; Graupner, T.; Sitnikova, M.; Oberthür, T.; Henjes-Kunst, F.; Gäbler, E.; Rantitsch, G.
2009-04-01
Illegal mining of gold, diamonds, copper, cobalt and, in the last decade, "coltan" has fuelled ongoing armed conflicts and civil war in a number of African countries. Following the United Nations initiative to fingerprint the origin of conflict materials and to develop a traceability system, our working group is investigating "coltan" (i.e. columbite-tantalite) mineralization especially in Africa, also within the wider framework of establishing certified trading chains (CTC). Special attention is directed towards samples from the main Ta-Nb-Sn provinces in Africa: DR Congo, Rwanda, Mozambique, Ethiopia, Egypt and Namibia. The following factors are taken into consideration in a methodological approach capable of distinguishing the origin of tantalum ores and concentrates with the highest possible probability: (1) Quality and composition of coltan concentrates vary considerably. (2) Mineralogical and chemical compositions of Ta-Nb ores are extremely complex due to the wide range of the columbite-tantalite solid solution series and its ability to incorporate many additional elements. (3) Coltan concentrates may contain a number of other tantalum-bearing minerals besides columbite-tantalite. In our approach, coltan concentrates are analyzed in a step-by-step mode. State-of-the-art analytical tools employed are automated scanning electron microscopy (Mineral Liberation Analysis; MLA), electron microprobe analysis (major and trace elements), laser ablation-ICP-MS (trace elements, isotopes), and TIMS (U-Pb dating). Mineral assemblages in the ore concentrates, major and trace element concentration patterns, and zoning characteristics in the different pegmatites from Africa distinctly differ from each other. Chondrite-normalized REE distribution patterns vary significantly between columbite, tantalite, and microlite, and also relative to major element compositions of columbites. Some locations are characterized by low REE concentrations, others are highly enriched. Samples with Kibaran age either show flat patterns for most tantalites, rising values from the LREE to the HREE, or trough-like patterns. Eu anomalies are strongly negative in columbite-tantalite from the Alto Ligonha Province in Mozambique, from the Namaqualand Province (Namibia, South Africa), and from Zimbabwe. Four main age populations of coltan deposits in Africa were revealed: (1) Archean (>2.5 Ga), (2) Paleoproterozoic (2.1-1.9 Ga), (3) early Neoproterozoic ("Kibaran", 1.0-0.9 Ga), and (4) late Neoproterozoic to early Paleozoic (Pan-African; ca. 0.6-0.4 Ga). Currently, we focus on the resolution of the fingerprinting system from region via ore province down to deposit scale, establishing a large and high-quality analytical data base, and developing fast-screening and low-cost methods. Analytical flow-charts and identification schemes for coltan ores will be presented at the Conference. The analytical results obtained so far indicate that a certification scheme including fingerprinting of sources of coltan ores is feasible. The methodology developed is capable of assisting in the establishment of a control instrument in an envisaged certification of the production and trade chain of coltan.
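The chondrite-normalized REE patterns and Eu anomalies referred to above follow standard geochemical formulas: each measured concentration is divided by a chondrite reference value, and the Eu anomaly is Eu/Eu* = Eu_N / sqrt(Sm_N * Gd_N). The sketch below uses approximate CI-chondrite reference values (after McDonough & Sun, 1995) and invented sample concentrations.

```python
# Chondrite normalization and Eu anomaly, Eu/Eu* = Eu_N / sqrt(Sm_N * Gd_N).
# Reference values are approximate CI-chondrite abundances; the sample
# concentrations are invented for illustration.
import math

CHONDRITE_PPM = {"Sm": 0.148, "Eu": 0.0563, "Gd": 0.199}

def eu_anomaly(sample_ppm):
    n = {el: sample_ppm[el] / CHONDRITE_PPM[el] for el in CHONDRITE_PPM}
    return n["Eu"] / math.sqrt(n["Sm"] * n["Gd"])

# A strongly negative Eu anomaly (Eu/Eu* << 1), of the kind reported for
# columbite-tantalite from the Alto Ligonha Province:
print(eu_anomaly({"Sm": 2.0, "Eu": 0.05, "Gd": 3.0}))
```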
Li, Dong-tao; Ling, Chang-quan; Zhu, De-zeng
2007-07-01
The objective was to establish a quantitative model for evaluating the degree of the TCM basic syndromes often encountered in patients with primary liver cancer (PLC). Medical literature concerning the clinical investigation and TCM syndromes of PLC was collected and analyzed using an expert symposium method, and 100-millimeter scaling was applied, in combination with scoring of symptom severity, to establish a quantitative criterion for classifying the degree of symptoms and signs in patients with PLC. Two models, i.e. the additive model and the additive-multiplicative model, were established by using the comprehensive analytic hierarchy process (AHP) as the mathematical tool to estimate the weights of the criteria for evaluating basic syndromes in the various layers by specialists. The two models were then verified in clinical practice, and the outcomes were compared with the fuzzy evaluations made by specialists. Verification on 459 case-times of PLC showed that the coincidence rate between the outcomes derived from specialists and those from the additive model was 84.53%, versus 62.75% for the additive-multiplicative model; the difference between the two was statistically significant (P<0.01). It was concluded that the additive model is the model best suited for quantitative evaluation of the degree of TCM basic syndromes in patients with PLC.
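The AHP weighting step can be illustrated with a standard computation: criterion weights are taken from the principal eigenvector of a reciprocal pairwise-comparison matrix, checked with Saaty's consistency ratio. The comparison values below are hypothetical, not the specialists' judgments from the study.

```python
# AHP: weights from the principal eigenvector of a pairwise-comparison
# matrix, with Saaty's consistency ratio as a sanity check.
import numpy as np

def ahp_weights(A):
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                    # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                # normalized weights
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)           # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.9, 5: 1.12}[n]  # Saaty's random index
    return w, (ci / ri if ri else 0.0)

# Three hypothetical criteria compared pairwise on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_weights(A)
print("weights:", w.round(3), "consistency ratio:", round(cr, 3))
```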
Buschmann, Dominik; Haberberger, Anna; Kirchner, Benedikt; Spornraft, Melanie; Riedmaier, Irmgard; Schelling, Gustav; Pfaffl, Michael W.
2016-01-01
Small RNA-Seq has emerged as a powerful tool in transcriptomics, gene expression profiling and biomarker discovery. Sequencing cell-free nucleic acids, particularly microRNA (miRNA), from liquid biopsies additionally provides exciting possibilities for molecular diagnostics, and might help establish disease-specific biomarker signatures. The complexity of the small RNA-Seq workflow, however, bears challenges and biases that researchers need to be aware of in order to generate high-quality data. Rigorous standardization and extensive validation are required to guarantee reliability, reproducibility and comparability of research findings. Hypotheses based on flawed experimental conditions can be inconsistent and even misleading. Comparable to the well-established MIQE guidelines for qPCR experiments, this work aims at establishing guidelines for experimental design and pre-analytical sample processing, standardization of library preparation and sequencing reactions, as well as facilitating data analysis. We highlight bottlenecks in small RNA-Seq experiments, point out the importance of stringent quality control and validation, and provide a primer for differential expression analysis and biomarker discovery. Following our recommendations will encourage better sequencing practice, increase experimental transparency and lead to more reproducible small RNA-Seq results. This will ultimately enhance the validity of biomarker signatures, and allow reliable and robust clinical predictions. PMID:27317696
Manipulability, force, and compliance analysis for planar continuum manipulators
NASA Technical Reports Server (NTRS)
Gravagne, Ian A.; Walker, Ian D.
2002-01-01
Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.
Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.
2016-01-01
A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
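As a generic illustration (not the PyCycle/OpenMDAO API) of why analytic derivatives matter for gradient-based optimization: finite differences trade truncation error against round-off error as the step size shrinks, whereas an analytic gradient is exact to machine precision. The toy function below is invented.

```python
# Forward-difference error vs. an exact analytic derivative on a toy
# response function: neither very large nor very small steps are safe.
import math

f = lambda x: math.exp(x) * math.sin(x)               # toy "cycle" response
df_analytic = lambda x: math.exp(x) * (math.sin(x) + math.cos(x))

x0 = 1.3
for h in (1e-2, 1e-6, 1e-12):
    df_fd = (f(x0 + h) - f(x0)) / h                   # forward difference
    print(f"h={h:.0e}  fd error = {abs(df_fd - df_analytic(x0)):.2e}")
```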
Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.
Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen
2015-10-01
Surface plasmon resonance (SPR) is a widely used, affinity based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
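The coupled non-linear ODEs of the bivalent analyte model can be written in one common convention (statistical factors of 2 vary between software packages) and integrated numerically, as sketched below. A is the constant injected analyte concentration, L the free ligand, AL and AL2 the singly and doubly bound complexes, and the SPR response is proportional to AL + AL2. All rate constants are illustrative.

```python
# Bivalent analyte model: A + L <-> AL (ka1, kd1); AL + L <-> AL2 (ka2, kd2).
# Free ligand is conserved: L = Lmax - AL - 2*AL2. Parameters illustrative.
import numpy as np
from scipy.integrate import solve_ivp

ka1, kd1 = 1e5, 1e-2     # first-step association/dissociation
ka2, kd2 = 1e-3, 5e-3    # second-step (surface) association/dissociation
A, Lmax = 50e-9, 1.0     # analyte conc. [M], total ligand (arbitrary units)

def rhs(t, y):
    AL, AL2 = y
    L = Lmax - AL - 2.0 * AL2
    dAL = ka1 * A * L - kd1 * AL - ka2 * AL * L + kd2 * AL2
    dAL2 = ka2 * AL * L - kd2 * AL2
    return [dAL, dAL2]

sol = solve_ivp(rhs, (0.0, 300.0), [0.0, 0.0], dense_output=True)
AL, AL2 = sol.y
print("final response ~", (AL[-1] + AL2[-1]).round(4))
```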
Exploring the Potential of Predictive Analytics and Big Data in Emergency Care.
Janke, Alexander T; Overbeek, Daniel L; Kocher, Keith E; Levy, Phillip D
2016-02-01
Clinical research often focuses on resource-intensive causal inference, whereas the potential of predictive analytics with constantly increasing big data sources remains largely unexplored. Basic prediction, divorced from causal inference, is much easier with big data. Emergency care may benefit from this simpler application of big data. Historically, predictive analytics have played an important role in emergency care as simple heuristics for risk stratification. These tools generally follow a standard approach: parsimonious criteria, easy computability, and independent validation with distinct populations. Simplicity in a prediction tool is valuable, but technological advances make it no longer a necessity. Emergency care could benefit from clinical predictions built using data science tools with the abundant potential input variables available in electronic medical records. Patients' risks could be stratified more precisely with large pools of data and lower resource requirements for comparing each clinical encounter to those that came before it, benefiting clinical decision-making and health system operations. The largest value of predictive analytics comes early in the clinical encounter, when diagnostic and prognostic uncertainty are high and resource-committing decisions need to be made. We propose an agenda for widening the application of predictive analytics in emergency care. Throughout, we express cautious optimism, because there are myriad challenges related to database infrastructure, practitioner uptake, and patient acceptance. The quality of routinely compiled clinical data will remain an important limitation. Complementing big data sources with prospective data may be necessary if predictive analytics are to achieve their full potential to improve care quality in the emergency department. Copyright © 2015 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
Data and Tools | Concentrating Solar Power | NREL
Solar Power tower Integrated Layout and Optimization Tool (SolarPILOT(tm)): the SolarPILOT code combines the rapid layout and optimization capability of the analytical DELSOL3 program with the accuracy and ...
Validation of high throughput sequencing and microbial forensics applications.
Budowle, Bruce; Connell, Nancy D; Bielecka-Oder, Anna; Colwell, Rita R; Corbett, Cindi R; Fletcher, Jacqueline; Forsman, Mats; Kadavy, Dana R; Markotic, Alemka; Morse, Stephen A; Murch, Randall S; Sajantila, Antti; Schmedes, Sarah E; Ternus, Krista L; Turner, Stephen D; Minot, Samuel
2014-01-01
High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results allow analysis and interpretation, significantly influencing the course and/or focus of an investigation, and can impact the response of the government to an attack having individual, political, economic or military consequences. Interpretation of the results of microbial forensic analyses relies on understanding the performance and limitations of HTS methods, including analytical processes, assays and data interpretation. The utility of HTS must be defined carefully within established operating conditions and tolerances. Validation is essential in the development and implementation of microbial forensics methods used for formulating investigative leads attribution. HTS strategies vary, requiring guiding principles for HTS system validation. Three initial aspects of HTS, irrespective of chemistry, instrumentation or software are: 1) sample preparation, 2) sequencing, and 3) data analysis. Criteria that should be considered for HTS validation for microbial forensics are presented here. Validation should be defined in terms of specific application and the criteria described here comprise a foundation for investigators to establish, validate and implement HTS as a tool in microbial forensics, enhancing public safety and national security.
Training the next generation analyst using red cell analytics
NASA Astrophysics Data System (ADS)
Graham, Meghan N.; Graham, Jacob L.
2016-05-01
We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University has been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA lies not always in the solution itself but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the area of security and risk and intelligence training.
SERS-based application in food analytics (Conference Presentation)
NASA Astrophysics Data System (ADS)
Cialla-May, Dana; Radu, Andreea; Jahn, Martin; Weber, Karina; Popp, Jürgen
2017-02-01
To establish detection schemes for life science applications, specific and sensitive methods allowing fast detection times are required. Due to the interaction of molecules with the strong electromagnetic fields excited at metallic nanostructures, the fingerprint-specific Raman spectrum of a molecule is enhanced by several orders of magnitude. This effect is known as surface-enhanced Raman spectroscopy (SERS) and has become a very powerful analytical tool in many fields of application. In this presentation, we introduce innovative bottom-up strategies to prepare SERS-active nanostructures coated with a lipophilic sensor layer. As a demonstration, the food colorant Sudan III, an indirectly carcinogenic substance found in chili powder, palm oil and spice mixtures, is detected quantitatively against the background of the competitor riboflavin as well as in paprika powder extracts. The SERS-based detection of azorubine (E122) in commercially available beverages of varying complexity (e.g. sugar content, alcohol concentration) illustrates the strong potential of SERS as a qualitative as well as semiquantitative prescreening method in food analytics. Here, good agreement is found between the concentrations estimated by SERS and by the gold-standard technique HPLC, a highly laborious method. Finally, SERS is applied to detect vitamins B2 and B12 in cereals as well as to estimate the ratio of lycopene and β-carotene in tomatoes. Acknowledgement: Funding of the projects "QuantiSERS" and "Jenaer Biochip Initiative 2.0" within the framework "InnoProfile Transfer - Unternehmen Region" by the Federal Ministry of Education and Research, Germany (BMBF) is gratefully acknowledged.
A Review of Numerical Simulation and Analytical Modeling for Medical Devices Safety in MRI
Kabil, J.; Belguerras, L.; Trattnig, S.; Pasquier, C.; Missoffe, A.
2016-01-01
Objectives To review past and present challenges and ongoing trends in numerical simulation for MRI (Magnetic Resonance Imaging) safety evaluation of medical devices. Methods A wide literature review on numerical and analytical simulation of simple and complex medical devices in MRI electromagnetic fields shows the evolution of the field over time and a growing concern for MRI safety over the years. Major issues and achievements are described, as well as current trends and perspectives in this research field. Results Numerical simulation of medical devices is constantly evolving, supported by calculation methods that are now well established. Implants with simple geometry can often be simulated in a computational human model, but one issue remaining today is the experimental validation of these human models. A great concern is to assess RF heating of implants too complex to be simulated traditionally, such as pacemaker leads. Ongoing research therefore focuses on alternative hybrid methods, both numerical and experimental, for example the transfer function method. For the static and gradient fields, analytical models can be used for dimensioning simple implant shapes, but they are limited for complex geometries that cannot be studied with simplifying assumptions. Conclusions Numerical simulation is an essential tool for MRI safety testing of medical devices. The main issues remain the accuracy of simulations compared to real life and the study of complex devices; but as the research field is constantly evolving, some promising ideas are now under investigation to take up these challenges. PMID:27830244
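The transfer-function idea mentioned above can be sketched numerically: the RF-induced heating at the tip of an implanted lead scales with the squared modulus of the tangential electric field along the lead path, weighted by the lead's measured transfer function S(z). Both S(z) and the field profile below are invented placeholders.

```python
# Transfer-function heating metric: |integral of S(z) * E_tan(z) dz|^2.
# S(z) and E_tan(z) are hypothetical; in practice S is measured on a
# bench and E_tan comes from a full-body electromagnetic simulation.
import numpy as np

z = np.linspace(0.0, 0.6, 601)                     # position along lead [m]
S = np.exp(-z / 0.2) * np.exp(1j * 2 * np.pi * z)  # hypothetical transfer fn
E_tan = 100.0 * np.cos(np.pi * z / 0.6)            # hypothetical field [V/m]

heating_metric = np.abs(np.trapz(S * E_tan, z)) ** 2
print(f"relative tip-heating metric: {heating_metric:.3g}")
```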
Gómez Rioja, Rubén; Martínez Espartosa, Débora; Segovia, Marta; Ibarz, Mercedes; Llopis, María Antonia; Bauça, Josep Miquel; Marzana, Itziar; Barba, Nuria; Ventura, Monserrat; García Del Pino, Isabel; Puente, Juan José; Caballero, Andrea; Gómez, Carolina; García Álvarez, Ana; Alsina, María Jesús; Álvarez, Virtudes
2018-05-05
The stability limit of an analyte in a biological sample can be defined as the time required until a measured property acquires a bias higher than a defined specification. Many studies assessing stability and presenting recommendations of stability limits are available, but differences among them are frequent. The aim of this study was to classify and grade a set of bibliographic studies on the stability of five common blood measurands and subsequently generate a consensus stability function. First, a bibliographic search was made for stability studies for five analytes in blood: alanine aminotransferase (ALT), glucose, phosphorus, potassium and prostate specific antigen (PSA). The quality of every study was evaluated using an in-house grading tool. Second, the different conditions of stability were uniformly defined, and the percent deviations (PD%) over time for each analyte and condition were plotted together, unifying studies with similar conditions. From the 37 articles considered valid, 130 experiments were evaluated and 629 PD% data points were included (106 for ALT, 180 for glucose, 113 for phosphorus, 145 for potassium and 85 for PSA). Consensus stability equations were established for glucose, potassium, phosphorus and PSA, but not for ALT. Time is the main variable affecting stability in medical laboratory samples. Bibliographic studies differ in their recommendations of stability limits mainly because of different specifications for maximum allowable error. Definition of a consensus stability function under specific conditions can help laboratories define stability limits using their own quality specifications.
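The stability-limit idea can be illustrated by fitting pooled PD% values against storage time and solving for the time at which the fit crosses a laboratory's own allowable bias. The data points and the linear model below are invented for illustration; the study's consensus functions were derived from 629 pooled literature values.

```python
# Fit PD% against time and solve for the stability limit, i.e. the time
# at which the fitted deviation crosses the allowable-bias specification.
import numpy as np

t_hours = np.array([0, 2, 4, 8, 12, 24], dtype=float)
pd_pct = np.array([0.0, -0.7, -1.1, -2.4, -3.3, -6.8])  # invented decay data

slope, intercept = np.polyfit(t_hours, pd_pct, 1)       # linear PD%(t) model
allowable_bias = -2.2                                   # lab's own spec [%]
t_limit = (allowable_bias - intercept) / slope
print(f"stability limit ~ {t_limit:.1f} h for a {allowable_bias}% bias spec")
```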
Mining patterns in persistent surveillance systems with smart query and visual analytics
NASA Astrophysics Data System (ADS)
Habibi, Mohammad S.; Shirkhodaie, Amir
2013-05-01
In Persistent Surveillance Systems (PSS), the ability to detect and characterize events geospatially helps analysts take pre-emptive steps to counter an adversary's actions. An interactive Visual Analytics (VA) model offers a platform for pattern investigation and reasoning to comprehend and/or predict such occurrences. Identifying and offsetting these threats requires collecting information from diverse sources, which brings with it increasingly abstract data. These abstract semantic data have a degree of inherent uncertainty and imprecision, and require a method for their filtration before being processed further. In this paper, we introduce an approach based on the Vector Space Modeling (VSM) technique for classification of spatiotemporal sequential patterns of group activities. The feature vectors consist of an array of attributes extracted from semantic annotated messages generated by the sensors. To facilitate proper similarity matching and detection of time-varying spatiotemporal patterns, a Temporal Dynamic Time Warping (DTW) method with a Gaussian Mixture Model (GMM) fitted by Expectation Maximization (EM) is introduced. DTW is intended for detection of event patterns from neighborhood-proximity semantic frames derived from an established ontology. GMM with EM, on the other hand, is employed as a Bayesian probabilistic model to estimate the probability of events associated with a detected spatiotemporal pattern. We also present a new visual analytic tool for testing and evaluating group activities detected under this control scheme. Experimental results demonstrate the effectiveness of the proposed approach for discovery and matching of subsequences within the sequentially generated pattern space of our experiments.
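The classic dynamic-programming DTW distance underlying such sequence matching is sketched below; the paper's Temporal-DTW variant and the GMM/EM probability stage are not reproduced here.

```python
# Classic DTW distance between two 1-D feature sequences via dynamic
# programming: D[i, j] is the best alignment cost of the prefixes a[:i]
# and b[:j], allowing stretching of either sequence in time.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

print(dtw_distance([0, 1, 2, 3, 2], [0, 1, 1, 2, 3, 3, 2]))
```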
Mixed Initiative Visual Analytics Using Task-Driven Recommendations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Kristin A.; Cramer, Nicholas O.; Israel, David
2015-12-07
Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.
NASA Technical Reports Server (NTRS)
Kempler, Steve; Mathews, Tiffany
2016-01-01
The continuum of ever-evolving data management systems affords great opportunities for the enhancement of knowledge and the facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Because little has been published on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available, and those still needed, to support ESDA.
Advancements in nano-enabled therapeutics for neuroHIV management.
Kaushik, Ajeet; Jayant, Rahul Dev; Nair, Madhavan
This viewpoint is a global call to promote fundamental and applied research aimed at designing smart nanocarriers with desired properties, novel noninvasive strategies to open the blood-brain barrier (BBB), delivery/release of single or multiple therapeutic agents across the BBB to eradicate neuroHIV (human immunodeficiency virus, HIV, infection of the nervous system), strategies for on-demand site-specific release of antiretroviral therapy, novel nanoformulations capable of recognizing and eradicating latently infected HIV reservoirs, and novel smart analytical diagnostic tools to detect and monitor HIV infection. Investigation of novel nanoformulations, methodologies for site-specific delivery/release, analytical methods, and diagnostic tools would thus be of high significance for eradicating and monitoring neuroAIDS (neuro-acquired immunodeficiency syndrome). Overall, these developments will certainly help to develop personalized nanomedicines to cure HIV and smart HIV-monitoring analytical systems for disease management.
Physics of cosmological cascades and observable properties
NASA Astrophysics Data System (ADS)
Fitoussi, T.; Belmont, R.; Malzac, J.; Marcowith, A.; Cohen-Tanugi, J.; Jean, P.
2017-04-01
TeV photons from extragalactic sources are absorbed in the intergalactic medium and initiate electromagnetic cascades. These cascades offer a unique tool to probe the properties of the universe at cosmological scales. We present a new Monte Carlo code dedicated to the physics of such cascades. This code has been tested against both published results and analytical approximations, and is made publicly available. Using this numerical tool, we investigate the main cascade properties (spectrum, halo extension and time delays), and study in detail their dependence on the physical parameters (extragalactic magnetic field, extragalactic background light, source redshift, source spectrum and beaming emission). The limitations of analytical solutions are emphasized. In particular, analytical approximations account only for the first generation of photons and higher branches of the cascade tree are neglected.
Deployment of Analytics into the Healthcare Safety Net: Lessons Learned.
Hartzband, David; Jacobs, Feygele
2016-01-01
As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation's largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and, in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis deploying the analytic stack was carried out using the common definitions established for the Uniform Data System (UDS) by the Health Resources and Services Administration. In addition, interviews with health center leadership and staff were completed to understand the context for the findings. The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data. At a population level, apparent underreporting of a number of diagnoses, specifically obesity and heart disease, was also evident in the results of the data quality exercise, for both the EHR-derived and stack analytic results. Data awareness, that is, an appreciation of the importance of data integrity, data hygiene and the potential uses of data, needs to be prioritized and developed by health centers and other healthcare organizations if analytics are to be used in an effective manner to support strategic objectives. While this analysis was conducted exclusively with community health center organizations, its conclusions and recommendations may be more broadly applicable.
Analytical Approaches to Verify Food Integrity: Needs and Challenges.
Stadler, Richard H; Tran, Lien-Anh; Cavin, Christophe; Zbinden, Pascal; Konings, Erik J M
2016-09-01
A brief overview of the main analytical approaches and practices to determine food authenticity is presented, addressing as well the food supply chain and future requirements to more effectively mitigate food fraud. Food companies are introducing procedures and mechanisms that allow them to identify vulnerabilities in their food supply chain under the umbrella of a food fraud prevention management system. A key step and first line of defense is thorough supply chain mapping and full transparency, assessing the likelihood of fraudsters penetrating the chain at any point. More vulnerable chains, such as those where ingredients and/or raw materials are purchased through traders or auctions, may require a higher degree of sampling, testing, and surveillance. Access to analytical tools is therefore pivotal, requiring continuous development and, possibly, greater sophistication in identifying chemical markers, data acquisition, and modeling. Significant progress in portable technologies is already evident today, for instance in the rapid testing now available at the agricultural level. In the near future, consumers may also have the ability to scan products in stores or at home to authenticate labels and food content. For food manufacturers, targeted analytical methods complemented by untargeted approaches are end control measures at the factory gate when the material is delivered. In essence, testing for food adulterants is an integral part of routine QC, ideally tailored to the risks in the individual markets and/or geographies or supply chains. The development of analytical methods is a first step in verifying the compliance and authenticity of food materials. A next, more challenging step is the successful establishment of global consensus reference methods as exemplified by the AOAC Stakeholder Panel on Infant Formula and Adult Nutritionals initiative, which can serve as an approach that could also be applied to methods for contaminants and adulterants in food. The food industry has taken these many challenges on board, working closely with all stakeholders and continuously communicating on progress in a fully transparent manner.
Aronsson, T; Bjørnstad, P; Leskinen, E; Uldall, A; de Verdier, C H
1984-01-01
The aim of this investigation was primarily to assess analytical quality expressed as between-laboratory, within-laboratory, and total imprecision, not in order to detect laboratories with poor performance, but in the positive sense to provide data for improving critical steps in analytical methodology. The aim was also to establish the present state of the art in comparison with earlier investigations to see if improvement in analytical quality could be observed.
Safety and Suitability for Service Assessment Testing for Surface and Underwater Launched Munitions
2014-12-05
test efficiency that tend to associate the Analytical S3 Test Approach with large, complex munition systems and the Empirical S3 Test Approach with...the smaller, less complex munition systems. 8.1 ANALYTICAL S3 TEST APPROACH. The Analytical S3 test approach, as shown in Figure 3, evaluates...assets than the Analytical S3 Test approach to establish the safety margin of the system. This approach is generally applicable to small munitions
NASA Astrophysics Data System (ADS)
Nakadate, Hiromichi; Sekizuka, Eiichi; Minamitani, Haruyuki
We aimed to study the validity of a new analytical approach that reflects the phase from platelet activation to the formation of small platelet aggregates. We hoped that this new approach would enable us to use the particle-counting method with laser-light scattering to measure platelet aggregation in healthy controls and in diabetic patients without complications. We measured agonist-induced platelet aggregation for 10 min. Agonist was added to the platelet-rich plasma 1 min after measurement started. We compared the total scattered light intensity from small aggregates over a 10-min period (established analytical approach) and that over a 2-min period from 1 to 3 min after measurement started (new analytical approach). Consequently, platelet aggregation in diabetics with HbA1c ≥ 6.5% was significantly greater than in healthy controls by both analytical approaches. However, platelet aggregation in diabetics with HbA1c < 6.5%, i.e., patients in the early stages of diabetes, was significantly greater than in healthy controls only by the new analytical approach, not by the established analytical approach. These results suggest that platelet aggregation as detected by the particle-counting method using laser-light scattering could be applied in clinical examinations through our new analytical approach.
We describe the development and implementation of a Physiological and Anatomical Visual Analytics tool (PAVA), a web browser-based application, used to visualize experimental/simulated chemical time-course data (dosimetry), epidemiological data and Physiologically-Annotated Data ...
Werber, D; Bernard, H
2014-02-27
Thousands of infectious food-borne disease outbreaks (FBDO) are reported annually to the European Food Safety Authority within the framework of the zoonoses Directive (2003/99/EC). Most recognised FBDO occur locally following point source exposure, but only a few are investigated using analytical epidemiological studies. In Germany, and probably also in other countries of the European Union, this seems to be particularly true for those investigated by local health authorities. Analytical studies, usually cohort studies or case–control studies, are a powerful tool for identifying suspect food vehicles. Therefore, from a public health and food safety perspective, their more frequent use is highly desirable. We have developed a small toolbox consisting of a strategic concept and a simple software tool for data entry and analysis, with the objective of increasing the use of analytical studies in the investigation of local point source FBDO in Germany.
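The analytical studies advocated here reduce, at their core, to 2x2-table arithmetic on exposure and illness counts. As a minimal sketch of what such a data entry and analysis tool computes (the function and the example counts are illustrative, not taken from the German toolbox):

```python
import math

def risk_ratio_and_odds_ratio(exp_ill, exp_well, unexp_ill, unexp_well):
    """Point estimates for a cohort study (risk ratio) and a case-control
    study (odds ratio) of one suspect food item, from a 2x2 table of
    exposure vs. illness counts."""
    risk_exposed = exp_ill / (exp_ill + exp_well)
    risk_unexposed = unexp_ill / (unexp_ill + unexp_well)
    rr = risk_exposed / risk_unexposed
    odds_ratio = (exp_ill * unexp_well) / (exp_well * unexp_ill)
    # 95% CI via the standard error of the log odds ratio
    se = math.sqrt(1/exp_ill + 1/exp_well + 1/unexp_ill + 1/unexp_well)
    ci = (math.exp(math.log(odds_ratio) - 1.96 * se),
          math.exp(math.log(odds_ratio) + 1.96 * se))
    return rr, odds_ratio, ci

# Invented example: 30 of 40 ill guests ate the suspect dish; 10 of 50
# well guests did. Yields RR = 3.75, OR = 12.
print(risk_ratio_and_odds_ratio(30, 10, 10, 40))
```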
Electrochemical Enzyme Biosensors Revisited: Old Solutions for New Problems.
Monteiro, Tiago; Almeida, Maria Gabriela
2018-05-14
Worldwide legislation is driving the development of novel and highly efficient analytical tools for assessing the composition of every material that interacts with Consumers or Nature. Biosensor technology is one of the most active R&D domains of the analytical sciences, focused on the challenge of taking analytical chemistry to the field. Electrochemical biosensors based on redox enzymes, in particular, are highly appealing due to their typically quick response, high selectivity and sensitivity, low cost and portable dimensions. This review paper aims to provide an overview of the most important advances made in the field since the proposal of the first biosensor, the well-known hand-held glucose meter. The first section addresses the current needs and challenges for novel analytical tools, followed by a brief description of the different components and configurations of biosensing devices, and the fundamentals of enzyme kinetics and amperometry. The following sections focus on enzyme-based amperometric biosensors and the different stages of their development.
Multivariable Hermite polynomials and phase-space dynamics
NASA Technical Reports Server (NTRS)
Dattoli, G.; Torre, Amalia; Lorenzutta, S.; Maino, G.; Chiccoli, C.
1994-01-01
The phase-space approach to classical and quantum systems demands advanced analytical tools. Such an approach characterizes the evolution of a physical system through a set of variables, reducing to the canonically conjugate variables in the classical limit. It often happens that phase-space distributions can be written in terms of quadratic forms involving the above-quoted variables. A significant analytical tool for treating these problems may come from the generalized many-variable Hermite polynomials, defined on quadratic forms in R(exp n). They form an orthonormal system in many dimensions and appear to be the natural tool for treating harmonic-oscillator dynamics in phase space. In this contribution we discuss the properties of these polynomials and present some applications to physical problems.
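For orientation, the simplest members of this family are the two-variable Hermite (Kampé de Fériet) polynomials; the definition and generating function below are quoted from the standard literature rather than from the abstract itself:

```latex
H_n(x,y) = n! \sum_{k=0}^{\lfloor n/2 \rfloor} \frac{x^{n-2k}\, y^{k}}{(n-2k)!\, k!},
\qquad
\sum_{n=0}^{\infty} H_n(x,y)\, \frac{t^n}{n!} = e^{x t + y t^2}.
```

Setting y = -1 and rescaling x to 2x recovers the ordinary (physicists') Hermite polynomials, i.e., H_n(2x, -1) reproduces H_n(x).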
2016-04-01
Disclaimer: The views expressed in this academic research paper are those of the author and do not reflect the official policy or position of the US...Figure 2: Proposed MAT Rating Badges...establishing unit level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and
Kumar, B. Vinodh; Mohan, Thuthi
2018-01-01
OBJECTIVE: Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that needs to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are IQC - coefficient of variation percentage and External Quality Assurance Scheme (EQAS) - Bias% for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level, five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level and for level 2 IQCs, same four analytes of level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes <6 sigma level, the quality goal index (QGI) was <0.8 indicating the area requiring improvement to be imprecision except cholesterol whose QGI >1.2 indicated inaccuracy. CONCLUSION: This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587
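The sigma metric and quality goal index (QGI) used in this abstract are closed-form quantities with standard definitions (sigma = (TEa - |bias|)/CV and QGI = bias/(1.5 x CV)). A minimal sketch; the example numbers are hypothetical, not the study's data:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (TEa - |bias|) / CV, with all inputs in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct, cv_pct):
    """QGI = bias / (1.5 * CV); per the thresholds cited in the abstract,
    <0.8 flags an imprecision problem and >1.2 an inaccuracy problem."""
    return bias_pct / (1.5 * cv_pct)

# Hypothetical analyte: allowable total error 10%, bias 2%, CV 1.2%.
print(sigma_metric(10, 2, 1.2))       # ~6.7 sigma
print(quality_goal_index(2, 1.2))     # ~1.1, borderline accuracy issue
```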
Carr, Steven A; Abbatiello, Susan E; Ackermann, Bradley L; Borchers, Christoph; Domon, Bruno; Deutsch, Eric W; Grant, Russell P; Hoofnagle, Andrew N; Hüttenhain, Ruth; Koomen, John M; Liebler, Daniel C; Liu, Tao; MacLean, Brendan; Mani, D R; Mansfield, Elizabeth; Neubert, Hendrik; Paulovich, Amanda G; Reiter, Lukas; Vitek, Olga; Aebersold, Ruedi; Anderson, Leigh; Bethem, Robert; Blonder, Josip; Boja, Emily; Botelho, Julianne; Boyne, Michael; Bradshaw, Ralph A; Burlingame, Alma L; Chan, Daniel; Keshishian, Hasmik; Kuhn, Eric; Kinsinger, Christopher; Lee, Jerry S H; Lee, Sang-Won; Moritz, Robert; Oses-Prieto, Juan; Rifai, Nader; Ritchie, James; Rodriguez, Henry; Srinivas, Pothur R; Townsend, R Reid; Van Eyk, Jennifer; Whiteley, Gordon; Wiita, Arun; Weintraub, Susan
2014-03-01
Adoption of targeted mass spectrometry (MS) approaches such as multiple reaction monitoring (MRM) to study biological and biomedical questions is well underway in the proteomics community. Successful application depends on the ability to generate reliable assays that uniquely and confidently identify target peptides in a sample. Unfortunately, there is a wide range of criteria being applied to say that an assay has been successfully developed. There is no consensus on what criteria are acceptable and little understanding of the impact of variable criteria on the quality of the results generated. Publications describing targeted MS assays for peptides frequently do not contain sufficient information for readers to establish confidence that the tests work as intended or to be able to apply the tests described in their own labs. Guidance must be developed so that targeted MS assays with established performance can be made widely distributed and applied by many labs worldwide. To begin to address the problems and their solutions, a workshop was held at the National Institutes of Health with representatives from the multiple communities developing and employing targeted MS assays. Participants discussed the analytical goals of their experiments and the experimental evidence needed to establish that the assays they develop work as intended and are achieving the required levels of performance. Using this "fit-for-purpose" approach, the group defined three tiers of assays distinguished by their performance and extent of analytical characterization. Computational and statistical tools useful for the analysis of targeted MS results were described. Participants also detailed the information that authors need to provide in their manuscripts to enable reviewers and readers to clearly understand what procedures were performed and to evaluate the reliability of the peptide or protein quantification measurements reported. This paper presents a summary of the meeting and recommendations.
Code of Federal Regulations, 2010 CFR
2010-04-01
... TREASURY LIQUORS BEER Pilot Brewing Plants § 25.271 General. (a) Establishment. A person may establish and operate a pilot brewing plant off the brewery premises for research, analytical, experimental, or developmental purposes relating to beer or brewery operations. Pilot brewing plants will be established as...
Code of Federal Regulations, 2011 CFR
2011-04-01
... TREASURY LIQUORS BEER Pilot Brewing Plants § 25.271 General. (a) Establishment. A person may establish and operate a pilot brewing plant off the brewery premises for research, analytical, experimental, or developmental purposes relating to beer or brewery operations. Pilot brewing plants will be established as...
Tools for Educational Data Mining: A Review
ERIC Educational Resources Information Center
Slater, Stefan; Joksimovic, Srecko; Kovanovic, Vitomir; Baker, Ryan S.; Gasevic, Dragan
2017-01-01
In recent years, a wide array of tools have emerged for the purposes of conducting educational data mining (EDM) and/or learning analytics (LA) research. In this article, we hope to highlight some of the most widely used, most accessible, and most powerful tools available for the researcher interested in conducting EDM/LA research. We will…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Tengfang; Flapper, Joris; Ke, Jing
The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.
Predictive Data Tools Find Uses in Schools
ERIC Educational Resources Information Center
Sparks, Sarah D.
2011-01-01
The use of analytic tools to predict student performance is exploding in higher education, and experts say the tools show even more promise for K-12 schools, in everything from teacher placement to dropout prevention. Use of such statistical techniques is hindered in precollegiate schools, however, by a lack of researchers trained to help…
A consumer guide: tools to manage vegetation and fuels.
David L. Peterson; Louisa Evers; Rebecca A. Gravenmier; Ellen Eberhardt
2007-01-01
Current efforts to improve the scientific basis for fire management on public lands will benefit from more efficient transfer of technical information and tools that support planning, implementation, and effectiveness of vegetation and hazardous fuel treatments. The technical scope, complexity, and relevant spatial scale of analytical and decision support tools differ...
A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.
Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy
2016-12-01
Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling, and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements was identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and the management of radiology resources.
Tague, Lauren; Wiggs, Justin; Li, Qianxi; McCarter, Robert; Sherwin, Elizabeth; Weinberg, Jacqueline; Sable, Craig
2018-05-17
Left ventricular hypertrophy (LVH) is a common finding on pediatric electrocardiography (ECG), leading to many referrals for echocardiography (echo). This study utilizes a novel analytics tool that combines ECG and echo databases to evaluate ECG as a screening tool for LVH. A SQL Server 2012 data warehouse incorporated the ECG and echo databases for all patients from a single institution from 2006 to 2016. Customized queries identified patients 0-18 years old with LVH on ECG and an echo performed within 24 h. Using data visualization (Tableau) and analytic (Stata 14) software, ECG and echo findings were compared. Of 437,699 encounters, 4637 met inclusion criteria. ECG had high sensitivity (≥ 90%) but poor specificity (43%), and low positive predictive value (< 20%) for echo abnormalities. ECG performed only 11-22% better than chance (AROC = 0.50). 83% of subjects with LVH on ECG had normal left ventricle (LV) structure and size on echo. African-Americans with LVH were least likely to have an abnormal echo. There was a low correlation between V6R on ECG and the echo-derived Z score of LV diastolic diameter (r = 0.14) and LV mass index (r = 0.24). The data analytics client was able to mine a database of ECG and echo reports, comparing LVH by ECG with LV measurements and qualitative findings by echo, and identifying an abnormal LV by echo in only 17% of cases with LVH on ECG. This novel tool is useful for rapid data mining for both clinical and research endeavors.
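The screening statistics quoted above follow directly from confusion-matrix counts. A minimal sketch; the counts are invented to mirror the reported ranges, not the study's actual data:

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from the four cells of a
    screening-test confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Invented counts: many ECG-positive screens, few confirmed abnormal echoes.
print(screening_metrics(tp=700, fp=3400, fn=80, tn=2600))
# -> sensitivity ~0.90, specificity ~0.43, PPV ~0.17
```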
Recent Methodology in Ginseng Analysis
Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill
2012-01-01
Matching its popularity in herbal prescriptions and remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to hereafter as ginseng analysis, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis over the past half-decade, including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112
Comprehensive characterizations of nanoparticle biodistribution following systemic injection in mice
NASA Astrophysics Data System (ADS)
Liao, Wei-Yin; Li, Hui-Jing; Chang, Ming-Yao; Tang, Alan C. L.; Hoffman, Allan S.; Hsieh, Patrick C. H.
2013-10-01
Various nanoparticle (NP) properties such as shape and surface charge have been studied in an attempt to enhance the efficacy of NPs in biomedical applications. When trying to determine the precise biodistribution of NPs within the target organs, the analytical method becomes the determining factor in measuring the precise quantity of distributed NPs. High performance liquid chromatography (HPLC) represents a more powerful tool in quantifying NP biodistribution compared to conventional analytical methods such as an in vivo imaging system (IVIS). This, in part, is due to better curve linearity offered by HPLC than IVIS. Furthermore, HPLC enables us to fully analyze each gram of NPs present in the organs without compromising the signals and the depth-related sensitivity as is the case in IVIS measurements. In addition, we found that changing physiological conditions improved large NP (200-500 nm) distribution in brain tissue. These results reveal the importance of selecting analytic tools and the physiological environment when characterizing NP biodistribution for future nanoscale toxicology, therapeutics and diagnostics.
ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.
Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa
2016-05-01
The automated analysis of indirect immunofluorescence images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of the intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. Automation of the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the workflow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, which categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, which splits the input slide into individual HEp-2 cells; (iii) a Pattern Classifier module, which determines the fluorescent pattern of the slide based on the patterns of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracies of fluorescent intensity and pattern classification of around 85% and above 90%, respectively. We assessed all results by comparison with some of the most representative state-of-the-art works. Unlike most other works in the recent literature, ANAlyte aims at automating all the major steps of ANA image analysis. Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of intensity and fluorescent pattern with accuracy better than or comparable to state-of-the-art techniques, even when such techniques are run on manually segmented cells. Hence, ANAlyte can be proposed as a valid solution to the problem of ANA testing automation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
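One plausible reading of the cell-to-slide aggregation in the Pattern Classifier module is a majority vote over per-cell labels; the abstract does not spell out the exact rule, so the sketch below is an assumption rather than the tool's actual implementation:

```python
from collections import Counter

def slide_pattern(cell_patterns):
    """Slide-level fluorescent pattern as the majority vote over the
    pattern labels assigned to the individually segmented HEp-2 cells."""
    return Counter(cell_patterns).most_common(1)[0][0]

print(slide_pattern(["speckled", "speckled", "homogeneous", "speckled"]))
# -> "speckled"
```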
Constraint-Referenced Analytics of Algebra Learning
ERIC Educational Resources Information Center
Sutherland, Scot M.; White, Tobin F.
2016-01-01
The development of the constraint-referenced analytics tool for monitoring algebra learning activities presented here came from the desire to firstly, take a more quantitative look at student responses in collaborative algebra activities, and secondly, to situate those activities in a more traditional introductory algebra setting focusing on…
Towards an Analytic Foundation for Network Architecture
2010-12-31
In this project, we develop the analytic tools of stochastic optimization for wireless network design and apply them...and Mung Chiang, “DaVinci: Dynamically Adaptive Virtual Networks for a Customized Internet,” in Proc. ACM SIGCOMM CoNext Conference, December 2008
University Macro Analytic Simulation Model.
ERIC Educational Resources Information Center
Baron, Robert; Gulko, Warren
The University Macro Analytic Simulation System (UMASS) has been designed as a forecasting tool to help university administrators make budgeting decisions. Alternative budgeting strategies can be tested on a computer model, and an operational alternative can then be selected on the basis of the most desirable projected outcome. UMASS uses readily…
Entanglement spectrum degeneracy and the Cardy formula in 1+1 dimensional conformal field theories
NASA Astrophysics Data System (ADS)
Alba, Vincenzo; Calabrese, Pasquale; Tonni, Erik
2018-01-01
We investigate the effect of a global degeneracy in the distribution of the entanglement spectrum in conformal field theories in one spatial dimension. We relate the recently found universal expression for the entanglement Hamiltonian to the distribution of the entanglement spectrum. The main tool to establish this connection is the Cardy formula. It turns out that the Affleck-Ludwig non-integer degeneracy, appearing because of the boundary conditions induced at the entangling surface, can be directly read from the entanglement spectrum distribution. We also clarify the effect of the non-integer degeneracy on the spectrum of the partial transpose, which is the central object for quantifying the entanglement in mixed states. We show that the exact knowledge of the entanglement spectrum in some integrable spin chains provides strong analytical evidence corroborating our results.
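For background, the non-degenerate 1+1d CFT result that this analysis builds on is the well-known distribution of the entanglement spectrum (quoted here from the standard literature, not from the abstract): the mean number of eigenvalues larger than a given value is

```latex
n(\lambda) = I_0\!\left( 2\sqrt{\, b \, \ln(\lambda_{\max}/\lambda) \,} \right),
\qquad b \equiv -\ln \lambda_{\max},
```

with I_0 a modified Bessel function of the first kind. The paper's point is that a global degeneracy, including the non-integer Affleck-Ludwig contribution, deforms this distribution in a way that can be read off directly.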
Assessment of Trading Partners for China's Rare Earth Exports Using a Decision Analytic Approach
He, Chunyan; Lei, Yalin; Ge, Jianping
2014-01-01
Chinese rare earth export policies currently result in accelerating depletion of these resources. Thus adopting an optimal export trade selection strategy is crucial to determining and ultimately identifying the ideal trading partners. This paper introduces a multi-attribute decision-making methodology which is then used to select the optimal trading partner. In the method, an evaluation criteria system is established to assess the seven top trading partners along three dimensions: political relationships, economic benefits and industrial security. Specifically, a simple additive weighting model derived from an additive utility function is utilized to calculate, rank and select alternatives. Results show that Japan would be the optimal trading partner for Chinese rare earths. The criteria evaluation method of trading partners for China's rare earth exports provides the Chinese government with a tool to enhance rare earth industrial policies. PMID:25051534
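Simple additive weighting amounts to a weighted sum of normalized criterion scores per alternative. A minimal sketch; the weights and scores below are invented for illustration and are not the paper's evaluation data:

```python
def saw_rank(scores, weights):
    """Simple additive weighting: weighted sum of (already normalized)
    criterion scores for each alternative, ranked highest first."""
    totals = {
        alternative: sum(w * s for w, s in zip(weights, criterion_scores))
        for alternative, criterion_scores in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical normalized scores on the paper's three dimensions:
# political relationship, economic benefit, industrial security.
weights = [0.3, 0.4, 0.3]
scores = {"Japan": [0.8, 0.9, 0.7], "USA": [0.6, 0.8, 0.6], "EU": [0.7, 0.7, 0.8]}
print(saw_rank(scores, weights))  # with these made-up inputs, Japan ranks first
```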
Palvannan, R Kannapiran; Teow, Kiok Liang
2012-04-01
Patient queues are prevalent in healthcare, and wait time is one measure of access to care. We illustrate queueing theory, an analytical tool that has provided many insights to service providers when designing new service systems and managing existing ones. This established theory helps us to quantify the appropriate service capacity to meet patient demand, balancing system utilization against the patient's wait time. It considers four key factors that affect the patient's wait time: average patient demand, average service rate, and the variation in both. We illustrate four basic insights that will be useful for managers and doctors who manage healthcare delivery systems at the hospital or department level. Two examples from local hospitals are shown in which we used queueing models to estimate service capacity and analyze the impact of capacity configurations, while accounting for the inherent variation in healthcare.
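For the simplest Markovian case, the capacity-versus-wait trade-off described here is captured by the Erlang C formula for an M/M/s queue. A minimal sketch, with a hypothetical clinic as the worked example rather than one of the paper's hospital cases:

```python
from math import factorial

def erlang_c_mean_wait(arrival_rate, service_rate, servers):
    """Mean wait in queue for an M/M/s system via the Erlang C formula;
    rates in customers per unit time, result in the same time unit."""
    a = arrival_rate / service_rate          # offered load
    rho = a / servers                        # utilization
    if rho >= 1:
        raise ValueError("unstable queue: utilization >= 1")
    queue_term = a**servers / (factorial(servers) * (1 - rho))
    norm = sum(a**k / factorial(k) for k in range(servers)) + queue_term
    prob_wait = queue_term / norm            # probability an arrival waits
    return prob_wait / (servers * service_rate - arrival_rate)

# Hypothetical clinic: 10 patients/hour, 15-minute consultations (4/hour),
# 3 doctors -> about 0.35 hours (21 minutes) mean wait at 83% utilization.
print(erlang_c_mean_wait(10, 4, 3))
```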
Kim, Unyong; Oh, Myung Jin; Seo, Youngsuk; Jeon, Yinae; Eom, Joon-Ho; An, Hyun Joo
2017-09-01
Glycosylation of recombinant human erythropoietins (rhEPOs) is significantly associated with the drug's quality and potency. Thus, comprehensive characterization of glycosylation is vital to assess biotherapeutic quality and establish the equivalency of biosimilar rhEPOs. However, current glycan analysis mainly focuses on the N-glycans because of the absence of analytical tools to liberate O-glycans with high sensitivity. We developed a selective and sensitive method to profile native O-glycans on rhEPOs. O-glycosylation on rhEPO, including O-acetylation on a sialic acid, was comprehensively characterized. Details such as O-glycan structure and the O-acetyl modification site were obtained from tandem MS. This method may be applied to QC and batch analysis not only of rhEPOs but also of other biotherapeutics bearing multiple O-glycosylations.
Where midwives are not yet recognised: a feasibility study of professional midwives in Nepal.
Bogren, Malin Upper; van Teijlingen, Edwin; Berg, Marie
2013-10-01
the professional midwife is a key person for promoting maternal and family health. Not all countries have yet reached the professional standard for midwives set by the International Confederation of Midwives (ICM), and Nepal is one of these countries. This study explores the feasibility of establishing a professional midwifery cadre in Nepal that meets the global standards of competencies, and defines a strategy to reach this. a mixed-methods study comprising (1) policy review, (2) interviews and (3) observations. An assessment tool was designed for data collection and analysis using variables from three sources: ICM's Global Standards, the skilled birth attendant programme in Nepal, and JHPIEGO's site assessment tool for maternal health and newborn programmes. Data were collected in a desk review of education and policy documents, interviews with stakeholders, and site assessments of five higher education institutions and their hospital-based maternity departments. The analysis resulted in a recommended strategy. six levels of education of nursing staff providing midwifery care were identified, all regulated under the Nepal Nursing Council. No legislation was in place authorising midwifery as an autonomous profession. A post-basic midwifery programme at first-cycle bachelor level was under development. A well-organised midwifery association consisting of nurses providing maternal health care was established. Four university colleges offering higher education for nurses and clinicians had the capability to run a midwifery programme, and the fifth had a genuine interest in starting a midwifery programme at bachelor level. The proposed strategy includes four strategic objectives and interventions in relation to four components identified by UNFPA: legislation and regulation; training and education; deployment and utilisation; and professional associations. the study has delivered a proposed strategy for the Government of Nepal for effective management of the midwifery workforce, in order to enhance midwives' contribution to maternity care and thus promote improved maternal and newborn health. The developed analytical framework could also be used as an assessment tool in other countries to establish professional midwifery cadres that meet the global standards of competencies. © 2013 Elsevier Ltd. All rights reserved.
Initiating an Online Reputation Monitoring System with Open Source Analytics Tools
NASA Astrophysics Data System (ADS)
Shuhud, Mohd Ilias M.; Alwi, Najwa Hayaati Md; Halim, Azni Haslizan Abd
2018-05-01
Online reputation is an invaluable asset for modern organizations, as it can support business performance, especially sales and profit. However, a reputation we are not aware of is difficult to maintain. Social media analytics provides online reputation monitoring in various ways, such as sentiment analysis, and numerous large-scale organizations have consequently implemented Online Reputation Monitoring (ORM) systems. Such a solution should not be exclusive to high-income organizations, as many organizations, regardless of size and type, are now online. This research proposes an affordable and reliable ORM system built from a combination of open source analytics tools, for both novice practitioners and academicians. We also evaluate its prediction accuracy: the system provides acceptable predictions (sixty percent accuracy), and its majority-polarity predictions tally with human annotation. The proposed system can support business decisions with flexible monitoring strategies, especially for organizations that want to initiate and administer ORM themselves at low cost.
NASA Astrophysics Data System (ADS)
Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.
We have developed a Quantitative Structure-Activity Relationship (QSAR)-based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationship (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximation (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using them to predict the sensor activities for test analytes not included in the training set used for model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing-materials evaluation and selection. It can also be used to predict the response of an existing sensing film to new target analytes.
3D FEM Simulation of Flank Wear in Turning
NASA Astrophysics Data System (ADS)
Attanasio, Aldo; Ceretti, Elisabetta; Giardini, Claudio
2011-05-01
This work deals with tool wear simulation. Studying the influence of tool wear on tool life, tool substitution policy, final part quality, surface integrity, cutting forces and power consumption is important for reducing global process costs. Adhesion, abrasion, erosion, diffusion, corrosion and fracture are some of the phenomena responsible for tool wear, depending on the selected cutting parameters: cutting velocity, feed rate, depth of cut, …. In some cases these wear mechanisms are described by analytical models as a function of process variables (temperature, pressure and sliding velocity along the cutting surface). Such analytical models are suitable for implementation in FEM codes and can be utilized to simulate tool wear. In the present paper a commercial 3D FEM software has been customized to simulate tool wear during turning operations when cutting AISI 1045 carbon steel with an uncoated tungsten carbide tip. The FEM software was extended with a subroutine able to modify the tool geometry on the basis of the estimated tool wear as the simulation proceeds. Since, for the considered tool-workpiece material pair, the main wear-generating phenomena are abrasion and diffusion, the tool wear model implemented in the subroutine was obtained as a combination of Usui's model and Takeyama and Murata's model. A comparison between experimental and simulated flank tool wear curves is reported, demonstrating that it is possible to simulate the development of tool wear.
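A sketch of how such a combined wear law looks in code. The functional forms below are the commonly cited Usui (stress- and sliding-velocity-driven) and Takeyama-Murata (diffusive, Arrhenius-type) rate equations; all constants are placeholders, not the calibrated values used in this paper:

```python
import math

def flank_wear_rate(sigma_n, v_s, T,
                    A=1e-8, B=5000.0, D=1e-3, E=75000.0, R=8.314):
    """Illustrative combined wear rate: a Usui-type adhesive/abrasive term
    plus a Takeyama-Murata-type diffusion term. sigma_n: normal stress on
    the flank, v_s: sliding velocity, T: interface temperature in kelvin."""
    usui = A * sigma_n * v_s * math.exp(-B / T)
    diffusion = D * math.exp(-E / (R * T))
    return usui + diffusion

# Placeholder operating point; returns a wear rate in arbitrary units.
print(flank_wear_rate(sigma_n=800.0, v_s=150.0, T=1100.0))
```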
Stream Lifetimes Against Planetary Encounters
NASA Technical Reports Server (NTRS)
Valsecchi, G. B.; Lega, E.; Froeschle, Cl.
2011-01-01
We study, both analytically and numerically, the perturbation induced by an encounter with a planet on a meteoroid stream. Our analytical tool is the extension of Öpik's theory of close encounters, which we apply to streams described by geocentric variables. The resulting formulae are used to compute the rate at which a stream is dispersed by planetary encounters into the sporadic background. We have verified the accuracy of the analytical model using a numerical test.
Poulos, Natalie S; Pasch, Keryn E
2015-07-01
Few studies of the food environment have collected primary data, and even fewer have reported reliability of the tool used. This study focused on the development of an innovative electronic data collection tool used to document outdoor food and beverage (FB) advertising and establishments near 43 middle and high schools in the Outdoor MEDIA Study. Tool development used GIS based mapping, an electronic data collection form on handheld devices, and an easily adaptable interface to efficiently collect primary data within the food environment. For the reliability study, two teams of data collectors documented all FB advertising and establishments within one half-mile of six middle schools. Inter-rater reliability was calculated overall and by advertisement or establishment category using percent agreement. A total of 824 advertisements (n=233), establishment advertisements (n=499), and establishments (n=92) were documented (range=8-229 per school). Overall inter-rater reliability of the developed tool ranged from 69-89% for advertisements and establishments. Results suggest that the developed tool is highly reliable and effective for documenting the outdoor FB environment. Copyright © 2015 Elsevier Ltd. All rights reserved.
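Percent agreement, the reliability statistic reported here, is simply the share of doubly coded items on which the two data collection teams matched. A minimal sketch with invented category codes:

```python
def percent_agreement(rater_a, rater_b):
    """Inter-rater reliability as simple percent agreement over items
    coded independently by two raters."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)

# Invented codes for ten advertisements documented by both field teams.
team1 = ["fast_food", "soda", "soda", "candy", "fast_food",
         "soda", "water", "candy", "soda", "fast_food"]
team2 = ["fast_food", "soda", "candy", "candy", "fast_food",
         "soda", "water", "soda", "soda", "fast_food"]
print(percent_agreement(team1, team2))  # 80.0
```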
Comparative analytics of infusion pump data across multiple hospital systems.
Catlin, Ann Christine; Malloy, William X; Arthur, Karen J; Gaston, Cindy; Young, James; Fernando, Sudheera; Fernando, Ruchith
2015-02-15
A Web-based analytics system for conducting inhouse evaluations and cross-facility comparisons of alert data generated by smart infusion pumps is described. The Infusion Pump Informatics (IPI) project, a collaborative effort led by research scientists at Purdue University, was launched in 2009 to provide advanced analytics and tools for workflow analyses to assist hospitals in determining the significance of smart-pump alerts and reducing nuisance alerts. The IPI system allows facility-specific analyses of alert patterns and trends, as well as cross-facility comparisons of alert data uploaded by more than 55 participating institutions using different types of smart pumps. Tools accessible through the IPI portal include (1) charts displaying aggregated or breakout data on the top drugs associated with alerts, numbers of alerts per device or care area, and override-to-alert ratios, (2) investigative reports that can be used to characterize and analyze pump-programming errors in a variety of ways (e.g., by drug, by infusion type, by time of day), and (3) "drill-down" workflow analytics enabling users to evaluate alert patterns—both internally and in relation to patterns at other hospitals—in a quick and efficient stepwise fashion. The formation of the IPI analytics system to support a community of hospitals has been successful in providing sophisticated tools for member facilities to review, investigate, and efficiently analyze smart-pump alert data, not only within a member facility but also across other member facilities, to further enhance smart pump drug library design. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
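As an illustration of the kind of breakout aggregation the portal charts, here is a minimal pandas sketch; the column names and rows are hypothetical, since the abstract does not describe the actual IPI schema:

```python
import pandas as pd

# Invented alert-log rows standing in for uploaded smart-pump data.
alerts = pd.DataFrame({
    "drug":      ["heparin", "heparin", "insulin", "insulin", "insulin"],
    "care_area": ["ICU", "ICU", "ICU", "Med/Surg", "Med/Surg"],
    "action":    ["override", "reprogram", "override", "override", "reprogram"],
})

# Alerts per drug and the override-to-alert ratio for each.
summary = alerts.groupby("drug")["action"].agg(
    alerts="size",
    overrides=lambda s: (s == "override").sum(),
)
summary["override_to_alert_ratio"] = summary["overrides"] / summary["alerts"]
print(summary.sort_values("alerts", ascending=False))
```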
Reference Intervals of Hematology and Clinical Chemistry Analytes for 1-Year-Old Korean Children.
Lee, Hye Ryun; Shin, Sue; Yoon, Jong Hyun; Roh, Eun Youn; Chang, Ju Young
2016-09-01
Reference intervals need to be established according to age. We established reference intervals of hematology and chemistry from community-based healthy 1-yr-old children and analyzed their iron status according to the feeding methods during the first six months after birth. A total of 887 children who received a medical check-up between 2010 and 2014 at Boramae Hospital (Seoul, Korea) were enrolled. A total of 534 children (247 boys and 287 girls) were enrolled as reference individuals after the exclusion of data obtained from children with suspected iron deficiency. Hematology and clinical chemistry analytes were measured, and the reference value of each analyte was estimated by using parametric (mean±2 SD) or nonparametric methods (2.5-97.5th percentile). Iron, total iron-binding capacity, and ferritin were measured, and transferrin saturation was calculated. As there were no differences in the mean values between boys and girls, we established the reference intervals for 1-yr-old children regardless of sex. The analysis of serum iron status according to feeding methods during the first six months revealed higher iron, ferritin, and transferrin saturation levels in children exclusively or mainly fed formula than in children exclusively or mainly fed breast milk. We established reference intervals of hematology and clinical chemistry analytes from community-based healthy children at one year of age. These reference intervals will be useful for interpreting results of medical check-ups at one year of age.
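The two estimation routes named in the abstract (parametric mean ± 2 SD versus the nonparametric 2.5th-97.5th percentile range) are straightforward to state in code. A minimal sketch on simulated values, not the study's measurements:

```python
import numpy as np

def reference_interval(values, parametric=True):
    """Reference interval as mean +/- 2 SD (parametric) or as the
    2.5th-97.5th percentile range (nonparametric)."""
    x = np.asarray(values, dtype=float)
    if parametric:
        return x.mean() - 2 * x.std(ddof=1), x.mean() + 2 * x.std(ddof=1)
    return tuple(np.percentile(x, [2.5, 97.5]))

# Simulated hemoglobin-like values (g/dL) with the study's sample size.
rng = np.random.default_rng(0)
values = rng.normal(12.0, 1.0, size=534)
print(reference_interval(values, parametric=True))
print(reference_interval(values, parametric=False))
```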
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to identify the steps in an analytical procedure with the greatest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points of the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, that technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to identify the method procedural steps with the greatest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
Factor analytic tools such as principal component analysis (PCA) and positive matrix factorization (PMF) suffer from rotational ambiguity in their results: different solutions (factors) provide equally good fits to the measured data. The PMF model imposes non-negativity of both...
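The truncated sentence above refers to the non-negativity constraint that distinguishes PMF from PCA. A rough stand-in for that constraint is scikit-learn's NMF, shown below on a random matrix; note that true PMF additionally weights residuals by per-observation measurement uncertainties, which plain NMF does not.

```python
# Non-negative factorization as a stand-in for PMF's constraint (illustrative only).
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
X = rng.random((100, 12))  # e.g. 100 samples x 12 measured species, all non-negative

model = NMF(n_components=3, init="nndsvd", max_iter=500, random_state=0)
W = model.fit_transform(X)  # factor contributions, constrained >= 0
H = model.components_       # factor profiles, constrained >= 0

# Rotational ambiguity in practice: other (W, H) pairs can reach nearly the
# same reconstruction error, which is the problem the abstract describes.
print(np.linalg.norm(X - W @ H))
```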
Microprocessor Based Temperature Control of Liquid Delivery with Flow Disturbances.
ERIC Educational Resources Information Center
Kaya, Azmi
1982-01-01
Discusses analytical design and experimental verification of a PID control valve for a temperature-controlled liquid delivery system, demonstrating that the analytical design techniques can be experimentally verified by using digital controls as a tool. Digital control instrumentation and implementation are also demonstrated and documented for…
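For readers who want the control law behind records like this one spelled out, here is a textbook discrete PID loop driving a toy first-order temperature model. The gains and plant constants are invented for illustration and have no connection to the paper's design.

```python
# Discrete PID temperature control of a toy first-order plant (all values assumed).
kp, ki, kd = 2.0, 0.5, 0.1          # illustrative PID gains
setpoint, temp = 50.0, 20.0         # target and initial liquid temperature (deg C)
dt, integral, prev_error = 1.0, 0.0, setpoint - temp

for _ in range(60):
    error = setpoint - temp
    integral += error * dt
    derivative = (error - prev_error) / dt
    u = kp * error + ki * integral + kd * derivative  # controller output
    prev_error = error
    # Toy plant: heater input warms the liquid, ambient losses cool it
    temp += dt * (0.05 * u - 0.02 * (temp - 20.0))

print(f"temperature after 60 steps: {temp:.1f}")
```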
A thermal biosensor based on enzyme reaction.
Zheng, Yi-Hua; Hua, Tse-Chao; Xu, Fei
2005-01-01
Application of the thermal biosensor as an analytical tool is promising due to advantages such as universality, simplicity, and quick response. A novel thermal biosensor based on enzyme reaction has been developed. This biosensor is a flow injection analysis system and consists of two channels with an enzyme reaction column and a reference column. The reference column, which is included to eliminate unspecific heat, is inactive toward the specific enzyme reaction of the ingredient to be detected. The specific enzyme reaction takes place in the enzyme reaction column at a constant temperature maintained by a thermoelectric thermostat. A thermal sensor based on a thermoelectric module containing 127 serial BiTe thermocouples is used to monitor the temperature difference between the two streams from the enzyme reaction column and the reference column. The analytical example of dichlorvos shows that this biosensor can be used as an analytical tool in medicine and biology.
Optical Drug Monitoring: Photoacoustic Imaging of Nanosensors to Monitor Therapeutic Lithium In Vivo
Cash, Kevin J.; Li, Chiye; Xia, Jun; Wang, Lihong V.; Clark, Heather A.
2015-01-01
Personalized medicine could revolutionize how primary care physicians treat chronic disease and how researchers study fundamental biological questions. To realize this goal we need to develop more robust, modular tools and imaging approaches for in vivo monitoring of analytes. In this report, we demonstrate that synthetic nanosensors can measure physiologic parameters with photoacoustic contrast, and we apply that platform to continuously track lithium levels in vivo. Photoacoustic imaging achieves imaging depths that are unattainable with fluorescence or multiphoton microscopy. We validated the photoacoustic results that illustrate the superior imaging depth and quality of photoacoustic imaging with optical measurements. This powerful combination of techniques will unlock the ability to measure analyte changes in deep tissue and will open up photoacoustic imaging as a diagnostic tool for continuous physiological tracking of a wide range of analytes. PMID:25588028
Hunt, R.J.; Anderson, M.P.; Kelson, V.A.
1998-01-01
This paper demonstrates that analytic element models have potential as powerful screening tools that can facilitate or improve calibration of more complicated finite-difference and finite-element models. We demonstrate how a two-dimensional analytic element model was used to identify errors in a complex three-dimensional finite-difference model caused by incorrect specification of boundary conditions. An improved finite-difference model was developed using boundary conditions developed from a far-field analytic element model. Calibration of a revised finite-difference model was achieved using fewer zones of hydraulic conductivity and lake bed conductance than the original finite-difference model. Calibration statistics were also improved in that simulated base-flows were much closer to measured values. The improved calibration is due mainly to improved specification of the boundary conditions made possible by first solving the far-field problem with an analytic element model.
The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools
ERIC Educational Resources Information Center
Yang, Min; Wong, Stephen C. P.; Coid, Jeremy
2010-01-01
Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…
Data Visualization: An Exploratory Study into the Software Tools Used by Businesses
ERIC Educational Resources Information Center
Diamond, Michael; Mattia, Angela
2017-01-01
Data visualization is a key component to business and data analytics, allowing analysts in businesses to create tools such as dashboards for business executives. Various software packages allow businesses to create these tools in order to manipulate data for making informed business decisions. The focus is to examine what skills employers are…
76 FR 52915 - Periodic Reporting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-24
... proposed changes in certain analytical methods used in periodic reporting. The proposed changes are... assignment of certain flat sorting operations; bias in mixed mail tallies; and Express Mail. Establishing... consider changes in the analytical methods approved for use in periodic reporting.\\1\\ \\1\\ Petition of the...
Strategic, Analytic and Operational Domains of Information Management.
ERIC Educational Resources Information Center
Diener, Richard AV
1992-01-01
Discussion of information management focuses on three main areas of activities and their interrelationship: (1) strategic, including establishing frameworks and principles of operations; (2) analytic, or research elements, including user needs assessment, data gathering, and data analysis; and (3) operational activities, including reference…
Process monitoring and visualization solutions for hot-melt extrusion: a review.
Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas
2014-02-01
Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors for visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and these have been thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour and (2) monitoring and analysing critical product and process parameters for process control, making it possible to maintain a desired process state and guarantee the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and it discusses techniques that have been used in polymer extrusion having potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.
METABOLOMICS AS A DIAGNOSTIC TOOL FOR SMALL FISH TOXICOLOGY RESEARCH
Metabolomics involves the application of advanced analytical and statistical tools to profile changes in levels of endogenous metabolites in tissues and biofluids resulting from disease onset or stress. While certain metabolites are being specifically targeted in these studies, w...
Analytical tools for identifying bicycle route suitability, coverage, and continuity.
DOT National Transportation Integrated Search
2012-05-01
This report presents new tools created to assess bicycle suitability using geographic information systems (GIS). Bicycle suitability is a rating of how appropriate a roadway is for bicycle travel based on attributes of the roadway, such as vehi...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groen, E.A., E-mail: Evelyne.Groen@gmail.com; Heijungs, R.; Leiden University, Einsteinweg 2, Leiden 2333 CC
Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict if including correlations among input parameters in uncertainty propagation will increase or decrease output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
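A minimal sketch of the two propagation routes contrasted above, analytical (first-order, delta-method) versus sampling, for a toy output y = x1·x2 with correlated inputs. The function, means, and covariance are invented; they are not the paper's electricity case study.

```python
# Analytical vs sampling uncertainty propagation with correlated inputs (toy example).
import numpy as np

mu = np.array([10.0, 2.0])                   # mean input parameters
cov = np.array([[1.0, 0.6], [0.6, 0.25]])    # covariance with correlation

f = lambda x: x[0] * x[1]                    # toy model output
grad = np.array([mu[1], mu[0]])              # gradient of f at the mean

# Analytical (delta-method) variance: g' Sigma g, correlation included
var_analytical = grad @ cov @ grad

# Sampling approach: draw correlated inputs, propagate through f
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mu, cov, size=100_000)
var_sampling = np.var(f(samples.T))

# Ignoring correlation: keep only the diagonal of the covariance
var_no_corr = grad @ np.diag(np.diag(cov)) @ grad

print(var_analytical, var_sampling, var_no_corr)  # here ignoring correlation underestimates
```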
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judd, Kathleen S.; Judd, Chaeli; Engel-Cox, Jill A.
This report presents the results of the Gulf of Mexico Regional Collaborative (GoMRC), a year-long project funded by NASA. The GoMRC project was organized around end user outreach activities, a science applications team, and a team for information technology (IT) development. Key outcomes are summarized below for each of these areas. End User Outreach: Successfully engaged federal and state end users in project planning and feedback; With end user input, defined needs and system functional requirements; Conducted demonstration to End User Advisory Committee on July 9, 2007 and presented at Gulf of Mexico Alliance (GOMA) meeting of Habitat Identification committee; Conducted significant engagement of other end user groups, such as the National Estuary Programs (NEP), in the Fall of 2007; Established partnership with SERVIR and Harmful Algal Blooms Observing System (HABSOS) programs and initiated plan to extend HABs monitoring and prediction capabilities to the southern Gulf; Established a science and technology working group with Mexican institutions centered in the State of Veracruz. Key team members include the Federal Commission for the Protection Against Sanitary Risks (COFEPRIS), the Ecological Institute (INECOL), a unit of the National Council for Science and Technology (CONACYT), the Veracruz Aquarium (NOAA's first international Coastal Ecology Learning Center) and the State of Veracruz. The Mexican Navy (critical to coastal studies in the southern Gulf) and other national and regional entities have also been engaged; and Training on use of the SERVIR portal was planned for Fall 2007 in Veracruz, Mexico. Science Applications: Worked with regional scientists to produce conceptual models of submerged aquatic vegetation (SAV) ecosystems; Built a logical framework and tool for ontological modeling of SAV and HABs; Created online guidance for SAV restoration planning; Created model runs which link potential future land use trends, runoff and SAV viability; Analyzed SAV cover change at five other bays in the Gulf of Mexico to demonstrate extensibility of the analytical tools; and Initiated development of a conceptual model for understanding the causes and effects of HABs in the Gulf of Mexico. IT Tool Development: Established a website with the GoMRC web-based tools at www.gomrc.org; Completed development of an ArcGIS-based decision support tool for SAV restoration prioritization decisions, and demonstrated its use in Mobile Bay; Developed a web-based application, called Conceptual Model Explorer (CME), that enables non-GIS users to employ the prioritization model for SAV restoration; Created a CME tool enabling scientists to view existing, and create new, ecosystem conceptual models which can be used to document cause-effect relationships within coastal ecosystems, and offer guidance on management solutions; Adapted the science-driven advanced web search engine, Noesis, to focus on an initial set of coastal and marine resource issues, including SAV and HABs; Incorporated map visualization tools with initial data layers related to coastal wetlands and SAVs; and Supported development of a SERVIR portal for data management and visualization in the southern Gulf of Mexico, as well as training of end users in Mexican Gulf States.
Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data
NASA Astrophysics Data System (ADS)
Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.
2014-12-01
Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). In order for ASDC data scientists to effectively implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with data contained in the datasets with which they are working. Key technologies that are critical components to the background of ASDC data scientists include: large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; as well as processing algorithms. However, as prices of data storage and processing decrease, sources of data increase, and technologies advance - giving more people access to data in real or near-real time - data scientists are being pressured to accelerate their ability to identify and analyze vast amounts of data. With existing tools this is becoming increasingly challenging to accomplish. For example, NASA Earth Science Data and Information System (ESDIS) alone grew from having just over 4 PBs of data in 2009 to nearly 6 PBs of data in 2011. This amount then increased to roughly 10 PBs of data in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to be able to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater towards business data, which is predominantly unstructured. As a result, there are very few known analytics tools that interface well with archived Earth science data, which is predominantly heterogeneous and structured. This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify specific tools that may be able to address those challenges.
Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.
2016-01-01
The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.
Higham, Desmond J.; Batty, Michael; Bettencourt, Luís M. A.; Greetham, Danica Vukadinović; Grindrod, Peter
2017-01-01
We introduce the 14 articles in the Royal Society Open Science themed issue on City Analytics. To provide a high-level, strategic, overview, we summarize the topics addressed and the analytical tools deployed. We then give a more detailed account of the individual contributions. Our overall aims are (i) to highlight exciting advances in this emerging, interdisciplinary field, (ii) to encourage further activity and (iii) to emphasize the variety of new, public-domain, datasets that are available to researchers. PMID:28386454
40 CFR 60.58c - Reporting and recordkeeping requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... conducted to determine compliance with the emissions limits and/or to establish or re-establish operating... were established or re-established, if applicable. (7) All documentation produced as a result of the... Reporting Tool located at http://www.epa.gov/ttn/chief/ert/ert_tool.html. [62 FR 48382, Sept. 15, 1997, as...
40 CFR 60.58c - Reporting and recordkeeping requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... conducted to determine compliance with the emissions limits and/or to establish or re-establish operating... were established or re-established, if applicable. (7) All documentation produced as a result of the... Reporting Tool located at http://www.epa.gov/ttn/chief/ert/ert_tool.html. [62 FR 48382, Sept. 15, 1997, as...
NASA Astrophysics Data System (ADS)
Rohland, Stefanie; Pfurtscheller, Clemens; Seebauer, Sebastian
2016-04-01
Keywords: private preparedness, property protection, flood, heavy rains, Transtheoretical Model, evaluation of methods and tools. Experiences in Europe and Austria from coping with numerous floods and heavy rain events in recent decades point to room for improvement in reducing damages and adverse effects. One of the emerging issues is private preparedness, which has received only sporadic attention in Austria until now. Current activities to promote property protection are, however, not underpinned by a long-term strategy, thus minimizing their cumulative effect. While printed brochures and online information are widely available, innovative information services, tailored to and actively addressing specific target groups, are thin on the ground. This project reviews national as well as international established approaches, with a focus on German-speaking areas, checking their long-term effectiveness with the help of expert workshops and an empirical analysis of survey data. The Transtheoretical Model (Prochaska, 1977) serves as the analytical framework: we assign specific tools to distinct stages of behavioural change. People's openness to absorbing risk information or their willingness to engage in private preparedness depends on an incremental process of considering, appraising, introducing and finally maintaining preventive actions. Based on this stage-specific perspective and the workshop results, gaps of intervention are identified to define best-practice examples and recommendations that can be realized within the prevailing legislative and organizational framework at national, regional and local level in Austria.
Sigma Metrics Across the Total Testing Process.
Charuruks, Navapun
2017-03-01
Laboratory quality control has been developed over several decades to ensure patients' safety, moving from a statistical quality control focus on the analytical phase to the total laboratory process. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through defects per million and the sigma metric equation. Participation in a sigma verification program can be a convenient way to monitor continuous quality improvement of analytical performance. Improvement of sigma-scale performance has been shown from our data. New tools and techniques for integration are needed. Copyright © 2016 Elsevier Inc. All rights reserved.
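The sigma metric equation mentioned above is, in its standard laboratory form (Westgard), sigma = (TEa − |bias|) / CV, with all terms on the percent scale. The sketch below computes it for an invented analyte; the numbers are not taken from the paper.

```python
# Standard laboratory sigma metric; the example values are illustrative.
def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Sigma = (allowable total error - |bias|) / CV, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# e.g. an analyte with TEa = 10%, bias = 1.5%, CV = 2.0%  ->  4.25 sigma
print(sigma_metric(10.0, 1.5, 2.0))
```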
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
A behavior-analytic critique of Bandura's self-efficacy theory
Biglan, Anthony
1987-01-01
A behavior-analytic critique of self-efficacy theory is presented. Self-efficacy theory asserts that efficacy expectations determine approach behavior and physiological arousal of phobics as well as numerous other clinically important behaviors. Evidence which is purported to support this assertion is reviewed. The evidence consists of correlations between self-efficacy ratings and other behaviors. Such response-response relationships do not unequivocally establish that one response causes another. A behavior-analytic alternative to self-efficacy theory explains these relationships in terms of environmental events. Correlations between self-efficacy rating behavior and other behavior may be due to the contingencies of reinforcement that establish a correspondence between such verbal predictions and the behavior to which they refer. Such a behavior-analytic account does not deny any of the empirical relationships presented in support of self-efficacy theory, but it points to environmental variables that could account for those relationships and that could be manipulated in the interest of developing more effective treatment procedures. PMID:22477956
Propellant Chemistry for CFD Applications
NASA Technical Reports Server (NTRS)
Farmer, R. C.; Anderson, P. G.; Cheng, Gary C.
1996-01-01
Current concepts for reusable launch vehicle design have created renewed interest in the use of RP-1 fuels for high pressure and tri-propellant propulsion systems. Such designs require the use of an analytical technology that accurately accounts for the effects of real fluid properties, combustion of large hydrocarbon fuel modules, and the possibility of soot formation. These effects are inadequately treated in current computational fluid dynamic (CFD) codes used for propulsion system analyses. The objective of this investigation is to provide an accurate analytical description of hydrocarbon combustion thermodynamics and kinetics that is sufficiently computationally efficient to be a practical design tool when used with CFD codes such as the FDNS code. A rigorous description of real fluid properties for RP-1 and its combustion products will be derived from the literature and from experiments conducted in this investigation. Upon the establishment of such a description, the fluid description will be simplified by using the minimum of empiricism necessary to maintain accurate combustion analyses and including such empirical models into an appropriate CFD code. An additional benefit of this approach is that the real fluid properties analysis simplifies the introduction of the effects of droplet sprays into the combustion model. Typical species compositions of RP-1 have been identified, surrogate fuels have been established for analyses, and combustion and sooting reaction kinetics models have been developed. Methods for predicting the necessary real fluid properties have been developed and essential experiments have been designed. Verification studies are in progress, and preliminary results from these studies will be presented. The approach has been determined to be feasible, and upon its completion the required methodology for accurate performance and heat transfer CFD analyses for high pressure, tri-propellant propulsion systems will be available.
A study on building data warehouse of hospital information system.
Li, Ping; Wu, Tao; Chen, Mu; Zhou, Bin; Xu, Wei-guo
2011-08-01
Existing hospital information systems with simple statistical functions cannot meet current management needs. Moreover, hospital resources are owned and managed by individual hospitals, which complicates tasks such as the regional coordination of medical services. In this study, to integrate and make full use of medical data effectively, we propose a data warehouse modeling method for the hospital information system. The method can also be employed for a distributed-hospital medical service system. To ensure that hospital information supports the diverse needs of health care, the framework of the hospital information system has three layers: datacenter layer, system-function layer, and user-interface layer. This paper discusses the role of a data warehouse management system in handling hospital information, from the establishment of the data theme to the design of a data model to the establishment of a data warehouse. Online analytical processing tools support user-friendly multidimensional analysis from a number of different angles to extract the required data and information. Use of the data warehouse improves online analytical processing and mitigates deficiencies in the decision support system. The hospital information system based on a data warehouse effectively employs statistical analysis and data mining technology to handle massive quantities of historical data, and summarizes from clinical and hospital information for decision making. This paper proposes the use of a data warehouse for a hospital information system, specifically a data warehouse built around the theme of hospital information, covering the determination of theme, granularity, modeling and so on. The processing of patient information is given as an example that demonstrates the usefulness of this method in the case of hospital information management. Data warehouse technology is still evolving, and extracting more decision-support information by data mining and decision-making technology remains a subject for further research.
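As a concrete picture of the multidimensional analysis the paper attributes to the OLAP layer, the sketch below rolls a toy patient-visit fact table up along two dimensions at once with pandas; the table and column names are hypothetical, not the paper's schema.

```python
# Toy OLAP-style roll-up of a hypothetical patient-visit fact table.
import pandas as pd

visits = pd.DataFrame({
    "department": ["cardiology", "cardiology", "oncology", "oncology"],
    "year":       [2010, 2011, 2010, 2011],
    "cost":       [1200.0, 1350.0, 2100.0, 1980.0],
})

# Aggregate the fact table along the department and year dimensions
cube = visits.pivot_table(values="cost", index="department",
                          columns="year", aggfunc="sum", margins=True)
print(cube)
```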
Patent databases and analytical tools for space technology commercialization (Part 2)
NASA Astrophysics Data System (ADS)
Hulsey, William N., III
2002-07-01
A shift in the space industry has occurred that requires technology developers to understand the basics of intellectual property law; global harmonization facilitates this understanding; internet-based tools enable knowledge of these rights and the facts affecting them.
Geospatial methods and data analysis for assessing distribution of grazing livestock
USDA-ARS?s Scientific Manuscript database
Free-ranging livestock research must begin with a well conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...
Integration of bus stop counts data with census data for improving bus service.
DOT National Transportation Integrated Search
2016-04-01
This research project produced an open source transit market data visualization and analysis tool suite, : The Bus Transit Market Analyst (BTMA), which contains user-friendly GIS mapping and data : analytics tools, and state-of-the-art transit demand...
Redox-capacitor to connect electrochemistry to redox-biology.
Kim, Eunkyoung; Leverage, W Taylor; Liu, Yi; White, Ian M; Bentley, William E; Payne, Gregory F
2014-01-07
It is well-established that redox reactions are integral to biology for energy harvesting (oxidative phosphorylation), immune defense (oxidative burst) and drug metabolism (phase I reactions), yet there is emerging evidence that redox may play broader roles in biology (e.g., redox signaling). A critical challenge is the need for tools that can probe biologically relevant redox interactions simply, rapidly and without the need for a comprehensive suite of analytical methods. We propose that electrochemistry may provide such a tool. In this tutorial review, we describe recent studies with a redox-capacitor film that can serve as a bio-electrode interface that can accept, store and donate electrons from mediators commonly used in electrochemistry and also in biology. Specifically, we (i) describe the fabrication of this redox-capacitor from catechols and the polysaccharide chitosan, (ii) discuss the mechanistic basis for electron exchange, (iii) illustrate the properties of this redox-capacitor and its capabilities for promoting redox-communication between biology and electrodes, and (iv) suggest the potential for enlisting signal processing strategies to "extract" redox information. We believe these initial studies indicate broad possibilities for enlisting electrochemistry and signal processing to acquire "systems level" redox information from biology.
Aßmann, C
2016-06-01
Besides the large effort invested in field work, the provision of valid databases requires a statistical and informational infrastructure that enables long-term access to longitudinal data sets on height, weight and related issues. To foster use of longitudinal data sets within the scientific community, the provision of valid databases also has to address data-protection regulations. It is therefore of major importance to prevent the identification of individuals from publicly available databases. One possible strategy to reach this goal is to provide a synthetic database to the public that allows analysis strategies to be pretested; given approval of the strategy, verification is then based on the original data. Synthetic databases can be established using multiple imputation tools. Multiple imputation by chained equations is illustrated as a means of providing synthetic databases, as it captures a wide range of statistical interdependencies. Missing values, which typically occur in longitudinal databases through item non-response, can also be addressed via multiple imputation when providing databases. The provision of synthetic databases using multiple imputation techniques is one possible strategy to ensure data protection, increase the visibility of longitudinal databases and enhance their analytical potential.
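A minimal sketch of chained-equation imputation with scikit-learn's IterativeImputer: drawing from the predictive posterior under different seeds yields the several completed data sets that multiple imputation requires. The height/weight matrix and its missingness pattern are fabricated for illustration.

```python
# Multiple imputation by chained equations (MICE-style) with scikit-learn.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
X = rng.normal(loc=[120.0, 25.0], scale=[10.0, 4.0], size=(200, 2))  # height, weight
X[rng.random(X.shape) < 0.15] = np.nan  # fabricated ~15% item non-response

# sample_posterior=True draws imputations from the predictive distribution,
# so each seed yields one of m plausible completed data sets.
imputations = [
    IterativeImputer(sample_posterior=True, random_state=m).fit_transform(X)
    for m in range(5)
]
print(len(imputations), imputations[0].shape)
```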
Thomopoulos, N; Grant-Muller, S; Tight, M R
2009-11-01
Interest has re-emerged on the issue of how to incorporate equity considerations in the appraisal of transport projects and large road infrastructure projects in particular. This paper offers a way forward in addressing some of the theoretical and practical concerns that have presented difficulties to date in incorporating equity concerns in the appraisal of such projects. Initially an overview of current practice within transport regarding the appraisal of equity considerations in Europe is offered based on an extensive literature review. Acknowledging the value of a framework approach, research towards introducing a theoretical framework is then presented. The proposed framework is based on the well established MCA Analytic Hierarchy Process and is also contrasted with the use of a CBA based approach. The framework outlined here offers an additional support tool to decision makers who will be able to differentiate choices based on their views on specific equity principles and equity types. It also holds the potential to become a valuable tool for evaluators as a result of the option to assess predefined equity perspectives of decision makers against both the project objectives and the estimated project impacts. This framework may also be of further value to evaluators outside transport.
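For readers unfamiliar with the Analytic Hierarchy Process the framework rests on, the sketch below derives priority weights from a pairwise-comparison matrix via the principal eigenvector and checks Saaty's consistency ratio. The 3×3 judgements stand in for comparisons among equity principles and are invented.

```python
# Core AHP computation: eigenvector weights plus Saaty's consistency check.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])  # invented pairwise comparisons of three criteria

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()               # normalised priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
cr = ci / 0.58                         # Saaty's random index for n = 3
print(weights, cr)                     # CR < 0.1 is conventionally acceptable
```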
Bouhifd, Mounir; Beger, Richard; Flynn, Thomas; Guo, Lining; Harris, Georgina; Hogberg, Helena; Kaddurah-Daouk, Rima; Kamp, Hennicke; Kleensang, Andre; Maertens, Alexandra; Odwin-DaCosta, Shelly; Pamies, David; Robertson, Donald; Smirnova, Lena; Sun, Jinchun; Zhao, Liang; Hartung, Thomas
2017-01-01
Metabolomics promises a holistic phenotypic characterization of biological responses to toxicants. This technology is based on advanced chemical analytical tools with reasonable throughput, including mass spectrometry and NMR. Quality assurance, however - from experimental design, sample preparation, and metabolite identification to bioinformatics data-mining - is urgently needed to assure both the quality of metabolomics data and the reproducibility of biological models. In contrast to microarray-based transcriptomics, where consensus on quality assurance and reporting standards has been fostered over the last two decades, quality assurance of metabolomics is only now emerging. Regulatory use in the safety sciences, and even proper scientific use of these technologies, demands quality assurance. In an effort to promote this discussion, an expert workshop discussed the quality assurance needs of metabolomics. The goals for this workshop were (1) to consider the challenges associated with metabolomics as an emerging science, with an emphasis on its application in toxicology, and (2) to identify the key issues to be addressed in order to establish and implement quality assurance procedures in metabolomics-based toxicology. Consensus has still to be achieved regarding best practices to make sure sound, useful, and relevant information is derived from these new tools. PMID:26536290
Wittmann, Marion E.; Jerde, Christopher L.; Howeth, Jennifer G.; Maher, Sean P.; Deines, Andrew M.; Jenkins, Jill A.; Whitledge, Gregory W.; Burbank, Sarah B.; Chadderton, William L.; Mahon, Andrew R.; Tyson, Jeffrey T.; Gantz, Crysta A.; Keller, Reuben P.; Drake, John M.; Lodge, David M.
2014-01-01
Intentional introductions of nonindigenous fishes are increasing globally. While benefits of these introductions are easily quantified, assessments to understand the negative impacts to ecosystems are often difficult, incomplete, or absent. Grass carp (Ctenopharyngodon idella) was originally introduced to the United States as a biocontrol agent, and recent observations of wild, diploid individuals in the Great Lakes basin have spurred interest in re-evaluating its ecological risk. Here, we evaluate the ecological impact of grass carp using expert opinion and a suite of the most up-to-date analytical tools and data (ploidy assessment, eDNA surveillance, species distribution models (SDMs), and meta-analysis). The perceived ecological impact of grass carp by fisheries experts was variable, ranging from unknown to very high. Wild-caught triploid and diploid individuals occurred in multiple Great Lakes waterways, and eDNA surveillance suggests that grass carp are abundant in a major tributary of Lake Michigan. SDMs predicted suitable grass carp climate occurs in all Great Lakes. Meta-analysis showed that grass carp introductions impact both water quality and biota. Novel findings based on updated ecological impact assessment tools indicate that iterative risk assessment of introduced fishes may be warranted.
Proteomics in food: Quality, safety, microbes, and allergens.
Piras, Cristian; Roncada, Paola; Rodrigues, Pedro M; Bonizzi, Luigi; Soggiu, Alessio
2016-03-01
Food safety and quality and their associated risks pose a major concern worldwide, regarding not only the related economic losses but also the potential danger to consumers' health. Customers' confidence in the integrity of the food supply could be hampered by inappropriate food safety measures. A lack of measures and reliable assays to evaluate and maintain good control of food characteristics may affect the food industry economy and shatter consumer confidence. It is imperative to create and establish fast and reliable analytical methods that allow a good and rapid analysis of food products along the whole food chain. Proteomics can represent a powerful tool to address this issue, owing to its proven excellent quantitative and qualitative capabilities in protein analysis. This review illustrates the applications of proteomics in the past few years in food science, focusing on food of animal origin with some brief hints at other types. The aim of this review is to highlight the importance of this science as a valuable tool to assess food quality and safety. Emphasis is also placed on food processing, allergies, and possible contaminants like bacteria, fungi, and other pathogens. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Julich, S.; Kopinč, R.; Hlawatsch, N.; Moche, C.; Lapanje, A.; Gärtner, C.; Tomaso, H.
2014-05-01
Lab-on-a-chip systems are innovative tools for the detection and identification of microbial pathogens in human and veterinary medicine. The major advantages are small sample volume and a compact design. Several fluidic modules have been developed to transform analytical procedures into miniaturized scale, including sampling, sample preparation, target enrichment, and detection procedures. We present evaluation data for single modules that will be integrated into a chip system for the detection of pathogens. A microfluidic chip for the purification of nucleic acids after cell lysis was established using magnetic beads. This assay was evaluated with spiked environmental aerosol and swab samples. Bacillus thuringiensis was used as a simulant for Bacillus anthracis, which is closely related but non-pathogenic for humans. Stationary PCR and a flow-through PCR chip module were investigated for specific detection of six highly pathogenic bacteria. The conventional PCR assays could be transferred into miniaturized scale using the same temperature/time profile. We could demonstrate that the microfluidic chip modules are suitable for their respective purposes and are promising tools for the detection of bacterial pathogens. Future developments will focus on the integration of these separate modules into an entire lab-on-a-chip system.
Let's Talk Learning Analytics: A Framework for Implementation in Relation to Student Retention
ERIC Educational Resources Information Center
West, Deborah; Heath, David; Huijser, Henk
2016-01-01
This paper presents a dialogical tool for the advancement of learning analytics implementation for student retention in Higher Education institutions. The framework was developed as an outcome of a project commissioned and funded by the Australian Government's "Office for Learning and Teaching". The project took a mixed-method approach…
Making Sense of Game-Based User Data: Learning Analytics in Applied Games
ERIC Educational Resources Information Center
Steiner, Christina M.; Kickmeier-Rust, Michael D.; Albert, Dietrich
2015-01-01
Digital learning games are useful educational tools with high motivational potential. With the application of games for instruction there comes the need of acknowledging learning game experiences also in the context of educational assessment. Learning analytics provides new opportunities for supporting assessment in and of educational games. We…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Yi; Errichello, Robert
2013-08-29
An analytical model is developed to evaluate the design of a spline coupling. For a given torque and shaft misalignment, the model calculates the number of teeth in contact, tooth loads, stiffnesses, stresses, and safety factors. The analytic model provides essential spline coupling design and modeling information and could be easily integrated into gearbox design and simulation tools.
Challenges of Using Learning Analytics Techniques to Support Mobile Learning
ERIC Educational Resources Information Center
Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide
2015-01-01
Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…
A Simplified, General Approach to Simulating from Multivariate Copula Functions
Barry Goodwin
2012-01-01
Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses probability…
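For contrast with the simplified approach the (truncated) abstract announces, here is the textbook baseline it competes with: a Gaussian-copula sampler that pushes correlated normals through the normal CDF and then through the inverse CDFs of the target marginals. The correlation and marginals are illustrative choices, not the paper's.

```python
# Textbook Gaussian-copula simulation (baseline, not the paper's method).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
corr = np.array([[1.0, 0.7], [0.7, 1.0]])

z = rng.multivariate_normal(np.zeros(2), corr, size=10_000)  # correlated normals
u = stats.norm.cdf(z)                                        # uniform marginals
x1 = stats.gamma.ppf(u[:, 0], a=2.0)        # gamma-distributed margin
x2 = stats.expon.ppf(u[:, 1], scale=3.0)    # exponential margin

print(np.corrcoef(x1, x2)[0, 1])  # dependence survives the marginal transforms
```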
Design and Implementation of a Learning Analytics Toolkit for Teachers
ERIC Educational Resources Information Center
Dyckhoff, Anna Lea; Zielke, Dennis; Bultmann, Mareike; Chatti, Mohamed Amine; Schroeder, Ulrik
2012-01-01
Learning Analytics can provide powerful tools for teachers in order to support them in the iterative process of improving the effectiveness of their courses and to collaterally enhance their students' performance. In this paper, we present the theoretical background, design, implementation, and evaluation details of eLAT, a Learning Analytics…
Improvements in analytical methodology have allowed low-level detection of an ever increasing number of pharmaceuticals, personal care products, hormones, pathogens and other contaminants of emerging concern (CECs). The use of these improved analytical tools has allowed researche...
ERIC Educational Resources Information Center
McCurdy, Susan M.; Zegwaard, Karsten E.; Dalgety, Jacinta
2013-01-01
Concept understanding, the development of analytical skills and a research mind set are explored through the use of academic tools common in a tertiary science education and relevant work-integrated learning (WIL) experiences. The use and development of the tools; laboratory book, technical report, and literature review are examined by way of…
Survey of Network Visualization Tools
2007-12-01
[Extraction residue from the survey's tool-comparison tables; the recoverable attributes describe the Daisy tool: dimensionality 2D; deployment type: components for tool building, standalone tool; OS: Windows; extensibility: ActiveX, Visual Basic.] Daisy is fully compliant with Microsoft's ActiveX; therefore, other Windows-based programs can… other functions that improve analytic decision making. Available in ActiveX, C++, Java, and .NET editions. • Tom Sawyer Visualization: Enables you to
Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design
NASA Technical Reports Server (NTRS)
Wuerer, J. E.; Gran, M.; Held, T. W.
1994-01-01
The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.
A study of dynamic SIMS analysis of low-k dielectric materials
NASA Astrophysics Data System (ADS)
Mowat, Ian A.; Lin, Xue-Feng; Fister, Thomas; Kendall, Marius; Chao, Gordon; Yang, Ming Hong
2006-07-01
Dynamic SIMS is an established tool for the characterization of dielectric layers in semiconductors, both for contaminant levels and for composition. As the silicon-based semiconductor industry moves towards the use of copper rather than aluminum, there is also a need to use lower-k dielectric materials to reduce RC delays and to reduce cross-talk between closely spaced metal lines. New dielectric materials pose serious challenges for implementation into semiconductor processes and also for the analytical scientist doing measurements on them. The move from inorganic materials such as SiO2 to organic or carbon-rich low-k materials is a large change for the SIMS analyst. Low-k dielectric films from different sources can be very different materials with different analytical issues. A SIMS challenge for these materials is dealing with their insulating nature and also their fragility, particularly for porous films. These materials can be extremely sensitive to electron beam damage during charge neutralization, leading to difficulties in determining depth scales and introducing unknown errors to secondary ion counts and their subsequent conversion to concentrations. This paper presents details regarding an investigation of the effects of electron beam exposure on a low-k material. These effects and their potential impact on SIMS data will be investigated using FT-IR, TOF-SIMS, AFM and stylus profilometry.
"EMERGING" POLLUTANTS, AND COMMUNICATING THE ...
This paper weaves a multi-dimensioned perspective of mass spectrometry as a career against the backdrop of mass spectrometry's key role in the past and future of environmental chemistry. Along the way, some insights are offered for better focusing the spotlight on the discipline of mass spectrometry. A Foundation for Environmental Science - Mass Spectrometry: Historically fundamental to our understanding of environmental processes and chemical pollution is mass spectrometry. This branch of analytical chemistry is the workhorse which supplies much of the definitive data to environmental scientists and engineers for identifying the molecular compositions, and ultimately the structures, of chemicals. This is not to ignore the complementary and critical roles played by the adjunct practices of sample enrichment (e.g., to lower method detection limits via any of various means of selective extraction) and analyte separation (e.g., to lessen contaminant interferences via the myriad forms of chromatography and electrophoresis). While the power of mass spectrometry has long been highly visible to the practicing environmental chemist, it borders on continued obscurity to the lay public and most non-chemists. Even though mass spectrometry has played a long, historic and largely invisible role in establishing or undergirding our existing knowledge about environmental processes and pollution, what recognition it does enjoy is usually relegated to that of a tool. It is usually
Transient hydrodynamic finite-size effects in simulations under periodic boundary conditions
NASA Astrophysics Data System (ADS)
Asta, Adelchi J.; Levesque, Maximilien; Vuilleumier, Rodolphe; Rotenberg, Benjamin
2017-06-01
We use lattice-Boltzmann and analytical calculations to investigate transient hydrodynamic finite-size effects induced by the use of periodic boundary conditions. These effects are inevitable in simulations at the molecular, mesoscopic, or continuum levels of description. We analyze the transient response to a local perturbation in the fluid and obtain the local velocity correlation function via linear response theory. This approach is validated by comparing the finite-size effects on the steady-state velocity with the known results for the diffusion coefficient. We next investigate the full time dependence of the local velocity autocorrelation function. We find at long times a crossover between the expected t-3 /2 hydrodynamic tail and an oscillatory exponential decay, and study the scaling with the system size of the crossover time, exponential rate and amplitude, and oscillation frequency. We interpret these results from the analytic solution of the compressible Navier-Stokes equation for the slowest modes, which are set by the system size. The present work not only provides a comprehensive analysis of hydrodynamic finite-size effects in bulk fluids, which arise regardless of the level of description and simulation algorithm, but also establishes the lattice-Boltzmann method as a suitable tool to investigate such effects in general.
Garelnabi, Mahdi; Litvinov, Dmitry; Parthasarathy, Sampath
2010-01-01
Background: Azelaic acid (AzA) is the best-known dicarboxylic acid with pharmaceutical benefits and clinical applications; it is also associated with the pathophysiology of some diseases. Materials and Methods: We extracted and methyl-esterified AzA and determined its concentration in human plasma obtained from healthy individuals, and also in mice fed an AzA-containing diet for three months. Results: AzA was detected by gas chromatography (GC) and confirmed by liquid chromatography-mass spectrometry (LC-MS) and gas chromatography-mass spectrometry (GC-MS). Our results show that AzA can be determined efficiently in selected biological samples by the GC method, with a 1 nM limit of detection (LoD); the limit of quantification (LoQ) was established at 50 nM. Analytical sensitivity, assayed in hexane, was 0.050 nM. The method demonstrated 8-10% CV batch repeatability across the sample types and 13-18.9% CV for within-lab precision. The method also showed that AzA can be efficiently recovered from various sample preparations, including liver tissue homogenate (95%) and human plasma (97%). Conclusions: Because of its simplicity and low limit of quantification, the present method provides a useful tool for determining AzA in various biological sample preparations. PMID:22558586
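For readers unfamiliar with the quoted validation figures, the following minimal sketch shows the standard arithmetic behind a %CV and a %recovery; the replicate values are invented, not the study's data.

```python
# Minimal sketch of routine method-validation arithmetic: coefficient of
# variation across replicates and recovery against a known spiked amount.
import statistics

def percent_cv(values: list[float]) -> float:
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def percent_recovery(measured_mean: float, spiked_amount: float) -> float:
    return 100.0 * measured_mean / spiked_amount

replicates_nM = [98.0, 104.0, 95.0, 101.0, 99.0]   # hypothetical batch replicates
print(f"CV = {percent_cv(replicates_nM):.1f}%")
print(f"Recovery = {percent_recovery(statistics.mean(replicates_nM), 100.0):.1f}%")
```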
Experimental and Analytical Determination of the Geometric Far Field for Round Jets
NASA Technical Reports Server (NTRS)
Koch, L. Danielle; Bridges, James E.; Brown, Clifford E.; Khavaran, Abbas
2005-01-01
An investigation was conducted at the NASA Glenn Research Center using a set of three round jets operating under unheated subsonic conditions to address the question: "How close is too close?" Although sound sources are distributed at various distances throughout a jet plume downstream of the nozzle exit, at great distances from the nozzle the sound will appear to emanate from a point and the inverse-square law can be properly applied. Examination of normalized sound spectra at different distances from a jet, from experiments and from computational tools, established the required minimum distance for valid far-field measurements of the sound from subsonic round jets. Experimental data were acquired in the Aeroacoustic Propulsion Laboratory at the NASA Glenn Research Center. The WIND computer program solved the Reynolds-Averaged Navier-Stokes equations for aerodynamic computations; the MGBK jet-noise prediction computer code was used to predict the sound pressure levels. Results from both the experiments and the analytical exercises indicated that while the shortest measurement arc (with radius approximately 8 nozzle diameters) was already in the geometric far field for high-frequency sound (Strouhal number >5), low-frequency sound (Strouhal number <0.2) reached the geometric far field at a measurement radius of at least 50 nozzle diameters because of its extended source distribution.
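The inverse-square law invoked above has a simple computational form: once measurements are in the geometric far field, a spectrum measured at radius r1 can be propagated to r2 by subtracting 20·log10(r2/r1) dB. A minimal sketch, with illustrative values:

```python
# Minimal sketch: inverse-square propagation of sound pressure level (SPL).
# Intensity falls as 1/r^2, so SPL(r2) = SPL(r1) - 20*log10(r2/r1).
import math

def propagate_spl(spl_r1_db: float, r1: float, r2: float) -> float:
    return spl_r1_db - 20.0 * math.log10(r2 / r1)

# e.g., scale an illustrative level at 8 nozzle diameters out to 50 diameters
print(propagate_spl(spl_r1_db=110.0, r1=8.0, r2=50.0))
```

The study's point is that this correction is only valid once the measurement radius exceeds the minimum far-field distance, which is frequency dependent for a jet's distributed sources.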
Lassnig, R.; Striedinger, B.; Hollerer, M.; Fian, A.; Stadlober, B.; Winkler, A.
2015-01-01
The fabrication of organic thin film transistors with highly reproducible characteristics presents a very challenging task. We have prepared and analyzed model pentacene thin film transistors under ultra-high vacuum conditions, employing surface analytical tools and methods. Intentionally contaminating the gold contacts and SiO2 channel area with carbon through repeated adsorption, dissociation, and desorption of pentacene proved to be very advantageous in the creation of devices with stable and reproducible parameters. We mainly focused on the device properties, such as mobility and threshold voltage, as a function of film morphology and preparation temperature. At 300 K, pentacene displays Stranski-Krastanov growth, whereas at 200 K fine-grained, layer-like film growth takes place, which predominantly influences the threshold voltage. Temperature dependent mobility measurements demonstrate good agreement with the established multiple trapping and release model, which in turn indicates a predominant concentration of shallow traps in the crystal grains and at the oxide-semiconductor interface. Mobility and threshold voltage measurements as a function of coverage reveal that up to four full monolayers contribute to the overall charge transport. A significant influence on the effective mobility also stems from the access resistance at the gold contact-semiconductor interface, which is again strongly influenced by the temperature dependent, characteristic film growth mode. PMID:25814770
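The multiple trapping and release (MTR) model cited above is often written, for a single shallow trap level, as μ_eff = μ0 / (1 + (Nt/Nc)·exp(Et/kT)). The sketch below evaluates this standard form; the single-level simplification and all parameter values are assumptions for illustration, not the authors' fit.

```python
# Minimal sketch of a single-trap-level MTR mobility: carriers spend part of
# their time in shallow traps, reducing the band mobility mu0.
import math

K_B_EV = 8.617e-5  # Boltzmann constant, eV/K

def mtr_mobility(mu0_cm2_Vs: float, trap_depth_eV: float,
                 Nt_over_Nc: float, T_K: float) -> float:
    """mu_eff = mu0 / (1 + (Nt/Nc) * exp(Et / kT)) -- assumed illustrative form."""
    return mu0_cm2_Vs / (1.0 + Nt_over_Nc * math.exp(trap_depth_eV / (K_B_EV * T_K)))

for T in (200.0, 250.0, 300.0):  # hypothetical parameters throughout
    print(T, mtr_mobility(mu0_cm2_Vs=1.0, trap_depth_eV=0.1, Nt_over_Nc=0.01, T_K=T))
```

The thermally activated trend this produces (lower effective mobility at lower temperature) is the qualitative behavior the abstract reports.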
Metabolomic Strategies Involving Mass Spectrometry Combined with Liquid and Gas Chromatography.
Lopes, Aline Soriano; Cruz, Elisa Castañeda Santa; Sussulini, Alessandra; Klassen, Aline
2017-01-01
Amongst all omics sciences, there is no doubt that metabolomics has undergone the most significant growth in the last decade. Advances in analytical techniques and data analysis tools are the main factors that have made possible the development and establishment of metabolomics as a significant research field in systems biology. Metabolomic analysis demands high sensitivity for detecting metabolites present at low concentrations in biological samples, high resolving power for identifying the metabolites, and a wide dynamic range to detect metabolites with variable concentrations in complex matrices; mass spectrometry is the analytical technique most extensively used to fulfil these requirements. Mass spectrometry alone can be used in a metabolomic analysis; however, issues such as ion suppression may hamper the quantification or identification of metabolites present at lower concentrations, or of metabolite classes that do not ionise as well as others. The best choice is to couple separation techniques, such as gas or liquid chromatography, to mass spectrometry, in order to improve the sensitivity and resolving power of the analysis, besides obtaining extra information (retention time) that facilitates the identification of the metabolites, especially in untargeted metabolomic strategies. In this chapter, the main aspects of mass spectrometry (MS), liquid chromatography (LC) and gas chromatography (GC) are discussed, and recent clinical applications of LC-MS and GC-MS are also presented.
Platform for Automated Real-Time High Performance Analytics on Medical Image Data.
Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A
2018-03-01
Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.
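As a loose illustration of the job-control pattern described above, a REST submission from the scanner side might look like the sketch below. This is not the documented Agave schema: the endpoint URL, token, and payload fields are hypothetical placeholders.

```python
# Illustrative sketch only: submitting an analysis job to a remote HPC resource
# through a REST API, in the spirit of the Agave-based workflow. All endpoint,
# credential, and payload names below are invented placeholders.
import requests

API_BASE = "https://example.org/jobs/v2"   # placeholder endpoint, not Agave's
TOKEN = "REPLACE_WITH_OAUTH_TOKEN"         # placeholder credential

job = {
    "name": "grape-mri-analysis",          # hypothetical job name
    "appId": "grape-pipeline-1.0",         # hypothetical registered app id
    "inputs": {"scan": "storage://archive/scan42.nii.gz"},  # hypothetical URI
}

resp = requests.post(API_BASE, json=job,
                     headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print(resp.json())  # job id and status would be polled for same-session results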
Moscow Test Well, INEL Oversight Program: Aqueous geochemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCurry, M.; Fromm, J.; Welhan, J.
1992-09-29
This report presents a summary and interpretation of data gathered during sampling of the Moscow Test Well at Moscow, Idaho during April and May of 1992. The principal objectives of this chemical survey were to validate sampling procedures with a new straddle packer sampling tool in a hydrologically well-characterized and simple sampling environment, and to compare results from two independent labs for analytical reproducibility. Analytes included a wide range of metals, anions, nutrients, BNAs, and VOCs. Secondary objectives included analyzing waters from a large distilled water tank (utilized for all field laboratory purposes as "pure" stock water), water which passed through a steamer used to clean the packer, and rinsates from the packer tool itself before it was lowered into the test well. Analyses were also obtained of blanks and spikes for data validation purposes.
Analytical Chemistry in the Regulatory Science of Medical Devices.
Wang, Yi; Guan, Allan; Wickramasekara, Samanthi; Phillips, K Scott
2018-06-12
In the United States, regulatory science is the science of developing new tools, standards, and approaches to assess the safety, efficacy, quality, and performance of all Food and Drug Administration-regulated products. Good regulatory science facilitates consumer access to innovative medical devices that are safe and effective throughout the Total Product Life Cycle (TPLC). Because the need to measure things is fundamental to the regulatory science of medical devices, analytical chemistry plays an important role, contributing to medical device technology in two ways: It can be an integral part of an innovative medical device (e.g., diagnostic devices), and it can be used to support medical device development throughout the TPLC. In this review, we focus on analytical chemistry as a tool for the regulatory science of medical devices. We highlight recent progress in companion diagnostics, medical devices on chips for preclinical testing, mass spectrometry for postmarket monitoring, and detection/characterization of bacterial biofilm to prevent infections.
MASS SPECTROMETRY-BASED METABOLOMICS
Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.
2007-01-01
This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and mass spectrometry in particular has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample, with subsequent classification of samples and identification of differentially expressed metabolites that define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475
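A minimal sketch of the fingerprinting workflow just described, combining unsupervised sample classification (PCA) with univariate testing to flag differential metabolites; the random matrix below stands in for a real peak table.

```python
# Minimal sketch: metabolic fingerprinting as PCA-based classification plus
# per-feature t-tests to identify differentially expressed metabolites.
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 100))           # 20 samples x 100 metabolite features
groups = np.array([0] * 10 + [1] * 10)   # two sample classes
X[groups == 1, :5] += 1.0                # inject a class difference in 5 features

scores = PCA(n_components=2).fit_transform(X)   # low-dimensional class view
t, p = stats.ttest_ind(X[groups == 0], X[groups == 1], axis=0)
print("candidate differential features:", np.where(p < 0.01)[0])
```

In practice the flagged features would be matched against metabolite libraries and databases, the bioinformatics step the review goes on to discuss.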
The use of child drawings to explore the dual↔group analytic field in child analysis.
Molinari, Elena
2013-04-01
Awareness that the child is part of a complex relational system has ensured that all child analysts agree on the necessity of establishing a therapeutic alliance with the parents. Unconscious conflictual dynamics involve the child analyst and include him, from the time of the initial consultation, in an analytic field that is closer to that of a group than to the bi-personal set-up of therapy with adults. Through a clinical example, the author hypothesizes that the child's drawings and play can be viewed as tools capable of mapping the unconscious emotions present in an analytic field that extends beyond the analyst-child couple. Play and drawings can be used in the relationship with the parents not in an explanatory sense, but as a probe with which to explore the universe of unconscious emotions present in the group field. The images, or the story of the play, used in this particular way prove to be an attractive and effective pathway for facilitating the alpha function of each member of the group. In this sense, they also create the conditions for the parents to become more aware of their own unconscious emotions that have been entrusted to the child and expressed through his symptomatology. The possibility of oscillation in a dual↔group field for the small group of subjects involved in a child analysis permits not only a shared experience of knowledge, but also a shared creativity aimed at knowledge of emotional truth (O). Copyright © 2012 Institute of Psychoanalysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ikonomou, M.G.; Crewe, N.F.; Fischer, M.
It has been demonstrated that marine mammals accumulate high concentrations of lipophilic organochlorine contaminants in blubber. As predators at a high trophic level, they have also been used to evaluate contamination in the marine environment. Sampling of living marine mammals using a microsample (100 to 200 mg) biopsy dart technique offers a potentially invaluable means of assessing levels and types of persistent environmental pollutants from a sample in which age, sex and other genetic information can additionally be ascertained. The authors have explored analytical methodology based on a high sensitivity detection system (HRGC/HRMS) which provides multi-residue determinations from biopsy dart microsamples. Lipid content and the concentrations of PCDDs, PCDFs and non-ortho and mono-ortho substituted PCBs were measured in 100 mg biopsy dart replicates taken from a killer whale carcass and in three strata of the blubber of that carcass. Statistically acceptable results were obtained from the dart replicates, and these compared very well with those of the blubber strata. Analytical data from 100 mg extractions of an established in-house blubber CRM also compared well against a series of 2.5 g extractions of that CRM. The extraction and cleanup procedures used also allow for the determination of other organohalogen contaminants such as DDT and other pesticides, all the remaining PCBs, polychlorinated diphenylethers and brominated residues. The strengths and limitations of the analytical methodology, and of the biopsy dart as a sampling tool and pollution predictor, are illustrated in terms of method accuracy and precision, glassware and procedural blanks associated with each extraction batch, and the incorporation of an in-house micro reference standard.
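A minimal sketch of the accuracy and precision bookkeeping mentioned in the closing sentence: per-batch recovery against an in-house reference material and the relative percent difference (RPD) between replicate extractions. All values below are invented.

```python
# Minimal sketch: recovery against a certified reference material (CRM) and
# relative percent difference (RPD) between replicate microsample extractions.
import statistics

def recovery_pct(measured: float, certified: float) -> float:
    return 100.0 * measured / certified

def rpd_pct(a: float, b: float) -> float:
    return 100.0 * abs(a - b) / ((a + b) / 2.0)

crm_certified = 12.0                      # hypothetical CRM value, ng/g
replicates = [11.4, 12.3, 11.9]           # hypothetical 100 mg extractions
print("recovery %:", recovery_pct(statistics.mean(replicates), crm_certified))
print("RPD(rep1, rep2) %:", rpd_pct(replicates[0], replicates[1]))
```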
Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas
2018-04-03
Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. The chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times for inducing changes in water content. Analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is inherently a most promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.
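The per-pixel quantification step described above can be sketched as a PLS regression trained on reference spectra and then applied to every pixel of a hyperspectral cube. Shapes and data below are synthetic placeholders, not the study's calibration set.

```python
# Minimal sketch: PLS regression mapping NIR spectra to water content, applied
# pixel-wise to a hyperspectral image to produce a water-content map.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
ref_spectra = rng.normal(size=(30, 200))       # 30 calibration spectra, 200 bands
ref_water = rng.uniform(0.5, 5.0, size=30)     # reference water content, % w/w

pls = PLSRegression(n_components=3).fit(ref_spectra, ref_water)

cube = rng.normal(size=(64, 64, 200))          # hyperspectral image (H x W x bands)
water_map = pls.predict(cube.reshape(-1, 200)).reshape(64, 64)
print(water_map.shape, float(water_map.mean()))
```

The resulting map is what exposes the within-vial inhomogeneity the authors report, which a single bulk spectrum would average away.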
Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Bonvini, Marco; Piette, Mary Ann
Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings, often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models, to characterize system and building behavior. In contrast, physics-based modeling uses first principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
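A minimal sketch of the hybrid idea, assuming a hypothetical quadratic part-load efficiency curve: metered power is compared with the physics-based expectation and sustained residuals are flagged. The curve coefficients and threshold are illustrative, not the paper's models.

```python
# Minimal sketch: residual-based fault detection against a physics-based
# (part-load efficiency curve) expectation for a cooling plant.
import numpy as np

def expected_power_kw(load_frac: np.ndarray) -> np.ndarray:
    # hypothetical quadratic part-load curve, as might be fit from design data
    return 50.0 * (0.2 + 0.5 * load_frac + 0.3 * load_frac ** 2)

load = np.array([0.4, 0.5, 0.6, 0.7, 0.8])              # part-load fractions
metered_kw = np.array([21.0, 24.5, 29.0, 41.0, 43.5])   # measured power

residual = metered_kw - expected_power_kw(load)
faults = np.abs(residual) > 0.1 * expected_power_kw(load)  # 10% threshold rule
print(residual.round(1), faults)
```

The rule-based threshold is where the data-driven half of a hybrid system would normally take over, e.g., by learning the threshold or the curve itself from operating history.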
Lohr, Cheryl; Passeretto, Kellie; Lohr, Michael; Keighery, Greg
2015-12-01
Along the Pilbara coast of Western Australia (WA) there are approximately 598 islands with a total area of around 500 km². Budget limitations and logistical complexities mean the management of these islands tends to be opportunistic. Until now there has been no review of the establishment and impacts of weeds on Pilbara islands, nor any attempt to prioritise island weed management. In many instances only weed occurrence has been documented, creating a data-deficient environment for management decision making. The purpose of this research was to develop a database of weed occurrences on WA islands and to create a prioritisation process that generates a ranked list of island-weed combinations using currently available data. Here, we describe a model using the pairwise comparison formulae of the Analytical Hierarchy Process (AHP), four metrics describing the logistical difficulty of working on each island (island size, ruggedness, travel time, and tenure), and two well-established measures of the conservation value of an island (maximum representation and effective maximum rarity of eight features). We present the sensitivity of the island-weed rankings to changes in the weights applied to each decision criterion using Kendall's tau statistics. We also present the top 20 ranked island-weed combinations for four modelling scenarios. Many conservation prioritisation tools exist; however, many require extrapolation to fill data gaps and assume specific management objectives and dedicated budgets. To our knowledge, this study is one of only a few attempts to prioritise conservation actions using currently available data in an environment where management may be opportunistic and spasmodic due to budgetary restrictions.
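The AHP step described above reduces to linear algebra: criterion weights are the normalized principal eigenvector of the pairwise comparison matrix, with a consistency ratio as a sanity check. The 4×4 matrix below (ordered as island size, ruggedness, travel time, tenure) is illustrative, not the authors' elicited judgments.

```python
# Minimal sketch of AHP weight derivation: principal eigenvector of a
# reciprocal pairwise comparison matrix, plus Saaty's consistency ratio.
import numpy as np

A = np.array([[1.0, 3.0, 5.0, 7.0],      # illustrative pairwise judgments
              [1/3, 1.0, 3.0, 5.0],
              [1/5, 1/3, 1.0, 3.0],
              [1/7, 1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
i = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, i].real)
weights /= weights.sum()                  # normalized criterion weights

n = A.shape[0]
ci = (eigvals.real[i] - n) / (n - 1)      # consistency index
cr = ci / 0.90                            # random index RI = 0.90 for n = 4
print(weights.round(3), f"CR = {cr:.3f}") # CR < 0.1 is conventionally acceptable
```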
Benej, Martin; Bendlova, Bela; Vaclavikova, Eliska; Poturnajova, Martina
2011-10-06
Reliable and effective primary screening of mutation carriers is a key condition for common diagnostic use. The objective of this study was to validate high resolution melting (HRM) analysis for routine primary mutation screening and to accomplish its optimization and evaluation. Due to their heterozygous nature, germline point mutations of the c-RET proto-oncogene, associated with multiple endocrine neoplasia type 2 (MEN2), are well suited to HRM analysis. Early identification of mutation carriers has a major impact on patients' survival due to the early onset of medullary thyroid carcinoma (MTC) and its resistance to conventional therapy. The authors performed a series of validation assays according to the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) guidelines for validation of analytical procedures, along with appropriate design and optimization experiments. After this validated evaluation, HRM was used for primary screening of 28 pathogenic c-RET mutations distributed among nine exons of the c-RET gene. The validation experiments confirmed the repeatability, robustness, accuracy and reproducibility of HRM. All pathogenic c-RET variants were detected, with no false-positive or false-negative results. The data provide basic information about the design, establishment and validation of HRM for primary screening of genetic variants in order to distinguish heterozygous point mutation carriers from wild-type sequence carriers. HRM analysis is a powerful and reliable tool for rapid and cost-effective primary screening, e.g., of c-RET germline and/or sporadic mutations, and can serve as a first-line diagnostic tool.
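The core HRM computation can be sketched as a negative-derivative melt curve, -dF/dT, whose peak estimates the melting temperature Tm; heterozygous carriers are flagged by shifted or distorted melt profiles relative to wild type. The fluorescence trace below is simulated, not instrument data.

```python
# Minimal sketch: derivative melt-curve analysis. Fluorescence F(T) drops as
# the duplex melts; the peak of -dF/dT estimates Tm.
import numpy as np

temps = np.linspace(70.0, 95.0, 251)                  # temperature ramp, deg C
fluor = 1.0 / (1.0 + np.exp((temps - 84.0) / 0.6))    # synthetic melt, Tm ~ 84

melt_curve = -np.gradient(fluor, temps)               # -dF/dT
tm_estimate = temps[np.argmax(melt_curve)]
print(f"estimated Tm = {tm_estimate:.1f} deg C")
```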
Assessment of SOAP note evaluation tools in colleges and schools of pharmacy.
Sando, Karen R; Skoy, Elizabeth; Bradley, Courtney; Frenzel, Jeanne; Kirwin, Jennifer; Urteaga, Elizabeth
2017-07-01
To describe current methods used to assess SOAP notes in colleges and schools of pharmacy. Members of the American Association of Colleges of Pharmacy Laboratory Instructors Special Interest Group were invited to share assessment tools for SOAP notes. The content of submissions was evaluated to characterize overall qualities and how the tools assessed subjective, objective, assessment, and plan information. Thirty-nine assessment tools from 25 schools were evaluated. Twenty-nine (74%) of the tools were rubrics and ten (26%) were checklists. All rubrics included analytic scoring elements, while two (7%) mixed holistic and analytic scoring elements. A plurality of the rubrics (35%) used a four-item rating scale. Substantial variability existed in how the tools evaluated the subjective and objective sections. All tools included problem identification in the assessment section. Other assessment items included goals (82%) and rationale (69%). Seventy-seven percent assessed drug therapy; however, only 33% assessed non-drug therapy. Other plan items included education (59%) and follow-up (90%). There is a great deal of variation in the specific elements used to evaluate SOAP notes in colleges and schools of pharmacy. Improved consistency in assessment methods to evaluate SOAP notes may better prepare students to produce standardized documentation when entering practice. Copyright © 2017 Elsevier Inc. All rights reserved.
ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress
NASA Technical Reports Server (NTRS)
Kempler, Steven
2015-01-01
The purpose of this poster is to: promote a common understanding of the usefulness of, and activities that pertain to, Data Analytics and, more broadly, the Data Scientist; facilitate collaborations to better understand the cross-usage of heterogeneous datasets and to provide accommodating data analytics expertise, now and as needs evolve into the future; and identify gaps that, once filled, will further collaborative activities. Objectives: provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth Science Data Analytics; bring in guest speakers to describe external efforts and further teach us about the broader use of Data Analytics; and perform activities that:
- compile use cases generated from specific community needs to cross-analyze heterogeneous data
- compile sources of analytics tools, in particular to satisfy the needs of the above data users
- examine gaps between needs and sources
- examine gaps between needs and community expertise
- document specific data analytics expertise needed to perform Earth science data analytics
The cluster also seeks graduate Data Analytics/Data Science student internship opportunities.