Upon the Shoulders of Giants: Open-Source Hardware and Software in Analytical Chemistry.
Dryden, Michael D M; Fobel, Ryan; Fobel, Christian; Wheeler, Aaron R
2017-04-18
Isaac Newton famously observed that "if I have seen further it is by standing on the shoulders of giants." We propose that this sentiment is a powerful motivation for the "open-source" movement in scientific research, in which creators provide everything needed to replicate a given project online, as well as providing explicit permission for users to use, improve, and share it with others. Here, we write to introduce analytical chemists who are new to the open-source movement to best practices and concepts in this area and to survey the state of open-source research in analytical chemistry. We conclude by considering two examples of open-source projects from our own research group, with the hope that a description of the process, motivations, and results will provide a convincing argument about the benefits that this movement brings to both creators and users.
Helios: Understanding Solar Evolution Through Text Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Randazzese, Lucien
This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand curated set of full-text documents from Thomson Scientific and other sources.
World Energy Projection System Plus Model Documentation: Coal Module
2011-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Coal Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Transportation Module
2017-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) International Transportation model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Residential Module
2016-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Residential Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Refinery Module
2016-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Refinery Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Main Module
2016-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Main Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Electricity Module
2017-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Electricity Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
NASA Astrophysics Data System (ADS)
Lee, S. S.; Kim, H. J.; Kim, M. O.; Lee, K.; Lee, K. K.
2016-12-01
A study was performed to find evidence of remediation in monitoring data collected before and after intensive in situ remedial action at a DNAPL-contaminated site in Wonju, Korea, using various quantitative evaluation methods such as mass discharge analysis, tracer data, statistical trend analysis, and analytical solutions. Remediation technologies such as soil vapor extraction, soil flushing, biostimulation, and pump-and-treat have been applied to eliminate the contaminant sources of trichloroethylene (TCE) and to prevent the migration of the TCE plume from remediation target zones. Prior to the remedial action, the concentrations and mass discharges of TCE at all transects were affected by seasonal recharge variation and residual DNAPL sources. After the remediation, the effect was clearly observed at the main source zone and the industrial complex. By tracing a time series of plume evolution, greater variation in TCE concentrations was detected in the plumes near the source zones compared to the relatively stable plumes downstream. The amount of residual source mass removed during the intensive remedial action was estimated with an analytical solution to evaluate the action's efficiency. From the results of this quantitative evaluation, the intensive remedial action is assessed to have removed about 70% of the residual source mass during the remediation period. Analytical solutions that can account for and quantify the impacts of partial mass reduction have proven to be useful tools for quantifying unknown contaminant source mass, verifying dissolved concentrations at the DNAPL-contaminated site, and evaluating remediation efficiency from long-term monitoring data. Acknowledgement: This work was supported by the Korea Ministry of Environment under the GAIA project (173-092-009 and 201400540010) and the "R&D Project on Environmental Management of Geologic CO2 Storage" from KEITI (project number 2014001810003).
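The abstract leans on closed-form transport solutions to relate source-mass reduction to downstream dissolved concentrations. As a hedged illustration only — not the study's site-specific model — the classic Ogata-Banks solution for 1-D advection-dispersion from a constant-concentration source shows the kind of analytical solution involved; the parameter values below are invented for the example.

```python
# Illustrative only: classic Ogata-Banks 1-D advection-dispersion solution
# for a continuous source of concentration C0 at x = 0 (not the study's model).
import numpy as np
from scipy.special import erfc

def ogata_banks(x, t, v, D, C0=1.0):
    """Concentration C(x, t) downgradient of a constant source.
    v: seepage velocity [m/d]; D: longitudinal dispersion coefficient [m^2/d]."""
    arg1 = (x - v * t) / (2.0 * np.sqrt(D * t))
    arg2 = (x + v * t) / (2.0 * np.sqrt(D * t))
    return 0.5 * C0 * (erfc(arg1) + np.exp(v * x / D) * erfc(arg2))

# Example (made-up numbers): relative TCE concentration 50 m downgradient
# after 100 days of steady source dissolution.
print(ogata_banks(x=50.0, t=100.0, v=0.5, D=1.0))
```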
World Energy Projection System Plus Model Documentation: Greenhouse Gases Module
2011-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Greenhouse Gases Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Natural Gas Module
2011-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Natural Gas Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: District Heat Module
2017-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) District Heat Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Industrial Module
2016-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Industrial Model (WIM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
2009-09-01
nuclear industry for conducting performance assessment calculations. The analytical FORTRAN code for the DNAPL source function, REMChlor, was...project. The first was to apply existing deterministic codes, such as T2VOC and UTCHEM, to the DNAPL source zone to simulate the remediation processes...but describe the spatial variability of source zones unlike one-dimensional flow and transport codes that assume homogeneity. The Lagrangian models
Exploring Middle School Students' Use of Inscriptions in Project-Based Science Classrooms
ERIC Educational Resources Information Center
Wu, Hsin-Kai; Krajcik, Joseph S.
2006-01-01
This study explores seventh graders' use of inscriptions in a teacher-designed project-based science unit. To investigate students' learning practices during the 8-month water quality unit, we collected multiple sources of data (e.g., classroom video recordings, student artifacts, and teacher interviews) and employed analytical methods that drew…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Youn, H; Jeon, H; Nam, J
Purpose: To investigate the feasibility of an analytic framework to estimate patients' absorbed dose distribution owing to daily cone-beam CT scans for image-guided radiation treatment. Methods: To compute the total absorbed dose distribution, we separated the framework into primary and scattered dose calculations. Using source parameters such as voltage, current, and bowtie filtration, for the primary dose calculation we simulated the forward projection from the source to each voxel of an imaging object including some inhomogeneous inserts. Then we calculated the primary absorbed dose at each voxel based on the absorption probability deduced from the HU values and Beer's law. In sequence, all voxels constructing the phantom were regarded as secondary sources radiating scattered photons for the scattered dose calculation. Details of the forward projection were identical to those of the previous step. The secondary source intensities were given by using scatter-to-primary ratios provided by NIST. In addition, we compared the analytically calculated dose distribution with Monte Carlo simulation results. Results: The suggested framework for absorbed dose estimation successfully provided the primary and secondary dose distributions of the phantom. Moreover, our analytic dose calculations and the Monte Carlo calculations agreed well with each other, even near the inhomogeneous inserts. Conclusion: This work indicated that our framework can be an effective tool for estimating a patient's exposure owing to cone-beam CT scans for image-guided radiation treatment. Therefore, we expect that a patient's over-exposure during IGRT might be prevented by our framework.
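To make the described two-step framework concrete, here is a minimal 2-D sketch of the primary-dose step only, under our own simplifying assumptions (no bowtie filter, no inverse-square falloff, no HU-to-mu conversion, nearest-neighbour ray sampling); the scattered-dose step would then treat each voxel as a secondary source in the same fashion. This is our reading of the described approach, not the authors' code.

```python
# Primary dose sketch: attenuation line integral from source to each voxel
# (Beer's law), times local absorption. Illustrative assumptions throughout.
import numpy as np

def primary_dose(mu, src, I0=1.0, n_samples=200):
    """mu: 2-D attenuation map [1/voxel]; src: (row, col) source position,
    which may lie outside the grid (out-of-grid samples clip to edge voxels)."""
    ny, nx = mu.shape
    dose = np.zeros_like(mu)
    for iy in range(ny):
        for ix in range(nx):
            ts = np.linspace(0.0, 1.0, n_samples)
            ys = src[0] + ts * (iy - src[0])
            xs = src[1] + ts * (ix - src[1])
            path = mu[np.clip(ys.astype(int), 0, ny - 1),
                      np.clip(xs.astype(int), 0, nx - 1)]
            step = np.hypot(iy - src[0], ix - src[1]) / n_samples
            fluence = I0 * np.exp(-np.sum(path) * step)   # Beer's law
            dose[iy, ix] = fluence * mu[iy, ix]           # local absorption
    return dose

# Example: water-like disk with a denser (bone-like) insert
mu = np.zeros((64, 64)); yy, xx = np.mgrid[:64, :64]
mu[(yy - 32) ** 2 + (xx - 32) ** 2 < 25 ** 2] = 0.02
mu[(yy - 40) ** 2 + (xx - 28) ** 2 < 5 ** 2] = 0.06
d = primary_dose(mu, src=(-100.0, 32.0))
```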
NASA Astrophysics Data System (ADS)
Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.
2017-12-01
Spatiotemporal (ST) analytics applied to major spatiotemporal data sources from vendors such as USGS, NOAA, the World Bank, and the World Health Organization have tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes on a local and global level. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at the Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 16,000+ attributes covering 200+ countries for over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and inform how others may freely access the tool.
NASA Astrophysics Data System (ADS)
Stewart, R.; Piburn, J.; Sorokine, A.; Myers, A.; Moehl, J.; White, D.
2015-07-01
The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings.
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
NASA Astrophysics Data System (ADS)
Mattmann, C. A.
2013-12-01
A wave of open source big data analytic infrastructure is currently shaping government, the private sector, and academia. Projects are consuming, adapting, and contributing back to various ecosystems of software, e.g., the Apache Hadoop project and its ecosystem of related efforts including Hive, HBase, Pig, Oozie, Ambari, Knox, Tez, and Yarn, to name a few; the Berkeley AMPLab stack, which includes Spark, Shark, Mesos, Tachyon, BlinkDB, MLBase, and other emerging efforts; MapR and its related stack of technologies; and offerings from commercial companies building products around these tools, e.g., the Hortonworks Data Platform (HDP), Cloudera's CDH project, etc. Though the technologies all offer different capabilities, including low-latency/in-memory support versus record-oriented file I/O, high availability, and support for the MapReduce programming paradigm or other dataflow/workflow constructs, there is a common thread that binds these products: they are all released under an open source license, e.g., Apache 2, MIT, BSD, GPL/LGPL, etc.; all thrive in various ecosystems, such as Apache or the Berkeley AMPLab; all are developed collaboratively; and all provide plug-in architecture models and methodologies for allowing others to contribute and participate via various community models. This talk will cover the open source and governance aspects of the aforementioned big data ecosystems and point out the differences, subtleties, and implications of those differences. The discussion will proceed by example, using several national deployments and big data initiatives stemming from the Administration, including DARPA's XDATA program, NASA's CMAC program, and NSF's EarthCube and geosciences big data projects. Lessons learned from these efforts in terms of the open source aspects of these technologies will help guide the AGU community in their use, deployment, and understanding.
Remediation Evaluation Model for Chlorinated Solvents (REMChlor)
A new analytical solution has been developed for simulating the transient effects of groundwater source and plume remediation. This development was performed as part of a Strategic Environmental Research and Development Program (SERDP) research project, which was a joint effort ...
Mars Methane Analogue Mission (M3): Analytical Techniques and Operations
NASA Astrophysics Data System (ADS)
Cloutis, E.; Vrionis, H.; Qadi, A.; Bell, J. F.; Berard, G.; Boivin, A.; Ellery, A.; Jamroz, W.; Kruzelecky, R.; Mann, P.; Samson, C.; Stromberg, J.; Strong, K.; Tremblay, A.; Whyte, L.; Wing, B.
2011-03-01
The Mars Methane Analogue Mission (M3) project is designed to simulate a rover-based search for, and analysis of, methane sources on Mars at a serpentinite open pit mine in Quebec, using a variety of instruments.
Integration of bus stop counts data with census data for improving bus service.
DOT National Transportation Integrated Search
2016-04-01
This research project produced an open source transit market data visualization and analysis tool suite, The Bus Transit Market Analyst (BTMA), which contains user-friendly GIS mapping and data analytics tools, and state-of-the-art transit demand...
NASA Astrophysics Data System (ADS)
Manzolaro, Mattia; Meneghetti, Giovanni; Andrighetto, Alberto
2010-11-01
In a facility for the production of radioactive ion beams (RIBs), the target system and the ion source are the most critical objects. In the context of the Selective Production of Exotic Species (SPES) project, a proton beam directly impinges on a uranium carbide production target, generating approximately 10^13 fissions per second. The radioactive isotopes produced by the 238U fissions are then directed to the ion source to acquire a charge state. After that, the radioactive ions obtained are transported electrostatically to the subsequent areas of the facility. In this work the surface ion source currently adopted for the SPES project is studied by means of both analytical and numerical thermal-electric models. The theoretical results are compared with temperature and electric potential difference measurements.
SWMM5 Application Programming Interface and PySWMM: A Python Interfacing Wrapper
In support of the OpenWaterAnalytics open source initiative, the PySWMM project encompasses the development of a Python interfacing wrapper to SWMM5 with parallel ongoing development of the USEPA Stormwater Management Model (SWMM5) application programming interface (API). ...
openECA Platform and Analytics Alpha Test Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Russell
The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.
openECA Platform and Analytics Beta Demonstration Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Russell
The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.
Search Analytics: Automated Learning, Analysis, and Search with Open Source
NASA Astrophysics Data System (ADS)
Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.
2016-12-01
The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a user's accuracy of 82 to 84%." Without reading a single paper we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82-84%, which could have tangible, direct downstream implications for crop protection. Automatically assimilating this information expedites and supplements human analysis, and, ultimately, Search Analytics and its foundation of open source tools will result in more efficient scientific investment and research.
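As a hedged sketch of how such an extraction might be wired together with the named tools — the file name and regex below are our own invention, and this uses the tika-python bindings rather than whatever the project actually deployed:

```python
# Extract text from a PDF with Apache Tika (via tika-python, which auto-starts
# a local Tika server; `pip install tika`), then pull resolution/accuracy
# co-occurrences with a deliberately naive regex.
import re
from tika import parser

parsed = parser.from_file("publication.pdf")   # hypothetical input document
text = parsed.get("content") or ""

pattern = re.compile(
    r"spatial resolution\s*\(\s*(\d+(?:\.\d+)?\s*m)\s*\)"   # e.g. "(1 m)"
    r".{0,300}?accuracy of\s*(\d+(?:\s*to\s*\d+)?)\s*%",    # e.g. "82 to 84"
    re.IGNORECASE | re.DOTALL)
for resolution, accuracy in pattern.findall(text):
    print(resolution, "->", accuracy + "%")
```

In the real pipeline such extractions would feed an index (e.g. Solr) and visualizations (e.g. D3) rather than stdout; the regex stands in for DeepDive-style relation extraction.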
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akyol, Bora A.; Allwardt, Craig H.; Beech, Zachary W.
VOLTTRON is a flexible, reliable, and scalable platform for distributed control and sensing. VOLTTRON serves in four primary roles: • a reference platform for researchers to quickly develop control applications for transactive energy; • a reference platform with flexible data store support for energy analytics applications, either in academia or in commercial enterprise; • a platform from which commercial enterprises can develop products without license issues and easily integrate into their product lines; and • an accelerator to drive industry adoption of transactive energy and advanced building energy analytics. Pacific Northwest National Laboratory, with funding from the U.S. Department of Energy's Building Technologies Office, developed and maintains VOLTTRON as an open-source community project. VOLTTRON source code includes agent execution software; agents that perform critical services that enable and enhance VOLTTRON functionality; and numerous agents that utilize the platform to perform a specific function (fault detection, demand response, etc.). The platform supports energy, operational, and financial transactions between networked entities (equipment, organizations, buildings, grid, etc.) and enhances the control infrastructure of existing buildings through the use of open-source device communication, control protocols, and integrated analytics.
This project will demonstrate ways to detect contaminants by LC/MS technologies in order to protect water systems and environments. Contaminants can affect drinking water usage and limit acceptable sources of ground and reservoir supplies. The analytical method to enhance the s...
Analysing Children's Drawings: Applied Imagination
ERIC Educational Resources Information Center
Bland, Derek
2012-01-01
This article centres on a research project in which freehand drawings provided a richly creative and colourful data source of children's imagined, ideal learning environments. Issues concerning the analysis of the visual data are discussed, in particular, how imaginative content was analysed and how the analytical process was dependent on an…
Analytical Study of 90Sr Betavoltaic Nuclear Battery Performance Based on p-n Junction Silicon
NASA Astrophysics Data System (ADS)
Rahastama, Swastya; Waris, Abdul
2016-08-01
Previously, an analytical calculation of a 63Ni p-n junction betavoltaic battery has been published. As a basic approach, we reproduced the analytical simulation of the 63Ni betavoltaic battery and compared it to the previous results using the same battery design. Furthermore, we calculated its maximum power output and radiation-electricity conversion efficiency using a semiconductor analysis method. Then, the same method was applied to calculate and analyse the performance of a 90Sr betavoltaic battery. The aim of this project is to compare the analytical performance results of the 90Sr betavoltaic battery to the 63Ni betavoltaic battery, and the influence of source activity on performance. Since it has a higher power density, the 90Sr betavoltaic battery yields more power than the 63Ni betavoltaic battery but a lower radiation-electricity conversion efficiency. However, beta particles emitted from the 90Sr source can travel farther inside the silicon, corresponding to the stopping range of the beta particles, so the 90Sr betavoltaic battery could be designed thicker than the 63Ni betavoltaic battery to achieve higher conversion efficiency.
CrossTalk: The Journal of Defense Software Engineering. Volume 24, Number 4, July/August 2011
2011-07-01
Project Management Tool (SSPMT), JASMINE, and ALADDIN, respectively [11, 12]. SSPMT is a web-based Six Sigma project management supporting tool...PSP/TSP data gathered from JASMINE and ALADDIN, SSPMT performs each step of DMAIC and provides analytic results. JASMINE and ALADDIN are web-based...done by using JASMINE. JASMINE collects an individual developer's work product information such as Source Lines of Code (SLOC), fault counts, and
INVESTIGATING ENVIRONMENTAL SINKS OF MACROLIDE ...
Possible environmental sinks (wastewater effluents, biosolids, sediments) of macrolide antibiotics (i.e., azithromycin, roxithromycin, and clarithromycin) are investigated using state-of-the-art analytical chemistry techniques. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at concentrations less than parts per billion, ppb). IAG with USGS ends in FY05. APM 20 due in FY05. Subtask 2: Coordination of interagency research and public outreach activities for PPCPs. Participate on NSTC Health and Environment subcommittee working group on PPCPs. Web site maintenance and expansion, invited technical presentations, invited articles for peer-reviewed journals, interviews
The Independent Technical Analysis Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duberstein, Corey A.; Ham, Kenneth D.; Dauble, Dennis D.
2007-04-13
The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. In the past, regional parties have interacted with a single entity, the Fish Passage Center, to access the data, analyses, and coordination related to fish passage. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities.
GRACKLE: a chemistry and cooling library for astrophysics
NASA Astrophysics Data System (ADS)
Smith, Britton D.; Bryan, Greg L.; Glover, Simon C. O.; Goldbaum, Nathan J.; Turk, Matthew J.; Regan, John; Wise, John H.; Schive, Hsi-Yu; Abel, Tom; Emerick, Andrew; O'Shea, Brian W.; Anninos, Peter; Hummels, Cameron B.; Khochfar, Sadegh
2017-04-01
We present the GRACKLE chemistry and cooling library for astrophysical simulations and models. GRACKLE provides a treatment of non-equilibrium primordial chemistry and cooling for H, D and He species, including H2 formation on dust grains; tabulated primordial and metal cooling; multiple ultraviolet background models; and support for radiation transfer and arbitrary heat sources. The library has an easily implementable interface for simulation codes written in C, C++ and FORTRAN as well as a PYTHON interface with added convenience functions for semi-analytical models. As an open-source project, GRACKLE provides a community resource for accessing and disseminating astrochemical data and numerical methods. We present the full details of the core functionality, the simulation and PYTHON interfaces, testing infrastructure, performance and range of applicability. GRACKLE is a fully open-source project and new contributions are welcome.
Trnka, Radek; Lačev, Alek; Balcar, Karel; Kuška, Martin; Tavel, Peter
2016-01-01
The widely accepted two-dimensional circumplex model of emotions posits that most instances of human emotional experience can be understood within the two general dimensions of valence and activation. Currently, this model is facing some criticism, because complex emotions in particular are hard to define within only these two general dimensions. The present theory-driven study introduces an innovative analytical approach working in a way other than the conventional, two-dimensional paradigm. The main goal was to map and project semantic emotion space in terms of mutual positions of various emotion prototypical categories. Participants (N = 187; 54.5% females) judged 16 discrete emotions in terms of valence, intensity, controllability and utility. The results revealed that these four dimensional input measures were uncorrelated. This implies that valence, intensity, controllability and utility represented clearly different qualities of discrete emotions in the judgments of the participants. Based on these data, we constructed a 3D hypercube projection and compared it with various two-dimensional projections. This contrasting enabled us to detect several sources of bias when working with the traditional, two-dimensional analytical approach. Contrasting two-dimensional and three-dimensional projections revealed that the 2D models provided biased insights about how emotions are conceptually related to one another along multiple dimensions. The results of the present study point out the reductionist nature of the two-dimensional paradigm in the psychological theory of emotions and challenge the widely accepted circumplex model. PMID:27148130
NASA Astrophysics Data System (ADS)
Hoenders, Bernhard J.; Ferwerda, Hedzer A.
1998-09-01
We separate the field generated by a spherically symmetric bounded scalar monochromatic source into a radiative and non-radiative part. The non-radiative part is obtained by projecting the total field on the space spanned by the non-radiating inhomogeneous modes, i.e. the modes which satisfy the inhomogeneous wave equation. Using residue techniques, introduced by Cauchy, we obtain an explicit analytical expression for the non-radiating component. We also identify the part of the source distribution which corresponds to this non-radiating part. The analysis is based on the scalar wave equation.
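For readers unfamiliar with non-radiating sources, the standard characterization consistent with this abstract can be stated compactly; this is our paraphrase of a textbook result, not the paper's derivation.

```latex
% For a bounded monochromatic scalar source q(\mathbf{r}) with field u solving
% the inhomogeneous Helmholtz equation,
\[
  (\nabla^2 + k_0^2)\, u(\mathbf{r}) = -q(\mathbf{r}), \qquad k_0 = \omega/c,
\]
% the source is non-radiating precisely when its Fourier transform vanishes
% on the radiation sphere:
\[
  \tilde{q}(\mathbf{k}) = \int q(\mathbf{r})\, e^{-i\mathbf{k}\cdot\mathbf{r}}\, \mathrm{d}^3 r = 0
  \quad \text{for all } |\mathbf{k}| = k_0 .
\]
% The decomposition u = u_{\mathrm{rad}} + u_{\mathrm{nr}} then amounts to
% projecting u onto the span of the non-radiating inhomogeneous modes.
```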
1988-10-01
[Report form and table-of-contents residue; recoverable section titles: "Preparative-Scale Reverse-Phase LC Fractionation of Polystyrene Homologs"; "Work-Up of..."] ...were also employed. In addition, much of the work was based upon R-45M. However, the fundamental analytical developments and resultant practical
Investigation of Acoustical Shielding by a Wedge-Shaped Airframe
NASA Technical Reports Server (NTRS)
Gerhold, Carl H.; Clark, Lorenzo R.; Dunn, Mark H.; Tweed, John
2006-01-01
Experiments on a scale model of an advanced unconventional subsonic transport concept, the Blended Wing Body (BWB), have demonstrated significant shielding of inlet-radiated noise. A computational model of the shielding mechanism has been developed using a combination of boundary integral equation method (BIEM) and equivalent source method (ESM). The computation models the incident sound from a point source in a nacelle and determines the scattered sound field. In this way the sound fields with and without the airfoil can be estimated for comparison to experiment. An experimental test bed using a simplified wedge-shape airfoil and a broadband point noise source in a simulated nacelle has been developed for the purposes of verifying the analytical model and also to study the effect of engine nacelle placement on shielding. The experimental study is conducted in the Anechoic Noise Research Facility at NASA Langley Research Center. The analytic and experimental results are compared at 6300 and 8000 Hz. These frequencies correspond to approximately 150 Hz on the full scale aircraft. Comparison between the experimental and analytic results is quite good, not only for the noise scattering by the airframe, but also for the total sound pressure in the far field. Many of the details of the sound field that the analytic model predicts are seen or indicated in the experiment, within the spatial resolution limitations of the experiment. Changing nacelle location produces comparable changes in noise shielding contours evaluated analytically and experimentally. Future work in the project will be enhancement of the analytic model to extend the analysis to higher frequencies corresponding to the blade passage frequency of the high bypass ratio ducted fan engines that are expected to power the BWB.
Texas Intense Positron Source (TIPS)
NASA Astrophysics Data System (ADS)
O'Kelly, D.
2003-03-01
The Texas Intense Positron Source (TIPS) is a state-of-the-art variable energy positron beam under construction at the Nuclear Engineering Teaching Laboratory (NETL). Projected intensities on the order of 10^7 e+/second using ^64Cu as the positron source are expected. Owing to its short half-life (t1/2 = 12.8 hrs), plans are to produce the ^64Cu isotope on-site using beam port 1 of the NETL TRIGA Mark II reactor. Following tungsten moderation, the positrons will be electrostatically focused and accelerated from a few tens of eV up to 30 keV. This intensity and energy range should allow routine performance of several analytical techniques of interest to surface scientists (PALS, PADB, and perhaps PAES and LEPD). The TIPS project is being developed in parallel phases. Phase I of the project entails construction of the vacuum system, source chamber, main beam line, and electrostatic/magnetic focusing and transport system, as well as moderator design. Initial construction, testing, and characterization of moderator and beam transport elements are underway and will use a commercially available 10 mCi ^22Na radioisotope as a source of positrons. Phase II of the project is concerned primarily with the Cu source geometry and thermal properties as well as production and physical handling of the radioisotope. Additional instrument optimization based upon experience gained during Phase I will be incorporated in the final design. Current progress of both phases will be presented along with motivations and future directions.
Vertical amplitude phase structure of a low-frequency acoustic field in shallow water
NASA Astrophysics Data System (ADS)
Kuznetsov, G. N.; Lebedev, O. V.; Stepanov, A. N.
2016-11-01
We obtain in integral and analytic form the relations for calculating the amplitude and phase characteristics of an interference structure of orthogonal projections of the oscillation velocity vector in shallow water. For different frequencies and receiver depths, we numerically study the source depth dependences of the effective phase velocities of an equivalent plane wave, the orthogonal projections of the sound pressure phase gradient, and the projections of the oscillation velocity vector. We establish that at low frequencies in zones of interference maxima, independently of source depth, weakly varying effective phase velocity values are observed, which exceed the sound velocity in water by 5-12%. We show that the angles of arrival of the equivalent plane wave and the oscillation velocity vector in the general case differ; however, they virtually coincide in the zone of the interference maximum of the sound pressure under the condition that the horizontal projections of the oscillation velocity appreciably exceed the value of the vertical projection. We give recommendations on using the sound field characteristics in zones with maximum values for solving rangefinding and signal-detection problems.
NASA Technical Reports Server (NTRS)
Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James;
2016-01-01
Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework allows batches of analytics, scanning for new data, to be applied to data as it flows in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources over alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The result products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution. The Matsu Wheel allows many shared data services to be performed together to efficiently use resources for processing hyperspectral satellite image data and other large datasets, e.g., environmental datasets that may be analyzed for many purposes.
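A schematic sketch of the Wheel pattern as described — each batch is read and preprocessed once, then every registered analytic is applied to it. All names below are ours for illustration, not Project Matsu's actual API.

```python
# Minimal "wheel" pattern: scan once, apply every slotted-in analytic.
from typing import Callable, Dict, List

class Wheel:
    def __init__(self, scanner: Callable[[], List[dict]]):
        self.scanner = scanner                   # yields new, preprocessed batches
        self.analytics: Dict[str, Callable[[dict], dict]] = {}

    def register(self, name: str, analytic: Callable[[dict], dict]) -> None:
        self.analytics[name] = analytic          # slot a new analytic into the wheel

    def turn(self) -> List[dict]:
        reports = []
        for batch in self.scanner():             # each batch is accessed exactly once
            for name, analytic in self.analytics.items():
                reports.append({"analytic": name, **analytic(batch)})
        return reports

# Usage: register an anomaly detector and a land-cover classifier, then turn.
wheel = Wheel(scanner=lambda: [{"scene": "EO1-scene-001", "bands": 242}])
wheel.register("anomaly", lambda b: {"scene": b["scene"], "anomalies": 0})
wheel.register("landcover", lambda b: {"scene": b["scene"], "water_frac": 0.12})
print(wheel.turn())
```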
Analytic reconstruction algorithms for triple-source CT with horizontal data truncation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Ming; Yu, Hengyong, E-mail: hengyong-yu@ieee.org
2015-10-15
Purpose: This paper explores a triple-source imaging method with horizontal data truncation to enlarge the field of view (FOV) for big objects. Methods: The study is conducted by using theoretical analysis, mathematical deduction, and numerical simulations. The proposed algorithms are implemented in C++ and MATLAB. While the basic platform is constructed in MATLAB, the computationally intensive segments are coded in C++, which are linked via a MEX interface. Results: A triple-source circular scanning configuration with horizontal data truncation is developed, where three pairs of x-ray sources and detectors are unevenly distributed on the same circle to cover the whole imaging object. For this triple-source configuration, a fan-beam filtered backprojection-type algorithm is derived for truncated full-scan projections without data rebinning. The algorithm is also extended for horizontally truncated half-scan projections and cone-beam projections in a Feldkamp-type framework. Using this method, the FOV is enlarged twofold to threefold to scan bigger objects with high speed and quality. The numerical simulation results confirm the correctness and effectiveness of the developed algorithms. Conclusions: The triple-source scanning configuration with horizontal data truncation can not only keep most of the advantages of a traditional multisource system but also cover a larger FOV for big imaging objects. In addition, because the filtering is shift-invariant, the proposed algorithms are very fast and easily parallelized on graphic processing units.
Analytic reconstruction algorithms for triple-source CT with horizontal data truncation.
Chen, Ming; Yu, Hengyong
2015-10-01
This paper explores a triple-source imaging method with horizontal data truncation to enlarge the field of view (FOV) for big objects. The study is conducted by using theoretical analysis, mathematical deduction, and numerical simulations. The proposed algorithms are implemented in C++ and MATLAB. While the basic platform is constructed in MATLAB, the computationally intensive segments are coded in C++, which are linked via a MEX interface. A triple-source circular scanning configuration with horizontal data truncation is developed, where three pairs of x-ray sources and detectors are unevenly distributed on the same circle to cover the whole imaging object. For this triple-source configuration, a fan-beam filtered backprojection-type algorithm is derived for truncated full-scan projections without data rebinning. The algorithm is also extended for horizontally truncated half-scan projections and cone-beam projections in a Feldkamp-type framework. Using this method, the FOV is enlarged twofold to threefold to scan bigger objects with high speed and quality. The numerical simulation results confirm the correctness and effectiveness of the developed algorithms. The triple-source scanning configuration with horizontal data truncation can not only keep most of the advantages of a traditional multisource system but also cover a larger FOV for big imaging objects. In addition, because the filtering is shift-invariant, the proposed algorithms are very fast and easily parallelized on graphic processing units.
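The triple-source fan-beam algorithm itself is beyond a short sketch, but the filtered backprojection structure it builds on can be illustrated in the simpler parallel-beam setting. This is a minimal NumPy sketch under idealized assumptions, not the authors' implementation.

```python
# Parallel-beam FBP: ramp-filter each projection in the Fourier domain,
# then smear (backproject) it along its view angle.
import numpy as np

def fbp(sinogram, thetas):
    """sinogram: (n_angles, n_det) array; thetas: view angles in radians."""
    n_angles, n_det = sinogram.shape
    freqs = np.fft.fftfreq(n_det)
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) *
                                   np.abs(freqs), axis=1))    # Ram-Lak filter
    mid = n_det // 2
    xs = np.arange(n_det) - mid
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((n_det, n_det))
    for proj, th in zip(filtered, thetas):
        t = X * np.cos(th) + Y * np.sin(th)       # detector coordinate per pixel
        idx = np.clip(np.round(t).astype(int) + mid, 0, n_det - 1)
        recon += proj[idx]                        # nearest-neighbour backprojection
    return recon * np.pi / n_angles

# Example: reconstruct a uniform disk from its analytic sinogram
n = 128
thetas = np.linspace(0, np.pi, 180, endpoint=False)
s = np.arange(n) - n // 2
sino = np.array([2 * np.sqrt(np.clip(30.0 ** 2 - s ** 2, 0, None))
                 for _ in thetas])                # chord lengths of a radius-30 disk
img = fbp(sino, thetas)
```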
Krumholz, Harlan M
2014-07-01
Big data in medicine--massive quantities of health care data accumulating from patients and populations and the advanced analytics that can give those data meaning--hold the prospect of becoming an engine for the knowledge generation that is necessary to address the extensive unmet information needs of patients, clinicians, administrators, researchers, and health policy makers. This article explores the ways in which big data can be harnessed to advance prediction, performance, discovery, and comparative effectiveness research to address the complexity of patients, populations, and organizations. Incorporating big data and next-generation analytics into clinical and population health research and practice will require not only new data sources but also new thinking, training, and tools. Adequately utilized, these reservoirs of data can be a practically inexhaustible source of knowledge to fuel a learning health care system.
NASA Astrophysics Data System (ADS)
Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.
2017-10-01
Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human-computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states, and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and inform how others may freely access the tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siman, W; Kappadath, S
2014-06-01
Purpose: Some common methods to solve for deadtime are (1) the dual-source method, which assumes two equal activities; (2) model fitting, which requires multiple acquisitions as the source decays; and (3) the lossless model, which assumes no deadtime loss at low count rates. We propose a new analytic alternative solution to calculate deadtime for a paralyzable gamma camera. Methods: Deadtime T can be calculated analytically from two distinct observed count rates M1 and M2 when the ratio of the true count rates alpha=N2/N1 is known. Alpha can be measured as a ratio of two measured activities using dose calibrators or via radioactive decay. Knowledge of alpha creates a system with 2 equations and 2 unknowns, i.e., T and N1. To verify the validity of the proposed method, projections of a non-uniform phantom (4 GBq 99mTc) were acquired using a Siemens Symbia S multiple times over 48 hours. Each projection has >100 kcts. The deadtime for each projection was calculated by fitting the data to a paralyzable model and also by using the proposed 2-acquisition method. The two estimates of deadtime were compared using the Bland-Altman method. In addition, the dependency of uncertainty in T on uncertainty in alpha was investigated for several imaging conditions. Results: The results strongly suggest that the 2-acquisition method is equivalent to the fitting method. The Bland-Altman analysis yielded a mean difference in deadtime estimate of ∼0.076 µs (95% CI: -0.049 µs, 0.103 µs) between the 2-acquisition and model fitting methods. The 95% limits of agreement were calculated to be -0.104 to 0.256 µs. The uncertainty in deadtime calculated using the proposed method is highly dependent on the uncertainty in the ratio alpha. Conclusion: The 2-acquisition method was found to be equivalent to the parameter fitting method. The proposed method offers a simpler and more practical way to analytically solve for a paralyzable detector deadtime, especially during physics testing.
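The two-equation system described (paralyzable model M = N·exp(−N·T) with known ratio alpha = N2/N1) can be solved numerically in a few lines. This is our own sketch of the stated math, not the authors' code.

```python
# Two-acquisition deadtime solution for a paralyzable detector.
import numpy as np
from scipy.optimize import fsolve

def solve_deadtime(M1, M2, alpha):
    """Return [N1, T] such that M1 = N1*exp(-N1*T) and
    M2 = alpha*N1*exp(-alpha*N1*T). The paralyzable model is two-valued;
    the initial guess selects the physical low-rate branch (N1 near M1)."""
    def residuals(v):
        N1, T = v
        return [N1 * np.exp(-N1 * T) - M1,
                alpha * N1 * np.exp(-alpha * N1 * T) - M2]
    return fsolve(residuals, x0=[M1, 1e-7])

# Synthetic check: true N1 = 2e5 cps, T = 1 us, alpha = 0.5
N1_true, T_true, alpha = 2.0e5, 1.0e-6, 0.5
M1 = N1_true * np.exp(-N1_true * T_true)
M2 = alpha * N1_true * np.exp(-alpha * N1_true * T_true)
print(solve_deadtime(M1, M2, alpha))   # ~ [2.0e5, 1.0e-6]
```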
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Robert N; Piburn, Jesse O; Sorokine, Alexandre
The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings.
PYCHEM: a multivariate analysis package for python.
Jarvis, Roger M; Broadhurst, David; Johnson, Helen; O'Boyle, Noel M; Goodacre, Royston
2006-10-15
We have implemented a multivariate statistical analysis toolbox, with an optional standalone graphical user interface (GUI), using the Python scripting language. This is a free and open source project that addresses the need for a multivariate analysis toolbox in Python. Although the functionality provided does not cover the full range of multivariate tools that are available, it has a broad complement of methods that are widely used in the biological sciences. In contrast to tools like MATLAB, PyChem 2.0.0 is easily accessible and free, allows for rapid extension using a range of Python modules and is part of the growing amount of complementary and interoperable scientific software in Python based upon SciPy. One of the attractions of PyChem is that it is an open source project and so there is an opportunity, through collaboration, to increase the scope of the software and to continually evolve a user-friendly platform that has applicability across a wide range of analytical and post-genomic disciplines. http://sourceforge.net/projects/pychem
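As a flavor of the staple multivariate methods such a toolbox wraps, here is principal component analysis via the SVD in plain NumPy; this is a generic sketch, not PyChem's own API.

```python
# PCA by singular value decomposition of the mean-centered data matrix.
import numpy as np

def pca(X, n_components=2):
    """Rows = samples (e.g., spectra), columns = variables."""
    Xc = X - X.mean(axis=0)                      # mean-center each variable
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]   # sample coordinates
    loadings = Vt[:n_components]                      # variable contributions
    explained = (S ** 2) / np.sum(S ** 2)             # variance fractions
    return scores, loadings, explained[:n_components]

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 200))                   # 50 samples x 200 variables
scores, loadings, evr = pca(X)
```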
Active Control of Inlet Noise on the JT15D Turbofan Engine
NASA Technical Reports Server (NTRS)
Smith, Jerome P.; Hutcheson, Florence V.; Burdisso, Ricardo A.; Fuller, Chris R.
1999-01-01
This report presents the key results obtained by the Vibration and Acoustics Laboratories at Virginia Tech over the year from November 1997 to December 1998 on the Active Noise Control of Turbofan Engines research project funded by NASA Langley Research Center. The concept of implementing active noise control techniques with fuselage-mounted error sensors is investigated both analytically and experimentally. The analytical part of the project involves the continued development of an advanced modeling technique to provide prediction and design guidelines for application of active noise control techniques to large, realistic high bypass engines of the type on which active control methods are expected to be applied. Results from the advanced analytical model are presented that show the effectiveness of the control strategies, and the analytical results presented for fuselage error sensors show good agreement with the experimentally observed results and provide additional insight into the control phenomena. Additional analytical results are presented for active noise control used in conjunction with a wavenumber sensing technique. The experimental work is carried out on a running JT15D turbofan jet engine in a test stand at Virginia Tech. The control strategy used in these tests was the feedforward Filtered-X LMS algorithm. The control inputs were supplied by single and multiple circumferential arrays of acoustic sources equipped with neodymium iron cobalt magnets mounted upstream of the fan. The reference signal was obtained from an inlet mounted eddy current probe. The error signals were obtained from a number of pressure transducers flush-mounted in a simulated fuselage section mounted in the engine test cell. The active control methods are investigated when implemented with the control sources embedded within the acoustically absorptive material on a passively-lined inlet. The experimental results show that the combination of active control techniques with fuselage-mounted error sensors and passive control techniques is an effective means of reducing radiated noise from turbofan engines. Strategic selection of the location of the error transducers is shown to be effective for reducing the radiation towards particular directions in the farfield. An analytical model is used to predict the behavior of the control system and to guide the experimental design configurations, and the analytical results presented show good agreement with the experimentally observed results.
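A minimal single-channel sketch of the feedforward filtered-x LMS update named above — illustrative only; the engine tests used a multichannel implementation with real hardware, and the toy secondary path below is our own assumption.

```python
# Single-channel FxLMS: the reference is filtered through an estimate of the
# secondary path (control source -> error sensor) before the LMS update.
import numpy as np
from scipy.signal import lfilter

def fxlms(x, d, s_hat, L=32, mu=1e-2):
    """x: reference signal; d: disturbance at the error sensor;
    s_hat: estimated secondary-path impulse response (FIR)."""
    w = np.zeros(L)                        # adaptive control filter
    fx = lfilter(s_hat, [1.0], x)          # filtered reference
    xbuf = np.zeros(L); fxbuf = np.zeros(L); ybuf = np.zeros(len(s_hat))
    e = np.zeros(len(x))
    for n in range(len(x)):
        xbuf = np.roll(xbuf, 1);  xbuf[0] = x[n]
        fxbuf = np.roll(fxbuf, 1); fxbuf[0] = fx[n]
        y = w @ xbuf                       # anti-noise sample
        ybuf = np.roll(ybuf, 1); ybuf[0] = y
        e[n] = d[n] + s_hat @ ybuf         # residual heard at the error sensor
        w -= mu * e[n] * fxbuf             # filtered-x LMS update
    return w, e

# Example: cancel a 200 Hz tone seen through a one-sample-delay secondary path
fs, f0 = 8000, 200
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * f0 * t)                 # reference (cf. the eddy probe)
s_hat = np.array([0.0, 1.0])                   # toy secondary path (assumption)
d = 0.8 * np.sin(2 * np.pi * f0 * t + 0.5)     # disturbance at the error mic
w, e = fxlms(x, d, s_hat)                      # e decays as the filter adapts
```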
NASA Astrophysics Data System (ADS)
Kingston, Andrew M.; Myers, Glenn R.; Latham, Shane J.; Li, Heyang; Veldkamp, Jan P.; Sheppard, Adrian P.
2016-10-01
With GPU computing becoming mainstream, iterative tomographic reconstruction (IR) is becoming a computationally viable alternative to traditional single-shot analytical methods such as filtered back-projection. IR liberates one from the continuous X-ray source trajectories required for analytical reconstruction. We present a family of novel X-ray source trajectories for large-angle CBCT. These discrete (sparsely sampled) trajectories optimally fill the space of possible source locations by maximising the degree of mutually independent information. They satisfy a discrete equivalent of Tuy's sufficiency condition and allow high cone-angle (high-flux) tomography. The highly isotropic nature of the trajectory has several advantages: (1) The average source distance is approximately constant throughout the reconstruction volume, thus avoiding the differential-magnification artifacts that plague high cone-angle helical computed tomography; (2) Reduced streaking artifacts due to, e.g., X-ray beam-hardening; (3) Misalignment and component motion manifest as blur in the tomogram rather than double-edges, which is easier to automatically correct; (4) An approximately shift-invariant point-spread-function, which enables filtering as a pre-conditioner to speed IR convergence. We describe these space-filling trajectories and demonstrate their above-mentioned properties compared with a traditional helical trajectory.
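The paper's specific space-filling trajectories are not reproduced in this abstract. As a hedged illustration of the underlying idea, discrete, near-uniformly spread source positions on a sphere, one common construction is the golden-angle (Fibonacci) lattice:

```python
import numpy as np

def fibonacci_sphere(n):
    """n near-uniformly distributed unit vectors (golden-angle lattice)."""
    k = np.arange(n)
    phi = (1 + 5**0.5) / 2                  # golden ratio
    theta = 2 * np.pi * k / phi             # azimuth
    z = 1 - (2 * k + 1) / n                 # uniform in cos(polar angle)
    r = np.sqrt(1 - z**2)
    return np.stack([r*np.cos(theta), r*np.sin(theta), z], axis=1)

pts = fibonacci_sphere(200)   # candidate discrete X-ray source positions
print(pts.shape, np.allclose(np.linalg.norm(pts, axis=1), 1.0))
```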
NASA Astrophysics Data System (ADS)
Macleod, Christopher Kit; Braga, Joao; Arts, Koen; Ioris, Antonio; Han, Xiwu; Sripada, Yaji; van der Wal, Rene
2016-04-01
The number of local, national and international networks of online environmental sensors is rapidly increasing. Where environmental data are made available online for public consumption, there is a need to advance our understanding of the relationships between the supply of and the different demands for such information. Understanding how individuals and groups of users are using online information resources may provide valuable insights into their activities and decision making. As part of the 'dot.rural wikiRivers' project we investigated the potential of web analytics and an online survey to generate insights into the use of a national network of river level data from across Scotland. These sources of online information were collected alongside phone interviews with volunteers sampled from the online survey, and interviews with providers of online river level data, as part of a larger project that set out to help improve the communication of Scotland's online river data. Our web analytics analysis was based on over 100 online sensors maintained by the Scottish Environmental Protection Agency (SEPA). Through use of Google Analytics data accessed via the R Ganalytics package we assessed: whether the quality of data provided by the free Google Analytics service is good enough for research purposes; what sensors were being used, when and where; how the nature and pattern of sensor data may affect web traffic; and whether we can identify and profile users based on information from traffic sources. Web analytics data consist of a series of quantitative metrics which capture and summarize various dimensions of the traffic to a certain web page or set of pages. Examples of commonly used metrics include the number of total visits to a site and the number of total page views. Our analyses of the traffic sources from 2009 to 2011 identified several major user groups. To improve our understanding of how the use of this national network of river level data may provide insights into the interactions between individuals and their usage of hydrological information, we ran an online survey linked to the SEPA river level pages for one year. We collected over 2000 complete responses to the survey. The survey included questions on user activities and the importance of river level information for those activities, alongside questions on what additional information users drew on in their decision making (e.g., precipitation) and when and what river pages they visited. In this presentation we will present results from our analysis of the web analytics and online survey, and the insights they provide into understanding user groups of this national network of river level data.
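For readers unfamiliar with such metrics, a toy aggregation in Python/pandas might look like the following (the actual study used Google Analytics data via the R Ganalytics package; the table below is entirely hypothetical):

```python
import pandas as pd

# Hypothetical export of per-day, per-page traffic (Google Analytics-style)
log = pd.DataFrame({
    "date": pd.to_datetime(["2010-05-01", "2010-05-01", "2010-05-02"]),
    "page": ["/river/tay", "/river/spey", "/river/tay"],
    "visits": [120, 45, 98],
    "pageviews": [310, 80, 240],
})

totals = log[["visits", "pageviews"]].sum()          # site-wide metrics
per_page = log.groupby("page")[["visits", "pageviews"]].sum()
print(totals, per_page, sep="\n\n")
```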
Survivability as a Tool for Evaluating Open Source Software
2015-06-01
the thesis limited the program development, so it is only able to process project issues (bugs or feature requests), which is an important metric for... Ideally, these insights may provide an analytic framework to generate guidance for decision makers that may support the inclusion of OSS to more... refine their efforts to build quality software and to strengthen their software development communities. 1.4 Research Questions This thesis addresses
A New Project-Based Lab for Undergraduate Environmental and Analytical Chemistry
ERIC Educational Resources Information Center
Adami, Gianpiero
2006-01-01
A new project-based lab was developed for third year undergraduate chemistry students based on real-world applications. The experience suggests that the total analytical procedure (TAP) project offers a stimulating alternative for delivering science skills and developing a greater interest in analytical chemistry and environmental sciences and…
Hanford analytical sample projections FY 1998--FY 2002
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joyce, S.M.
1998-02-12
Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix and program. Analyses requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services and miscellaneous Hanford support activities. In addition, details on laboratory scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.
Cölfen, Helmut; Laue, Thomas M; Wohlleben, Wendel; Schilling, Kristian; Karabudak, Engin; Langhorst, Bradley W; Brookes, Emre; Dubbs, Bruce; Zollars, Dan; Rocco, Mattia; Demeler, Borries
2010-02-01
Progress in analytical ultracentrifugation (AUC) has been hindered by obstructions to hardware innovation and by software incompatibility. In this paper, we announce and outline the Open AUC Project. The goals of the Open AUC Project are to stimulate AUC innovation by improving instrumentation, detectors, acquisition and analysis software, and collaborative tools. These improvements are needed for the next generation of AUC-based research. The Open AUC Project combines ongoing work from several different groups. A new base instrument is described, one that is designed from the ground up to be an analytical ultracentrifuge. This machine offers an open architecture, hardware standards, and application programming interfaces for detector developers. All software will use the GNU General Public License to assure that intellectual property is available in open source format. The Open AUC strategy facilitates collaborations, encourages sharing, and eliminates the chronic impediments that have plagued AUC innovation for the last 20 years. This ultracentrifuge will be equipped with multiple and interchangeable optical tracks so that state-of-the-art electronics and improved detectors will be available for a variety of optical systems. The instrument will be complemented by a new rotor, enhanced data acquisition and analysis software, as well as collaboration software. Described here are the instrument, the modular software components, and a standardized database that will encourage and ease integration of data analysis and interpretation software.
Earthdata Cloud Analytics Project
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Lynnes, Chris
2018-01-01
This presentation describes a nascent project in NASA to develop a framework to support end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observation System Data and Information Systems) data to the cloud is to position the data next to enormous computing capacity to allow end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and the integration with non-NASA analytics or other complementary functionality at other agencies and in other nations.
Improving Cancer Detection and Dose Efficiency in Dedicated Breast Cancer CT
2010-02-01
source trajectory and data truncation, which can however be solved with the back-projection filtration (BPF) algorithm [6,7]. I have used the BPF... high to low radiation dose levels. I have investigated noise properties in images reconstructed by use of FDK and BPF algorithms at different noise... analytic algorithms such as the FDK and BPF algorithms are applied to sparse-view data, the reconstruction images will contain artifacts such as streak
2013-07-01
water resource project planning and management; the authors also seek to identify any research needs to accommodate that goal. This technical note and... review of the state of the science of EGS and highlights the types of analytical tools, techniques, and considerations that would be needed within a...
ERIC Educational Resources Information Center
Toh, Chee-Seng
2007-01-01
A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.
Analytical solutions describing the time-dependent DNAPL source-zone mass and contaminant discharge rate are used as a flux-boundary condition in a semi-analytical contaminant transport model. These analytical solutions assume a power relationship between the flow-averaged sourc...
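A minimal sketch of this class of model follows, assuming the common power-law form Cs/C0 = (M/M0)^Γ for the flow-averaged source concentration, with discharge Q depleting the source mass; all parameter values below are illustrative, not taken from the abstract:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative (hypothetical) parameters
M0, C0, Q, gamma = 100.0, 50.0e-3, 10.0, 1.0   # kg, kg/m3, m3/d, exponent

def dMdt(t, M):
    C = C0 * (max(M[0], 0.0) / M0) ** gamma    # power-law source model
    return [-Q * C]                            # discharge depletes the mass

sol = solve_ivp(dMdt, (0, 5000), [M0], dense_output=True, max_step=10.0)
t = np.linspace(0, 5000, 6)
print(np.c_[t, sol.sol(t).T])                  # source mass vs. time (days)
```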
Hawkeye and AMOS: visualizing and assessing the quality of genome assemblies
Schatz, Michael C.; Phillippy, Adam M.; Sommer, Daniel D.; Delcher, Arthur L.; Puiu, Daniela; Narzisi, Giuseppe; Salzberg, Steven L.; Pop, Mihai
2013-01-01
Since its launch in 2004, the open-source AMOS project has released several innovative DNA sequence analysis applications including: Hawkeye, a visual analytics tool for inspecting the structure of genome assemblies; the Assembly Forensics and FRCurve pipelines for systematically evaluating the quality of a genome assembly; and AMOScmp, the first comparative genome assembler. These applications have been used to assemble and analyze dozens of genomes ranging in complexity from simple microbial species through mammalian genomes. Recent efforts have been focused on enhancing support for new data characteristics brought on by second- and now third-generation sequencing. This review describes the major components of AMOS in light of these challenges, with an emphasis on methods for assessing assembly quality and the visual analytics capabilities of Hawkeye. These interactive graphical aspects are essential for navigating and understanding the complexities of a genome assembly, from the overall genome structure down to individual bases. Hawkeye and AMOS are available open source at http://amos.sourceforge.net. PMID:22199379
NASA Astrophysics Data System (ADS)
Li, Dan; Xu, Feng; Jiang, Jing Fei; Zhang, Jian Qiu
2017-12-01
In this paper, a biquaternion beamspace, constructed by projecting the original data of an electromagnetic vector-sensor array into a subspace of a lower dimension via a quaternion transformation matrix, is first proposed. To estimate the direction and polarization angles of sources, biquaternion beamspace multiple signal classification (BB-MUSIC) estimators are then formulated. The analytical results show that the biquaternion beamspaces offer some additional degrees of freedom to simultaneously achieve three goals. One is to save memory space for storing the data covariance matrix and to reduce the computational effort of the eigen-decomposition. Another is to decouple the estimation of the sources' polarization parameters from that of their direction angles. The other is to blindly whiten the coherent noise of the six constituent antennas in each vector-sensor. It is also shown that the existing biquaternion multiple signal classification (BQ-MUSIC) estimator is a specific case of our BB-MUSIC estimators. The simulation results verify the correctness and effectiveness of the analytical results.
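The biquaternion formulation itself is beyond a short sketch, but the MUSIC machinery it builds on is standard. Below is a minimal narrowband MUSIC direction-of-arrival sketch for a uniform linear array (not the BB-MUSIC estimator; the array geometry, noise level, and source angles are made up for illustration):

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(2)
M, d, n_src, snap = 8, 0.5, 2, 200           # sensors, spacing (wavelengths)
doas = np.deg2rad([-20.0, 35.0])             # true directions of arrival

def steer(theta):
    """Steering vectors of a uniform linear array, one column per angle."""
    return np.exp(2j*np.pi*d*np.arange(M)[:, None]*np.sin(theta))

A = steer(doas)
S = rng.normal(size=(n_src, snap)) + 1j*rng.normal(size=(n_src, snap))
noise = 0.1*(rng.normal(size=(M, snap)) + 1j*rng.normal(size=(M, snap)))
X = A @ S + noise

R = X @ X.conj().T / snap                    # sample covariance matrix
_, V = np.linalg.eigh(R)                     # eigenvalues in ascending order
En = V[:, :M - n_src]                        # noise subspace

grid = np.deg2rad(np.linspace(-90, 90, 721))
P = 1.0 / np.sum(np.abs(En.conj().T @ steer(grid))**2, axis=0)
pk, _ = find_peaks(P)
best = pk[np.argsort(P[pk])[-n_src:]]        # two strongest spectral peaks
print(np.rad2deg(np.sort(grid[best])))       # estimated DOAs (degrees)
```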
Open-source Framework for Storing and Manipulation of Plasma Chemical Reaction Data
NASA Astrophysics Data System (ADS)
Jenkins, T. G.; Averkin, S. N.; Cary, J. R.; Kruger, S. E.
2017-10-01
We present a new open-source framework for storage and manipulation of plasma chemical reaction data that has emerged from our in-house project MUNCHKIN. This framework consists of Python scripts and C++ programs. It stores data in an SQL database for fast retrieval and manipulation. For example, it is possible to fit cross-section data to the most widely used analytical expressions, calculate reaction rates for Maxwellian distribution functions of colliding particles, and fit them to different analytical expressions. Another important feature of this framework is the ability to calculate transport properties based on the cross-section data and supplied distribution functions. In addition, this framework allows the export of chemical reaction descriptions in LaTeX format for ease of inclusion in scientific papers. With the help of this framework it is possible to generate corresponding VSim (Particle-In-Cell simulation code) and USim (unstructured multi-fluid code) input blocks with appropriate cross-sections.
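For example, the rate-coefficient step mentioned above, averaging a cross-section over a Maxwellian distribution, can be sketched as follows; the threshold cross-section is hypothetical and the framework's own routines are not reproduced here:

```python
import numpy as np
from scipy.integrate import quad
from scipy.constants import e, k as kB, m_e

def rate_coefficient(sigma, T, m=m_e):
    """k(T) = sqrt(8/(pi*m)) * (kB*T)**-1.5 * integral sigma(E) E exp(-E/kBT) dE
    for a Maxwellian of temperature T (SI units, energy E in joules)."""
    pref = np.sqrt(8.0/(np.pi*m)) * (kB*T)**-1.5
    integrand = lambda E: sigma(E) * E * np.exp(-E/(kB*T))
    val, _ = quad(integrand, 0.0, 50*kB*T, limit=200)
    return pref * val

# Hypothetical threshold cross-section: 1e-20 m^2 above 10 eV, zero below
sigma = lambda E: 1e-20 if E > 10*e else 0.0
for T_eV in (1.0, 3.0, 10.0):
    print(T_eV, rate_coefficient(sigma, T_eV*e/kB))   # rate in m^3/s
```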
ERIC Educational Resources Information Center
Herber, Daniel R.; Deshmukh, Anand P.; Mitchell, Marlon E.; Allison, James T.
2016-01-01
This paper presents an effort to revitalize a large introductory engineering course for incoming freshman students that teaches them analytical design through a project-based curriculum. This course was completely transformed from a seminar-based to a project-based course that integrates hands-on experimentation with analytical work. The project…
SWMM5 Application Programming Interface and PySWMM: A ...
In support of the OpenWaterAnalytics open source initiative, the PySWMM project encompasses the development of a Python interfacing wrapper to SWMM5 with parallel ongoing development of the USEPA Stormwater Management Model (SWMM5) application programming interface (API). ... The purpose of this work is to increase the utility of the SWMM DLL by creating a Toolkit API for accessing its functionality. The utility of the Toolkit is further enhanced with a wrapper to allow access from the Python scripting language. This work is being pursued as part of an Open Source development strategy and is being performed by volunteer software developers.
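A minimal sketch of the kind of scripted interaction such a wrapper enables, assuming pyswmm's published interface; the input file and node name are hypothetical:

```python
from pyswmm import Simulation, Nodes  # pip install pyswmm

# "model.inp" is a hypothetical SWMM5 input file; node "J1" likewise.
with Simulation("model.inp") as sim:
    junction = Nodes(sim)["J1"]
    for step in sim:                      # advance the engine step by step
        if junction.depth > 1.5:          # interact with state mid-simulation
            print(sim.current_time, junction.depth)
```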
REAL-TIME MONITORING OF DIOXINS AND OTHER ...
This project is part of EPA's EMPACT program which was begun in 1998 and is jointly administered by EPA's Office of Research and Development, the National Center for Environmental Research and Quality Assurance (NCERQA), and the National Center for Environmental Assessment. The program was developed to provide the public with understandable and timely environmental information on various research initiatives and issues of importance. This particular project involves development of the application of an on-line, real-time, trace organic air toxic monitor, with special emphasis on dioxin-related compounds. Research efforts demonstrate the utility and usefulness of the Resonance Enhanced Multi-Photon Ionization (REMPI) analytical method for trace organics control, monitoring, and compliance assurance. Project objectives will be to develop the REMPI instrumental method into a tool that will be used for assessment of potential dioxin sources, control and prevention of dioxin formation in known sources, and communication of facility performance. This will be accomplished through instrument development, laboratory verification, thermokinetic modelling, equilibrium modelling, statistical determinations, field validation, program publication and presentation, regulatory office support, and development of data communication/presentation procedures. For additional information on this EMPACT project, visit the website at http://www.epa.gov/appcdwww/crb/empa
Post, Andrew R.; Kurc, Tahsin; Cholleti, Sharath; Gao, Jingjing; Lin, Xia; Bornstein, William; Cantrell, Dedra; Levine, David; Hohmann, Sam; Saltz, Joel H.
2013-01-01
Objective: To create an analytics platform for specifying and detecting clinical phenotypes and other derived variables in electronic health record (EHR) data for quality improvement investigations. Materials and Methods: We have developed an architecture for an Analytic Information Warehouse (AIW). It supports transforming data represented in different physical schemas into a common data model, specifying derived variables in terms of the common model to enable their reuse, computing derived variables while enforcing invariants and ensuring correctness and consistency of data transformations, long-term curation of derived data, and export of derived data into standard analysis tools. It includes software that implements these features and a computing environment that enables secure high-performance access to and processing of large datasets extracted from EHRs. Results: We have implemented and deployed the architecture in production locally. The software is available as open source. We have used it as part of hospital operations in a project to reduce rates of hospital readmission within 30 days. The project examined the association of over 100 derived variables representing disease and co-morbidity phenotypes with readmissions in five years of data from our institution's clinical data warehouse and the UHC Clinical Database (CDB). The CDB contains administrative data from over 200 hospitals that are in academic medical centers or affiliated with such centers. Discussion and Conclusion: A widely available platform for managing and detecting phenotypes in EHR data could accelerate the use of such data in quality improvement and comparative effectiveness studies. PMID:23402960
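As a toy illustration of a "derived variable" of the kind described, here is a pandas sketch that flags readmissions within 30 days from a hypothetical encounter table; the AIW's actual common data model and invariant checks are not reproduced:

```python
import pandas as pd

# Hypothetical encounter extract (the AIW schema is not reproduced here)
enc = pd.DataFrame({
    "patient_id": [1, 1, 2, 2],
    "admit":     pd.to_datetime(["2012-01-01", "2012-01-20",
                                 "2012-03-01", "2012-06-01"]),
    "discharge": pd.to_datetime(["2012-01-05", "2012-01-25",
                                 "2012-03-04", "2012-06-08"]),
})

enc = enc.sort_values(["patient_id", "admit"])
next_admit = enc.groupby("patient_id")["admit"].shift(-1)
enc["readmit_30d"] = (next_admit - enc["discharge"]).dt.days <= 30
print(enc)
```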
The IMI PROTECT project: purpose, organizational structure, and procedures.
Reynolds, Robert F; Kurz, Xavier; de Groot, Mark C H; Schlienger, Raymond G; Grimaldi-Bensouda, Lamiae; Tcherny-Lessenot, Stephanie; Klungel, Olaf H
2016-03-01
The Pharmacoepidemiological Research on Outcomes of Therapeutics by a European ConsorTium (PROTECT) initiative was a collaborative European project that sought to address limitations of current methods in the field of pharmacoepidemiology and pharmacovigilance. Initiated in 2009 and ending in 2015, PROTECT was part of the Innovative Medicines Initiative, a joint undertaking by the European Union and pharmaceutical industry. Thirty-five partners, including academics, regulators, small and medium enterprises, and European Federation of Pharmaceutical Industries and Associations companies, contributed to PROTECT. Two work packages within PROTECT implemented research examining the extent to which differences in study design, methodology, and choice of data source can contribute to producing discrepant results from observational studies on drug safety. To evaluate the effect of these differences, the project applied different designs and analytic methodologies to six drug-adverse event pairs across several electronic healthcare databases and registries. This paper introduces the organizational structure and procedures of PROTECT, including how drug-adverse event pairs and data sources were selected, how study design and analysis documents were developed, and how results were managed centrally. Copyright © 2016 John Wiley & Sons, Ltd.
Design Patterns to Achieve 300x Speedup for Oceanographic Analytics in the Cloud
NASA Astrophysics Data System (ADS)
Jacob, J. C.; Greguska, F. R., III; Huang, T.; Quach, N.; Wilson, B. D.
2017-12-01
We describe how we achieve super-linear speedup over standard approaches for oceanographic analytics on a cluster computer and the Amazon Web Services (AWS) cloud. NEXUS is an open source platform for big data analytics in the cloud that enables this performance through a combination of horizontally scalable data parallelism with Apache Spark and rapid data search, subset, and retrieval with tiled array storage in cloud-aware NoSQL databases like Solr and Cassandra. NEXUS is the engine behind several public portals at NASA, and OceanWorks is a newly funded project for the ocean community that will mature and extend this capability for improved data discovery, subset, quality screening, analysis, matchup of satellite and in situ measurements, and visualization. We review the Python language API for Spark and how to use it to quickly convert existing programs to run with cloud-scale parallelism, and discuss strategies to improve performance. We explain how partitioning the data over space, time, or both leads to algorithmic design patterns for Spark analytics that can be applied to many different algorithms. We use NEXUS analytics as examples, including area-averaged time series, time averaged map, and correlation map.
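For instance, an area-averaged time series, one of the design patterns named above, reduces to a keyed aggregation once the data are partitioned by time. A minimal PySpark sketch follows, with a hypothetical three-row dataset standing in for tiled satellite data; this is not NEXUS's own code:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("area-avg-sst").getOrCreate()

# Hypothetical observations: (time, lat, lon, sst); real schemas vary.
df = spark.createDataFrame(
    [("2017-01-01", 10.0, 140.0, 28.1), ("2017-01-01", 12.0, 141.0, 27.9),
     ("2017-01-02", 10.0, 140.0, 28.3)],
    ["time", "lat", "lon", "sst"])

# Area-averaged time series: cos(latitude) weights approximate cell area.
w = F.cos(F.radians(F.col("lat")))
result = (df.withColumn("w", w)
            .groupBy("time")
            .agg((F.sum(F.col("sst") * F.col("w")) / F.sum("w"))
                 .alias("sst_area_avg"))
            .orderBy("time"))
result.show()
```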
Big Data Analytics Methodology in the Financial Industry
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony
2017-01-01
Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…
ADVANCED TOOLS FOR ASSESSING SELECTED ...
The purpose of this poster is to present the application and assessment of advanced technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs (azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, and methylenedioxymethamphetamine). The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at concentrations less than parts per billion, ppb). IAG with USGS ends in FY05. APM 20 due in FY05. Subtask 2: Coordination of interagency research and public outreach activities for PPCPs. Participate on NSTC Health and Environment subcommittee working group on PPCPs. Web site maintenance and expansion, invited technica
PHARMACEUTICALS IN SOURCE WATER - OVERVIEW ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at concentrations less than parts per billion, ppb). IAG with USGS ends in FY05. APM 20 due in FY05. Subtask 2: Coordination of interagency research and public outreach activities for PPCPs. Participate on NSTC Health and Environment subcommittee working group on PPCPs. Web site maintenance and expansion, invited technical presentations, invited articles for peer-reviewed journals, interviews for media, responding to public inquiries. S
NASA Astrophysics Data System (ADS)
Hartnett, H. E.
2011-12-01
Many undergraduates express strong interests in research and in interdisciplinary sciences and yet, when it comes down to learning interdisciplinary material they are either unprepared for or overwhelmed by the complex interactions and relationships inherent in studying biogeochemical systems. My NSF-CAREER project "Transformation and transport of Organic Carbon in the Colorado River-Reservoir System" (EAR #0846188) combines field research with state-of-the-art analytical techniques to explore the source, fate and transport of terrestrial and riverine organic carbon in a heavily managed river system. In an effort to get undergraduates involved in research where they can really get their feet wet, I have been engaging undergraduates in a variety of field research projects that examine carbon biogeochemistry in the Colorado River watershed. The goal is to provide opportunities for students in Chemistry and in the Earth Sciences to directly experience the complexity of an environmental system, and to begin to ask manageable research questions that can be answered through field and lab work. These students are involved either as undergraduate research assistants, or as participants in my Field Geochemistry course which is offered through both the Dept. of Chemistry and the School of Earth and Space Exploration. There have been some unexpected challenges to getting these field-research projects started, but students are now successfully developing independent questions related to the larger scientific goals of the project and executing experimental and analytical research projects. To date, the PI has mentored 6 undergraduates and 2 graduate students as part of this project.
Multiple Theoretical Lenses as an Analytical Strategy in Researching Group Discussions
ERIC Educational Resources Information Center
Berge, Maria; Ingerman, Åke
2017-01-01
Background: In science education today, there is an emerging focus on what is happening in situ, making use of an array of analytical traditions. Common practice is to use one specific analytical framing within a research project, but there are projects that make use of multiple analytical framings to further the understanding of the same data,…
DEMONSTRATION OF THE ANALYTIC ELEMENT METHOD FOR WELLHEAD PROTECTION - PROJECT SUMMARY
A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a ground-water fl...
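The essence of the analytic element method is superposition of closed-form solutions. A minimal sketch, with illustrative parameters (not WhAEM defaults), superposing uniform regional flow and two pumping wells:

```python
import numpy as np

# Superposition of closed-form elements: uniform regional flow plus
# Thiem wells. All parameter values here are hypothetical.
T = 500.0                                   # transmissivity, m^2/d
qx = 0.002                                  # regional flow, m^2/d per m width
wells = [(0.0, 0.0, 400.0), (300.0, 50.0, 250.0)]   # (xw, yw, Q in m^3/d)
R = 1000.0                                  # reference radius, m

def head(x, y, h0=10.0):
    h = h0 - qx * x / T                     # uniform-flow contribution
    for xw, yw, Q in wells:
        r = np.hypot(x - xw, y - yw)
        h = h + Q/(2*np.pi*T)*np.log(r/R)   # each well lowers head nearby
    return h

xs, ys = np.meshgrid(np.linspace(-500, 800, 5), np.linspace(-400, 400, 5))
print(np.round(head(xs, ys), 3))            # head field from superposition
```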
Mission and Objectives for the X-1 Advanced Radiation Source
NASA Astrophysics Data System (ADS)
Rochau, Gary E.; Ramirez, Juan J.; Raglin, Paul S.
1998-11-01
The X-1 Advanced Radiation Source represents a next step in providing the U.S. Department of Energy's Stockpile Stewardship Program with the high-energy, large volume, laboratory x-ray source for the Radiation Effects Science and Simulation, Inertial Confinement Fusion, and Weapon Physics Programs. Advances in fast pulsed power technology and in z-pinch hohlraums on Sandia National Laboratories' Z Accelerator provide sufficient basis for pursuing the development of X-1. The X-1 plan follows a strategy based on scaling the 2 MJ x-ray output on Z via a 3-fold increase in z-pinch load current. The large volume (>5 cm3), high temperature (>150 eV), temporally long (>10 ns) hohlraums are unique outside of underground nuclear weapon testing. Analytical scaling arguments and hydrodynamic simulations indicate that these hohlraums at temperatures of 230-300 eV will ignite thermonuclear fuel and drive the reaction to a yield of 200 to 1,200 MJ in the laboratory. Non-ignition sources will provide cold x-ray environments (<15 keV) and high yield fusion burn sources will provide high fidelity warm x-ray environments (15 keV-80 keV). This paper will introduce the X-1 Advanced Radiation Source Facility Project, describe the project mission, objective, and preliminary schedule.
Isolation transformers for utility-interactive photovoltaic systems
NASA Astrophysics Data System (ADS)
Kern, E. C., Jr.
1982-12-01
Isolation transformers are used in some photovoltaic systems to isolate the photovoltaic system common mode voltage from the utility distribution system. In early system experiments with grid-connected photovoltaics, such transformers were the source of significant power losses. A project at Lincoln Laboratory and at Allied Chemical Corporation developed an improved isolation transformer to minimize such power losses. Experimental results and an analytical model of conventional and improved transformers are presented, showing considerable reductions in losses for the improved transformer.
Off Grid Photovoltaic Wastewater Treatment and Management Lagoons
NASA Technical Reports Server (NTRS)
LaPlace, Lucas A.; Moody, Bridget D.
2015-01-01
The SSC wastewater treatment system comprises key components that require a constant source of electrical power or diesel fuel to effectively treat the wastewater. In alignment with the President's new Executive Order 13693, Planning for Federal Sustainability in the Next Decade, this project aims to transform the wastewater treatment system into a zero-emissions operation by incorporating the advantages of an off-grid, photovoltaic system. Feasibility of implementation will be based on an analytical evaluation of electrical data, fuel consumption, and site observations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kazakevich, G.; Johnson, R.; Lebedev, V.
State of the art high-current superconducting accelerators require efficient RF sources with a fast dynamic phase and power control. This allows for compensation of the phase and amplitude deviations of the accelerating voltage in the Superconducting RF (SRF) cavities caused by microphonics, etc. Efficient magnetron transmitters with fast phase and power control are attractive RF sources for this application. They are more cost effective than traditional RF sources such as klystrons, IOTs and solid-state amplifiers used with large scale accelerator projects. However, unlike traditional RF sources, controlled magnetrons operate as forced oscillators. Study of the impact of the controlling signal on magnetron stability, noise and efficiency is therefore important. This paper discusses experiments with 2.45 GHz, 1 kW tubes and verifies our analytical model which is based on the charge drift approximation.
NASA Astrophysics Data System (ADS)
Hakoda, Christopher; Lissenden, Clifford; Rose, Joseph L.
2018-04-01
Dispersion curves are essential to any guided wave NDE project. The Semi-Analytical Finite Element (SAFE) method has significantly increased the ease by which these curves can be calculated. However, due to misconceptions regarding theory and fragmentation based on different finite-element software, the theory has stagnated, and adoption by researchers who are new to the field has been slow. This paper focuses on the relationship between the SAFE formulation and finite element theory, and the implementation of the SAFE method in a weak form for plates, pipes, layered waveguides/composites, curved waveguides, and arbitrary cross-sections is shown. The benefits of the weak form are briefly described, as is implementation in open-source and commercial finite element software.
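As a hedged illustration of the SAFE idea (discretize the waveguide cross-section, then solve an eigenproblem for wavenumbers at each frequency), here is a minimal Python sketch for the simplest case, SH modes of an isotropic plate on a 1D through-thickness mesh. It is a textbook reduction, not the paper's weak-form implementation, and the material values are generic:

```python
import numpy as np
from scipy.linalg import eigh

# SAFE sketch: SH modes of an isotropic plate, linear 1D elements.
rho, mu, h, n = 7800.0, 80e9, 0.01, 60       # steel-like, 10 mm, 60 elements
le = h / n
ke = np.array([[1.0, -1.0], [-1.0, 1.0]]) / le        # element stiffness
me = np.array([[2.0, 1.0], [1.0, 2.0]]) * le / 6.0    # element mass
K1 = np.zeros((n+1, n+1)); K2 = np.zeros((n+1, n+1)); Mm = np.zeros((n+1, n+1))
for el in range(n):
    s = slice(el, el+2)
    K1[s, s] += mu * ke                      # from the u'v' term
    K2[s, s] += mu * me                      # multiplies k^2
    Mm[s, s] += rho * me                     # multiplies omega^2

cs = np.sqrt(mu / rho)                       # shear wave speed
for f in (100e3, 500e3, 1e6):                # frequency, Hz
    w2 = (2*np.pi*f)**2
    k2 = eigh(w2*Mm - K1, K2, eigvals_only=True)      # eigenvalues are k^2
    k = np.sort(np.sqrt(k2[k2 > 0]))[::-1]            # propagating modes
    k_exact = [np.sqrt((2*np.pi*f/cs)**2 - (m*np.pi/h)**2)
               for m in range(10) if 2*np.pi*f/cs > m*np.pi/h]
    print(f, np.round(k[:3], 1), np.round(k_exact[:3], 1))
```

The printed SAFE wavenumbers can be checked against the closed-form SH dispersion relation, which is the point of the comparison column.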
33 CFR 385.33 - Revisions to models and analytical tools.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...
An Analysis of Rocket Propulsion Testing Costs
NASA Technical Reports Server (NTRS)
Ramirez-Pagan, Carmen P.; Rahman, Shamim A.
2009-01-01
The primary mission at NASA Stennis Space Center (SSC) is rocket propulsion testing. Such testing is generally performed within two arenas: (1) production testing for certification and acceptance, and (2) developmental testing for prototype or experimental purposes. The customer base consists of NASA programs, DOD programs, and commercial programs. Resources in place to perform on-site testing include both civil servants and contractor personnel, hardware and software including data acquisition and control, and 6 test stands with a total of 14 test positions/cells. For several business reasons there is a need to augment understanding of the test costs for all the various types of test campaigns. Historical propulsion test data were evaluated and analyzed in many different ways with the intent to find any correlation or statistics that could help produce more reliable and accurate cost estimates and projections. The analytical efforts included timeline trends, statistical curve fitting, average cost per test, cost per test second, test cost timeline, and test cost envelopes. Further, the analytical effort included examining the test cost from the perspective of thrust level and test article characteristics. Some of the analytical approaches did not produce evidence strong enough for further analysis. Other analytical approaches yielded promising results and are candidates for further development and focused study. Information was organized into three elements: a Project Profile, a Test Cost Timeline, and a Test Cost Envelope. The Project Profile is a snapshot of the project life cycle in timeline fashion, which includes various statistical analyses. The Test Cost Timeline shows the cumulative average test cost, for each project, at each month where there was test activity. The Test Cost Envelope shows a range of cost for a given number of tests. The supporting information upon which this study was performed came from diverse sources, and thus it was necessary to build several intermediate databases in order to understand, validate, and manipulate the data. These intermediate databases (validated historical accounts of schedule, test activity, and cost) are by themselves of great value and utility. For example, for the Project Profile, we were able to merge schedule, cost, and test activity. This kind of historical account conveys important information about the sequence of events, lead time, and opportunities for improvement in future propulsion test projects. The Product Requirement Document (PRD) file is a collection of data extracted from each project PRD (technical characteristics, test requirements, and projections of cost, schedule, and test activity). This information could help expedite the development of future PRDs (or equivalent documents) on similar projects and could also, when compared to the actual results, help improve projections of cost and schedule. Also, this file can be sorted by the parameter of interest to perform a visual review of potential common themes or trends. The process of searching, collecting, and validating propulsion test data encountered many difficulties, which led to a set of recommendations for improvement to facilitate future data gathering and analysis.
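As a small illustration of one of these products, the cumulative average test cost at each month with test activity can be computed from a ledger like the hypothetical one below (not SSC data):

```python
import pandas as pd

# Hypothetical monthly test-activity ledger for one project
ledger = pd.DataFrame({
    "month": pd.period_range("2007-01", periods=6, freq="M"),
    "tests": [2, 0, 3, 1, 0, 4],
    "cost":  [180e3, 0.0, 260e3, 90e3, 0.0, 350e3],
})
active = ledger[ledger["tests"] > 0].copy()
active["cum_avg_cost_per_test"] = (
    active["cost"].cumsum() / active["tests"].cumsum())
print(active[["month", "cum_avg_cost_per_test"]])
```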
Learning topography with Tangible Landscape games
NASA Astrophysics Data System (ADS)
Petrasova, A.; Tabrizian, P.; Harmon, B. A.; Petras, V.; Millar, G.; Mitasova, H.; Meentemeyer, R. K.
2017-12-01
Understanding topography and its representations is crucial for correct interpretation and modeling of surface processes. However, novice earth science and landscape architecture students often find reading topographic maps challenging. As a result, many students struggle to comprehend more complex spatial concepts and processes such as flow accumulation or sediment transport.We developed and tested a new method for teaching hydrology, geomorphology, and grading using Tangible Landscape—a tangible interface for geospatial modeling. Tangible Landscape couples a physical and digital model of a landscape through a real-time cycle of hands-on modeling, 3D scanning, geospatial computation, and projection. With Tangible Landscape students can sculpt a projection-augmented topographic model of a landscape with their hands and use a variety of tangible objects to immediately see how they are changing geospatial analytics such as contours, profiles, water flow, or landform types. By feeling and manipulating the shape of the topography, while seeing projected geospatial analytics, students can intuitively learn about 3D topographic form, its representations, and how topography controls physical processes. Tangible Landscape is powered by GRASS GIS, an open source geospatial platform with extensive libraries for geospatial modeling and analysis. As such, Tangible Landscape can be used to design a wide range of learning experiences across a large number of geoscience disciplines.As part of a graduate level course that teaches grading, 16 students participated in a series of workshops, which were developed as serious games to encourage learning through structured play. These serious games included 1) diverting rain water to a specified location with minimal changes to landscape, 2) building different combinations of landforms, and 3) reconstructing landscapes based on projected contour information with feedback.In this poster, we will introduce Tangible Landscape, and describe the games and their implementation. We will then present preliminary results of a user experience survey we conducted as part of the workshops. All developed materials and software are open source and available online.
Pilot testing of SHRP 2 reliability data and analytical products: Florida. [supporting datasets
DOT National Transportation Integrated Search
2014-01-01
SHRP 2 initiated the L38 project to pilot test products from five of the program's completed projects. The products support reliability estimation and use based on data analyses, analytical techniques, and a decision-making framework. The L38 project...
Waltenberger, Birgit; Halabalaki, Maria; Schwaiger, Stefan; Adamopoulos, Nicolas; Allouche, Noureddine; Fiebich, Bernd L; Hermans, Nina; Jansen-Dürr, Pidder; Kesternich, Victor; Pieters, Luc; Schönbichler, Stefan; Skaltsounis, Alexios-Leandros; Tran, Hung; Trougakos, Ioannis P; Viljoen, Alvaro; Wolfender, Jean-Luc; Wolfrum, Christian; Xynos, Nikos; Stuppner, Hermann
2018-05-06
There is a rapid increase in the percentage of elderly people in Europe. Consequently, the prevalence of age-related diseases will also significantly increase. Therefore, the main goal of MediHealth, an international research project, is to introduce a novel approach for the discovery of active agents of food plants from the Mediterranean diet and other global sources that promote healthy ageing. To achieve this goal, a series of plants from the Mediterranean diet and food plants from other origins are carefully selected and subjected to in silico, cell-based, in vivo (fly and mouse models), and metabolism analyses. Advanced analytical techniques complement the bio-evaluation process for the efficient isolation and identification of the bioactive plant constituents. Furthermore, pharmacological profiling of bioactive natural products, as well as the identification and synthesis of their metabolites, is carried out. Finally, optimization studies are performed in order to proceed to the development of innovative nutraceuticals, dietary supplements or herbal medicinal products. The project is based on an exchange of researchers between nine universities and four companies from European and non-European countries, exploiting the existing complementary multidisciplinary expertise. Herein, the unique and novel approach of this interdisciplinary project is presented.
Pattern-projected schlieren imaging method using a diffractive optics element
NASA Astrophysics Data System (ADS)
Min, Gihyeon; Lee, Byung-Tak; Kim, Nac Woo; Lee, Munseob
2018-04-01
We propose a novel schlieren imaging method based on projecting a random dot pattern generated in a light source module that includes a diffractive optical element. All apparatus is located on the source side, which enables one-body sensor applications. This pattern is distorted by the deflections caused by schlieren objects, so that the displacement vectors of the random dots can be obtained at each pixel using a particle image velocimetry (PIV) algorithm. The air turbulence induced by a burning candle, a boiling pot, a heater, and a gas torch was successfully imaged, and it was shown that imaging up to a size of 0.7 m × 0.57 m is possible. An algorithm to correct the non-uniform sensitivity according to the position of a schlieren object was analytically derived. This algorithm was applied to schlieren images of lenses. Comparing the corrected versions to the original schlieren images, we showed a sensitivity correction of 14.15 times on average.
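The displacement-estimation step is standard PIV: locate the peak of the cross-correlation between reference and distorted dot-pattern windows. A minimal sketch with synthetic windows (not the authors' data or code):

```python
import numpy as np
from scipy.signal import fftconvolve

def displacement(ref, cur):
    """Peak of the cross-correlation between two interrogation windows."""
    a = ref - ref.mean()
    b = cur - cur.mean()
    corr = fftconvolve(b, a[::-1, ::-1], mode="same")  # cross-correlation
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    cy, cx = np.array(corr.shape) // 2                 # zero-lag position
    return dy - cy, dx - cx

rng = np.random.default_rng(3)
ref = rng.random((64, 64))                   # random-dot window
cur = np.roll(ref, (3, -5), axis=(0, 1))     # dots deflected by the object
print(displacement(ref, cur))                # -> (3, -5)
```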
AMTD - Advanced Mirror Technology Development in Mechanical Stability
NASA Technical Reports Server (NTRS)
Knight, J. Brent
2015-01-01
Analytical tools and processes are being developed at NASA Marshall Space Flight Center in support of the Advanced Mirror Technology Development (AMTD) project. One facet of optical performance is mechanical stability with respect to structural dynamics. Pertinent parameters are: (1) the spacecraft structural design, (2) the mechanical disturbances on board the spacecraft (sources of vibratory/transient motion such as reaction wheels), (3) the vibration isolation systems (invariably required to meet future science needs), and (4) the dynamic characteristics of the optical system itself. With stability requirements of future large aperture space telescopes being in the low picometer regime, it is paramount that all sources of mechanical excitation be considered in both feasibility studies and detailed analyses. The primary objective of this paper is to lay out a path for performing feasibility studies of future large aperture space telescope projects that require extreme stability. To that end, a high-level overview of a structural dynamic analysis process to assess an integrated spacecraft and optical system is included.
Long, H. Keith; Daddow, Richard L.; Farrar, Jerry W.
1998-01-01
Since 1962, the U.S. Geological Survey (USGS) has operated the Standard Reference Sample Project to evaluate the performance of USGS, cooperator, and contractor analytical laboratories that analyze chemical constituents of environmental samples. The laboratories are evaluated by using performance evaluation samples, called Standard Reference Samples (SRSs). SRSs are submitted to laboratories semi-annually for round-robin laboratory performance comparison purposes. Currently, approximately 100 laboratories are evaluated for their analytical performance on six SRSs for inorganic and nutrient constituents. As part of the SRS Project, a surplus of homogeneous, stable SRSs is maintained for purchase by USGS offices and participating laboratories for use in continuing quality-assurance and quality-control activities. Statistical evaluation of the laboratories' results provides information to compare the analytical performance of the laboratories and to determine possible analytical deficiencies and problems. SRS results also provide information on the bias and variability of different analytical methods used in the SRS analyses.
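The USGS's exact grading scheme is not spelled out here; one common robust approach to round-robin comparison, a median/normalized-IQR z-score, can be sketched as follows with made-up results:

```python
import numpy as np

# Hypothetical round-robin results for one SRS constituent (mg/L)
results = np.array([10.2, 9.8, 10.1, 10.4, 9.9, 12.7, 10.0, 10.3])

median = np.median(results)
nIQR = 0.7413 * (np.percentile(results, 75) - np.percentile(results, 25))
z = (results - median) / nIQR            # robust z-scores (nIQR ~ sigma)
for lab, score in enumerate(z, start=1):
    flag = "satisfactory" if abs(score) <= 2 else "check performance"
    print(f"lab {lab}: z = {score:+.2f}  ({flag})")
```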
The Human Genome Project: big science transforms biology and medicine.
Hood, Leroy; Rowen, Lee
2013-01-01
The Human Genome Project has transformed biology through its integrated big science approach to deciphering a reference human genome sequence along with the complete sequences of key model organisms. The project exemplifies the power, necessity and success of large, integrated, cross-disciplinary efforts - so-called 'big science' - directed towards complex major objectives. In this article, we discuss the ways in which this ambitious endeavor led to the development of novel technologies and analytical tools, and how it brought the expertise of engineers, computer scientists and mathematicians together with biologists. It established an open approach to data sharing and open-source software, thereby making the data resulting from the project accessible to all. The genome sequences of microbes, plants and animals have revolutionized many fields of science, including microbiology, virology, infectious disease and plant biology. Moreover, deeper knowledge of human sequence variation has begun to alter the practice of medicine. The Human Genome Project has inspired subsequent large-scale data acquisition initiatives such as the International HapMap Project, 1000 Genomes, and The Cancer Genome Atlas, as well as the recently announced Human Brain Project and the emerging Human Proteome Project.
PHARMACEUTICALS IN THE ENVIRONMENT: SOURCES ...
An issue that began to receive more attention by environmental scientists in the late 1990s was the conveyance of pharmaceuticals in the environment by way of their use in human and veterinary medical practices and personal care. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at concentrations less than parts per billion, ppb). IAG with USGS ends in FY05. APM 20 due in FY05. Subtask 2: Coordination of interagency research and public outreach activities for PPCPs. Participate on NSTC Health and Environment subcommittee working group on PPCPs. Web site maintenance and expansion, invited technical presentations, invited articles for peer-reviewed journals, intervi
SOURCES & ORIGINS OF PPCPS: A COMPLEX ISSUE ...
There is no abstract for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at concentrations less than parts per billion, ppb). IAG with USGS ends in FY05. APM 20 due in FY05. Subtask 2: Coordination of interagency research and public outreach activities for PPCPs. Participate on NSTC Health and Environment subcommittee working group on PPCPs. Web site maintenance and expansion, invited technical presentations, invited articles for peer-reviewed journals, interviews for media, responding to public inquiries. Subtask 3: T
MEDICATION DISPOSAL AS A SOURCE FOR DRUGS AS ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at concentrations less than parts per billion, ppb). IAG with USGS ends in FY05. APM 20 due in FY05. Subtask 2: Coordination of interagency research and public outreach activities for PPCPs. Participate on NSTC Health and Environment subcommittee working group on PPCPs. Web site maintenance and expansion, invited technical presentations, invited articles for peer-reviewed journals, interviews for media, responding to public inquiries. S
ADDRESSING EMERGING ISSUES IN WATER QUALITY ...
Public concern over cleanliness and safety of source and recreational waters has prompted researchers to look for indicators of water quality. Giving public water authorities multiple tools to measure and monitor levels of chemical contaminants, as well as chemical markers of contamination, simply and rapidly would enhance public protection. The goals of water quality are outlined in the Water Quality Multi-year Plan [http://intranet.epa.gov/ospintra/Planning/wq.pdf] and the research in this task falls under GPRA Goal 2, 2.3.2, Long Term Goals 1, 2, and 4. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at concentrations less than parts per billion, ppb). IAG
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aziz, Azizan; Lasternas, Bertrand; Alschuler, Elena
The American Recovery and Reinvestment Act stimulus funding of 2009 for smart grid projects resulted in the tripling of smart meter deployment. In 2012, the Green Button initiative provided utility customers with access to their real-time energy usage. The availability of finely granular data provides an enormous potential for energy data analytics and energy benchmarking. The sheer volume of time-series utility data from a large number of buildings also poses challenges in data collection, quality control, and database management for rigorous and meaningful analyses. In this paper, we will describe a building portfolio-level data analytics tool for operational optimization, business investment and policy assessment using 15-minute to monthly interval utility data. The analytics tool is developed on top of the U.S. Department of Energy's Standard Energy Efficiency Data (SEED) platform, an open source software application that manages energy performance data of large groups of buildings. To support the significantly large volume of granular interval data, we integrated a parallel time-series database with the existing relational database. The time-series database improves on the current utility data input, focusing on real-time data collection, storage, analytics and data quality control. The fully integrated data platform supports APIs for utility app development by third-party software developers. These apps will provide actionable intelligence for building owners and facilities managers. Unlike a commercial system, this platform is an open source platform funded by the U.S. Government, accessible to the public, researchers and other developers, to support initiatives in reducing building energy consumption.
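As a small illustration of the kind of interval-data handling described (the SEED/time-series platform's own API is not reproduced here), rolling 15-minute readings up to monthly totals with basic quality checks might look like:

```python
import numpy as np
import pandas as pd

# Hypothetical 15-minute meter feed for one building, one month
idx = pd.date_range("2015-01-01", periods=96*31, freq="15min")
kwh = pd.Series(np.random.default_rng(4).gamma(2.0, 1.5, len(idx)), index=idx)

monthly = kwh.resample("MS").sum()            # roll up to monthly use
gaps = kwh.isna().sum()                       # basic quality-control checks
spikes = (kwh > kwh.mean() + 5*kwh.std()).sum()
print(monthly.head(), f"missing={gaps}, spikes={spikes}", sep="\n")
```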
Sensitive glow discharge ion source for aerosol and gas analysis
Reilly, Peter T. A. [Knoxville, TN]
2007-08-14
A high-sensitivity glow discharge ion source system for analyzing particles includes an aerodynamic lens having a plurality of constrictions for receiving an aerosol including at least one analyte particle in a carrier gas and focusing the analyte particles into a collimated particle beam. A separator separates the carrier gas from the analyte particle beam, wherein the analyte particle beam or vapors derived from the analyte particle beam are selectively transmitted out of the separator. A glow discharge ionization source includes a discharge chamber having an entrance orifice for receiving the analyte particle beam or analyte vapors, and a target electrode and discharge electrode therein. An electric field applied between the target electrode and discharge electrode generates an analyte ion stream from the analyte vapors, which is directed out of the discharge chamber through an exit orifice, such as to a mass spectrometer. High analyte sensitivity is obtained by pumping the discharge chamber exclusively through the exit orifice and the entrance orifice.
NASA Technical Reports Server (NTRS)
Pinelli, Thomas E.; Kennedy, John M.
1990-01-01
Descriptive and analytical data regarding the flow of aerospace-based scientific and technical information (STI) in the academic community are presented. An overview is provided of the Federal Aerospace Knowledge Diffusion Research Project, illustrating a five-year program on aerospace knowledge diffusion. Preliminary results are presented of the project's research concerning the information-seeking habits, practices, and attitudes of U.S. aerospace engineering and science students and faculty. The type and amount of education and training in the use of information sources are examined. The use and importance ascribed to various information products by U.S. aerospace faculty and students, including computer and other information technology, is assessed. An evaluation of NASA technical reports is presented; it is concluded that NASA technical reports are rated high in terms of quality and comprehensiveness, with Engineering Index and IAA the materials most frequently used by faculty and students.
Speciated Elemental and Isotopic Characterization of Atmospheric Aerosols - Recent Advances
NASA Astrophysics Data System (ADS)
Shafer, M.; Majestic, B.; Schauer, J.
2007-12-01
Detailed elemental, isotopic, and chemical speciation analysis of aerosol particulate matter (PM) can provide valuable information on PM sources, atmospheric processing, and climate forcing. Certain PM sources may best be resolved using trace metal signatures, and elemental and isotopic fingerprints can supplement and enhance molecular marker analysis of PM for source apportionment modeling. In the search for toxicologically relevant components of PM, health studies are increasingly demanding more comprehensive characterization schemes. It is also clear that total metal analysis is at best a poor surrogate for the bioavailable component, and analytical techniques that address the labile component or specific chemical species are needed. Recent sampling and analytical developments advanced by the project team have facilitated comprehensive characterization of even very small masses of atmospheric PM. Historically, this level of detail was rarely achieved due to limitations in analytical sensitivity and a lack of awareness concerning the potential for contamination. These advances have enabled the coupling of advanced chemical characterization to vital field sampling approaches that typically supply only very limited PM mass; e.g., (1) particle size-resolved sampling; (2) personal sampler collections; and (3) fine temporal scale sampling. The analytical tools that our research group is applying include: (1) sector field (high-resolution, HR) ICP-MS, (2) liquid waveguide long-path spectrophotometry (LWG-LPS), and (3) synchrotron x-ray absorption spectroscopy (sXAS). When coupled with an efficient and validated solubilization method, the HR-ICP-MS can provide quantitative elemental information on over 50 elements in microgram quantities of PM. The high mass resolution and enhanced signal-to-noise of HR-ICP-MS significantly advance data quality and quantity over that possible with traditional quadrupole ICP-MS. The LWG-LPS system enables an assessment of the soluble/labile components of PM, while simultaneously providing critical oxidation state speciation data. Importantly, the LWG-LPS can be deployed in a semi-real-time configuration to probe fine temporal scale variations in atmospheric processing or sources of PM. The sXAS is providing complementary oxidation state speciation of bulk PM. Using examples from our research, we will illustrate the capabilities and applications of these new methods.
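As a hedged aside on the analytical sensitivity the abstract emphasizes (this is the textbook 3-sigma detection-limit convention, not the authors' specific HR-ICP-MS procedure, and every number below is invented), the calculation that constrains trace-metal work on microgram PM masses might look like:

```python
# Hedged illustration of the standard 3-sigma detection limit, relevant
# to analyzing trace metals in microgram quantities of PM. The numbers
# are invented; this is the generic convention, not the authors' protocol.
import statistics

blank_counts = [102.0, 98.5, 101.2, 99.8, 100.4, 97.9, 103.1]  # replicate blanks (cps)
calibration_slope = 5.0e4  # cps per (ng/mL), from a standard curve

sd_blank = statistics.stdev(blank_counts)
detection_limit = 3 * sd_blank / calibration_slope  # ng/mL in solution

# For a 10 ug PM filter digest brought up in 10 mL, convert to ng per gram PM.
digest_volume_ml = 10.0
pm_mass_g = 10e-6
lod_ng_per_g = detection_limit * digest_volume_ml / pm_mass_g
print(f"solution LOD: {detection_limit:.3g} ng/mL; PM LOD: {lod_ng_per_g:.3g} ng/g")
```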
McAdoo, Mitchell A.; Kozar, Mark D.
2017-11-14
This report describes a compilation of existing water-quality data associated with groundwater resources originating from abandoned underground coal mines in West Virginia. Data were compiled from multiple sources for the purpose of understanding the suitability of groundwater from abandoned underground coal mines for public supply, industrial, agricultural, and other uses. This compilation includes data collected for multiple individual studies conducted from July 13, 1973 through September 7, 2016. Analytical methods varied by the time period of data collection and requirements of the independent studies. This project identified 770 water-quality samples from 294 sites that could be attributed to abandoned underground coal mine aquifers originating from multiple coal seams in West Virginia.
The WOMBAT Attack Attribution Method: Some Results
NASA Astrophysics Data System (ADS)
Dacier, Marc; Pham, Van-Hau; Thonnard, Olivier
In this paper, we present a new attack attribution method that has been developed within the WOMBAT project. We illustrate the method with some real-world results obtained when applying it to almost two years of attack traces collected by low-interaction honeypots. This analytical method aims at identifying large-scale attack phenomena composed of IP sources that are linked to the same root cause. All malicious sources involved in the same phenomenon constitute what we call a Misbehaving Cloud (MC). The paper offers an overview of the various steps the method goes through to identify these clouds, providing pointers to external references for more detailed information. Four instances of misbehaving clouds are then described in some more depth to demonstrate the meaningfulness of the concept.
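To make the root-cause grouping idea concrete (a toy sketch only; the actual WOMBAT method is multi-criteria and far more sophisticated, and the traces below are invented), sources that share a salient attack feature can be merged transitively with a union-find structure:

```python
# Minimal sketch of the grouping idea behind "misbehaving clouds": treat
# sources that share an attack feature (e.g., a targeted port sequence)
# as linked, and take the transitive closure. This is a toy union-find
# illustration, not the actual WOMBAT attribution method.
from collections import defaultdict

def find(parent, x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path compression
        x = parent[x]
    return x

def union(parent, a, b):
    ra, rb = find(parent, a), find(parent, b)
    if ra != rb:
        parent[rb] = ra

# Hypothetical traces: (source IP, feature suggesting a shared root cause).
traces = [
    ("10.0.0.1", "ports:445,139"), ("10.0.0.2", "ports:445,139"),
    ("10.0.0.2", "payload:xyz"),   ("10.0.0.3", "payload:xyz"),
    ("10.0.0.9", "ports:22"),
]
parent = {ip: ip for ip, _ in traces}
by_feature = defaultdict(list)
for ip, feat in traces:
    by_feature[feat].append(ip)
for ips in by_feature.values():
    for other in ips[1:]:
        union(parent, ips[0], other)

clouds = defaultdict(set)
for ip in parent:
    clouds[find(parent, ip)].add(ip)
print(list(clouds.values()))  # one 3-source cloud, one singleton
```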
ON-SITE SOLID PHASE EXTRACTION AND LABORATORY ...
Fragrance materials, such as synthetic musks in aqueous samples, are normally analyzed by GC/MS in the selected ion monitoring (SIM) mode to provide maximum sensitivity after liquid-liquid extraction of 1-L samples. A 1-L sample, however, usually provides too little analyte for full-scan data acquisition. An on-site extraction method for extracting synthetic musks from 60 L of wastewater effluent has been developed. Such a large sample volume permits high-quality, full-scan mass spectra to be obtained for various synthetic musk compounds. Quantification of these compounds was conveniently achieved from the full-scan data directly, without preparing SIM descriptors for each compound to acquire SIM data.
IN SITU SOLID-PHASE EXTRACTION AND ANALYSIS OF ...
Fragrance materials, such as synthetic musks in aqueous samples, are normally analyzed by GC/MS in the selected ion monitoring (SIM) mode to provide maximum sensitivity after liquid-liquid extraction of 1-L samples. A 1-L sample, however, usually provides too little analyte for full-scan data acquisition. We have developed an on-site extraction method for extracting synthetic musks from 60 L of wastewater effluent. Such a large sample volume permits high-quality, full-scan mass spectra to be obtained for various synthetic musk compounds. Quantification of these compounds was conveniently achieved from the full-scan data directly, without preparing SIM descriptors for each compound to acquire SIM data.
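The arithmetic motivating the 60-L collection is straightforward; the sketch below (all concentrations, volumes, and thresholds are hypothetical, not values from the study) shows how sample volume translates into on-column analyte mass:

```python
# Back-of-envelope arithmetic behind the large-volume extraction: a 60-L
# sample delivers ~60x the analyte mass of a conventional 1-L sample,
# which is what makes full-scan (rather than SIM) acquisition feasible.
# All numbers below are hypothetical illustrations.
concentration_ng_per_L = 50.0   # assumed musk level in effluent
extract_volume_uL = 500.0       # assumed final extract volume
injection_uL = 1.0

for sample_L in (1.0, 60.0):
    mass_ng = concentration_ng_per_L * sample_L
    on_column_ng = mass_ng * injection_uL / extract_volume_uL
    print(f"{sample_L:>4.0f} L sample -> {on_column_ng:.2f} ng on column")
# A full-scan method needing, say, ~1 ng on column is satisfied only by
# the 60-L extraction under these assumptions.
```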
ANALYTICAL CHEMISTRY RESEARCH NEEDS FOR ...
The consensus among environmental scientists and risk assessors is that the fate and effects of pharmaceutical and personal care products (PPCPs) in the environment are poorly understood. Many classes of PPCPs have yet to be investigated. Acquisition of trends data for a suite of PPCPs (representatives from each of numerous significant classes), shown to recur amongst municipal wastewater treatment plants across the country, may prove of key importance. The focus of this paper is an overview of some of the analytical methods being developed at the Environmental Protection Agency and their application to wastewater and surface water samples. Because PPCPs are generally micro-pollutants, emphasis is on development of enrichment and pre-concentration techniques using various means of solid-phase extraction.
ERIC Educational Resources Information Center
Williamson, Nicholas C.
2001-01-01
Describes Export Odyssey (EO), a structured, Internet-intensive, team-based undergraduate student project in international marketing. Presents an analytical review of articles in the literature that relate to three key teaching-learning dimensions of student projects (experiential versus non-experiential active learning, team-based versus…
Information Management Platform for Data Analytics and Aggregation (IMPALA) System Design Document
NASA Technical Reports Server (NTRS)
Carnell, Andrew; Akinyelu, Akinyele
2016-01-01
The System Design Document (SDD) tracks the design activities that are performed to guide the integration, installation, verification, and acceptance testing of the IMPALA Platform. The inputs to the design document are derived from the activities recorded in Tasks 1 through 6 of the Statement of Work (SOW), with the proposed technical solution being the completion of Phase 1-A. With the documentation of the architecture of the IMPALA Platform and the installation steps taken, the SDD will be a living document, capturing the details of capability enhancements and system improvements to the IMPALA Platform to support users in developing accurate and precise analytical models. The IMPALA Platform infrastructure team, data architecture team, system integration team, security management team, project manager, NASA data scientists, and users are the intended audience of this document. The IMPALA Platform is an assembly of commercial-off-the-shelf (COTS) products installed on an Apache Hadoop platform. User interface details for the COTS products will be sourced from the COTS tool vendor documentation. The SDD is a focused explanation of the inputs, design steps, and projected outcomes of every design activity for the IMPALA Platform through installation and validation.
Development of Wien filter for small ion gun of surface analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bahng, Jungbae; Busan Center, Korea Basic Science Institute, Busan 609-735; Hong, Jonggi
The gas cluster ion beam (GCIB) and liquid metal ion beam have been studied in the context of ion beam usage in analytical equipment for applications such as X-ray photoelectron spectroscopy and secondary ion mass spectrometry (SIMS). In particular, small ion sources are used for secondary ion generation and ion etching. To set the context for this study, the SIMS project was launched to develop ion-gun-based analytical equipment for the Korea Basic Science Institute. The objective of the first stage of the project is the generation of argon beams with a GCIB system [A. Kirkpatrick, Nucl. Instrum. Methods Phys. Res., Sect. B 206, 830–837 (2003)] that consists of a nozzle, skimmer, ionizer, acceleration tube, separation system, transport system, and target. The Wien filter directs the selected cluster beam to the target system by exploiting the velocity differences of the particles generated by the GCIB. In this paper, we present the theoretical modeling and three-dimensional electromagnetic analysis of the Wien filter, which can separate Ar2500+ clusters from Ar2400+ to Ar2600+ clusters with a 1-mm collimator.
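The selection principle can be sketched from first principles (the field and voltage values below are illustrative assumptions, not the KBSI design parameters): a Wien filter passes particles whose velocity equals E/B, and singly charged clusters leaving a fixed acceleration potential travel at a velocity proportional to 1/sqrt(n):

```python
# Hedged sketch of the Wien-filter selection principle: crossed E and B
# fields pass only particles with v = E/B, and singly charged Ar_n+
# clusters leaving a fixed acceleration potential have v ~ 1/sqrt(n).
# Field and voltage values are invented, not the actual design values.
import math

E_CHARGE = 1.602e-19          # C
M_AR = 39.948 * 1.6605e-27    # kg, one argon atom

def cluster_velocity(n, accel_volts):
    """Velocity of a singly charged n-atom Ar cluster after acceleration."""
    return math.sqrt(2 * E_CHARGE * accel_volts / (n * M_AR))

accel_volts = 20e3   # assumed acceleration potential
B = 0.2              # T, assumed magnetic field
n_target = 2500
E = B * cluster_velocity(n_target, accel_volts)  # tune E to pass n = 2500

for n in (2400, 2500, 2600):
    v = cluster_velocity(n, accel_volts)
    # The fractional velocity mismatch sets the deflection; a downstream
    # collimator rejects clusters whose mismatch is too large.
    mismatch = (v - E / B) / (E / B)
    print(f"n={n}: v={v:7.1f} m/s, mismatch={mismatch:+.2%}")
```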
Glassmeyer, Susan T.; Furlong, Edward T.; Kolpin, Dana W.; Batt, Angela L.; Benson, Robert; Boone, J. Scott; Conerly, Octavia D.; Donohue, Maura J.; King, Dawn N.; Kostich, Mitchell S.; Mash, Heath E.; Pfaller, Stacy; Schenck, Kathleen M.; Simmons, Jane Ellen; Varughese, Eunice A.; Vesper, Stephen J.; Villegas, Eric N.; Wilson, Vickie S.
2017-01-01
When chemical or microbial contaminants are assessed for potential effect or possible regulation in ambient and drinking waters, a critical first step is determining if the contaminants occur and if they are at concentrations that may cause human or ecological health concerns. To this end, source and treated drinking water samples from 29 drinking water treatment plants (DWTPs) were analyzed as part of a two-phase study to determine whether chemical and microbial constituents, many of which are considered contaminants of emerging concern, were detectable in the waters. Of the 84 chemicals monitored in the 9 Phase I DWTPs, 27 were detected at least once in the source water, and 21 were detected at least once in treated drinking water. In Phase II, which was a broader and more comprehensive assessment, 247 chemical and microbial analytes were measured in 25 DWTPs, with 148 detected at least once in the source water, and 121 detected at least once in the treated drinking water. The frequency of detection was often related to the analyte's contaminant class, as pharmaceuticals and anthropogenic waste indicators tended to be infrequently detected and more easily removed during treatment, while per- and polyfluoroalkyl substances and inorganic constituents were both more frequently detected and, overall, more resistant to treatment. The data collected as part of this project will be used to help inform evaluation of unregulated contaminants in surface water, groundwater, and drinking water.
Glassmeyer, Susan T; Furlong, Edward T; Kolpin, Dana W; Batt, Angela L; Benson, Robert; Boone, J Scott; Conerly, Octavia; Donohue, Maura J; King, Dawn N; Kostich, Mitchell S; Mash, Heath E; Pfaller, Stacy L; Schenck, Kathleen M; Simmons, Jane Ellen; Varughese, Eunice A; Vesper, Stephen J; Villegas, Eric N; Wilson, Vickie S
2017-03-01
When chemical or microbial contaminants are assessed for potential effect or possible regulation in ambient and drinking waters, a critical first step is determining if the contaminants occur and if they are at concentrations that may cause human or ecological health concerns. To this end, source and treated drinking water samples from 29 drinking water treatment plants (DWTPs) were analyzed as part of a two-phase study to determine whether chemical and microbial constituents, many of which are considered contaminants of emerging concern, were detectable in the waters. Of the 84 chemicals monitored in the 9 Phase I DWTPs, 27 were detected at least once in the source water, and 21 were detected at least once in treated drinking water. In Phase II, which was a broader and more comprehensive assessment, 247 chemical and microbial analytes were measured in 25 DWTPs, with 148 detected at least once in the source water, and 121 detected at least once in the treated drinking water. The frequency of detection was often related to the analyte's contaminant class, as pharmaceuticals and anthropogenic waste indicators tended to be infrequently detected and more easily removed during treatment, while per- and polyfluoroalkyl substances and inorganic constituents were both more frequently detected and, overall, more resistant to treatment. The data collected as part of this project will be used to help inform evaluation of unregulated contaminants in surface water, groundwater, and drinking water. Published by Elsevier B.V.
ERIC Educational Resources Information Center
Nic Giolla Mhichíl, Mairéad; van Engen, Jeroen; Ó Ciardúbháin, Colm; Ó Cléircín, Gearóid; Appel, Christine
2014-01-01
This paper sets out to construct and present the evolving conceptual framework of the SpeakApps projects to consider the application of learning analytics to facilitate synchronous and asynchronous oral language skills within this CALL context. Drawing from both the CALL and wider theoretical and empirical literature of learner analytics, the…
Risk management of drinking water relies on quality analytical data. Analytical methodology can often be adapted from environmental monitoring sources. However, risk management sometimes presents special analytical challenges because data may be needed from a source for which n...
NASA Astrophysics Data System (ADS)
Falta, R. W.
2004-05-01
Analytical solutions are developed that relate changes in the contaminant mass in a source area to the behavior of biologically reactive dissolved contaminant groundwater plumes. Based on data from field experiments, laboratory experiments, numerical streamtube models, and numerical multiphase flow models, the chemical discharge from a source region is assumed to be a nonlinear power function of the fraction of contaminant mass removed from the source zone. This function can approximately represent source zone mass discharge behavior over a wide range of site conditions, from simple homogeneous systems to complex heterogeneous systems. A mass balance on the source zone with advective transport and first-order decay leads to a nonlinear differential equation that is solved analytically to provide a prediction of the time-dependent contaminant mass discharge leaving the source zone. The solution for source zone mass discharge is coupled semi-analytically with a modified version of the Domenico (1987) analytical solution for three-dimensional reactive advective and dispersive transport in groundwater. The semi-analytical model then employs the BIOCHLOR (Aziz et al., 2000; Sun et al., 1999) transformations to model sequential first-order parent-daughter biological decay reactions of chlorinated ethenes and ethanes in the groundwater plume. The resulting semi-analytic model thus allows for transient simulation of complex source zone behavior that is fully coupled to a dissolved contaminant plume undergoing sequential biological reactions. Analyses of several realistic scenarios show that substantial changes in the groundwater plume can result from the partial removal of contaminant mass from the source zone. These results, however, are sensitive to the nature of the source mass reduction-source discharge reduction curve, and to the rates of degradation of the primary contaminant and its daughter products in the groundwater plume. References: Aziz, C.E., C.J. Newell, J.R. Gonzales, P. Haas, T.P. Clement, and Y. Sun, 2000, BIOCHLOR Natural Attenuation Decision Support System User's Manual Version 1.0, US EPA Report EPA/600/R-00/008. Domenico, P.A., 1987, An analytical model for multidimensional transport of a decaying contaminant species, J. Hydrol., 91: 49-58. Sun, Y., J.N. Petersen, T.P. Clement, and R.S. Skeen, 1999, A new analytical solution for multi-species transport equations with serial and parallel reactions, Water Resour. Res., 35(1): 185-190.
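A minimal sketch of one widely used form of such a power-function source model follows, assuming discharge scales with the remaining mass fraction and neglecting the first-order source decay term of the full model; the source parameters are hypothetical:

```python
# Sketch of a power-function source-depletion model of the kind the
# abstract describes: discharge J is tied to the remaining mass M by
#   J(t)/J0 = (M(t)/M0) ** gamma,
# and the mass balance dM/dt = -J(t) (the source decay term of the full
# model is neglected here) integrates in closed form. Values are invented.
import math

def source_mass(t, M0, J0, gamma):
    """Remaining source mass at time t (closed-form solution)."""
    if gamma == 1.0:
        return M0 * math.exp(-J0 * t / M0)
    base = 1.0 - (1.0 - gamma) * (J0 / M0) * t
    return M0 * max(base, 0.0) ** (1.0 / (1.0 - gamma))

def source_discharge(t, M0, J0, gamma):
    """Mass discharge leaving the source zone at time t."""
    return J0 * (source_mass(t, M0, J0, gamma) / M0) ** gamma

M0, J0 = 1000.0, 20.0  # kg of source mass, kg/yr initial discharge (invented)
for gamma in (0.5, 1.0, 2.0):
    masses = [round(source_mass(t, M0, J0, gamma)) for t in (0, 10, 25, 50)]
    print(f"gamma={gamma}: mass at t=0,10,25,50 yr -> {masses}")
# gamma > 1 yields a long, low-discharge tail, while gamma < 1 depletes
# abruptly, which is why plume response is sensitive to this curve.
```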
Forecasting Significant Societal Events Using The Embers Streaming Predictive Analytics System
Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren
2014-01-01
Developed under the Intelligence Advanced Research Projects Activity Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big data analytics system for forecasting significant societal events, such as civil unrest events, on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events in the past 2 years. PMID:25553271
Forecasting Significant Societal Events Using The Embers Streaming Predictive Analytics System.
Doyle, Andy; Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren
2014-12-01
Developed under the Intelligence Advanced Research Projects Activity Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big data analytics system for forecasting significant societal events, such as civil unrest events, on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events in the past 2 years.
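For readers unfamiliar with the messaging pattern the abstract names, a minimal pyzmq sketch of ZeroMQ pub/sub with JSON as the wire format follows; this is purely illustrative wiring, not EMBERS source code, and the port and message fields are invented:

```python
# Minimal pyzmq sketch of the pattern the abstract describes: loosely
# coupled components exchanging JSON messages over ZeroMQ pub/sub.
import json
import threading
import time
import zmq

ctx = zmq.Context.instance()

def publisher():
    pub = ctx.socket(zmq.PUB)
    pub.bind("tcp://127.0.0.1:5556")
    time.sleep(0.2)  # let the subscriber connect (slow-joiner mitigation)
    alert = {"type": "civil_unrest", "country": "XX", "confidence": 0.72}
    pub.send_string(json.dumps(alert))  # JSON as the wire format
    pub.close()

sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:5556")
sub.setsockopt_string(zmq.SUBSCRIBE, "")  # subscribe to everything

threading.Thread(target=publisher).start()
message = json.loads(sub.recv_string())
print("received prediction:", message)
sub.close()
```

The shared-nothing aspect comes from each component holding only its own state and communicating exclusively through such sockets, which is what lets the pipeline scale out horizontally.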
An analytic approach to optimize tidal turbine fields
NASA Astrophysics Data System (ADS)
Pelz, P.; Metzler, M.
2013-12-01
Motivated by global warming due to CO2 emissions, various technologies for harvesting energy from renewable sources are being developed. Hydrokinetic turbines are applied to surface watercourses or tidal flows to generate electrical energy. Since the available power for hydrokinetic turbines is proportional to the projected cross-section area, fields of turbines are installed to scale shaft power. Each hydrokinetic turbine of a field can be considered as a disk actuator. In [1], the first author derives the optimal operation point for hydropower in an open channel. The present paper concerns a 0-dimensional model of a disk actuator in an open-channel flow with bypass, as a special case of [1]. Based on the energy equation, the continuity equation, and the momentum balance, an analytical approach is made to calculate the coefficient of performance for hydrokinetic turbines with bypass flow as a function of the turbine head and the ratio of turbine width to channel width.
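For orientation, the unbounded-flow baseline that such channel models generalize is the classic actuator-disk result; the paper's own bypass and free-surface corrections are not reproduced here:

```latex
% Free-stream actuator-disk (Betz) baseline. With axial induction
% factor a, the power coefficient referred to the projected disk
% area A is
\[
  C_P = \frac{P}{\tfrac{1}{2}\rho u_\infty^{3} A} = 4a(1-a)^{2},
\]
% which is maximized at a = 1/3, giving C_{P,\max} = 16/27. Channel
% confinement with bypass flow, the subject of the paper, modifies
% this bound through the blockage ratio and the turbine head.
```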
Collaborative decision-making on wind power projects based on AHP method
NASA Astrophysics Data System (ADS)
Badea, A.; Proştean, G.; Tămăşilă, M.; Vârtosu, A.
2017-01-01
The complexity of implementing projects in Renewable Energy Sources (RES) requires finding collaborative alliances between suppliers and project developers in RES. The linked supply-chain activities in RES, namely transportation of heavy components, processing orders to purchase quality raw materials, storage and materials handling, packaging, and other complex activities, require a collaborative logistics system that is permanently and properly dimensioned, selected, and monitored. The stringent requirements of wind power project implementation inevitably involve constraints in infrastructure, implementation, and logistics. Thus, following extensive research on RES projects, alternative collaborations were identified to eliminate these constraints and provide feasible solutions at different levels of performance. The paper presents a critical analysis of different collaboration alternatives in the supply chain for RES projects, selecting the ones most suitable for particular situations by using the decision-making method Analytic Hierarchy Process (AHP). The role of the AHP method was to formulate a decision model by which the choice of collaboration alternative can be established through mathematical calculation, reducing the impact created by the constraints encountered. The solution provided through AHP offers a framework for detecting the optimal collaboration alternative between suppliers and project developers in RES and avoids breaks in the chain by resizing safety buffers for leveling orders in RES projects.
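The core AHP calculation the paper relies on can be sketched numerically; the 3x3 pairwise judgments below are invented for illustration and are not the paper's data:

```python
# Numeric sketch of the core AHP step: derive priority weights from a
# pairwise-comparison matrix via its principal eigenvector, then check
# consistency. The judgments (three hypothetical collaboration
# alternatives, Saaty 1-9 scale) are invented.
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()  # normalized priority vector

# Consistency ratio: CI = (lambda_max - n)/(n - 1), CR = CI / RI,
# with RI = 0.58 the random index for n = 3; CR < 0.1 is acceptable.
n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.58
print("weights:", np.round(weights, 3), " CR:", round(CR, 3))
```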
Klassen, Tara L.; von Rüden, Eva-Lotta; Drabek, Janice; Noebels, Jeffrey L.; Goldman, Alica M.
2013-01-01
Genetic testing and research have increased the demand for high-quality DNA that has traditionally been obtained by venipuncture. However, venous blood collection may prove difficult in special populations and when large-scale specimen collection or exchange is prerequisite for international collaborative investigations. Guthrie/FTA card–based blood spots, buccal scrapes, and fingernail clippings are DNA-containing specimens that are uniquely accessible and thus attractive as alternative tissue sources (ATS). The literature details a variety of protocols for extraction of nucleic acids from a singular ATS type, but their utility has not been systematically analyzed in comparison with conventional sources such as venous blood. Additionally, the efficacy of each protocol is often equated with the overall nucleic acid yield but not with the analytical performance of the DNA during mutation detection. Together with a critical in-depth literature review of published extraction methods, we developed and evaluated an all-inclusive approach for serial, systematic, and direct comparison of DNA utility from multiple biological samples. Our results point to the often underappreciated value of these alternative tissue sources and highlight ways to maximize the ATS-derived DNA for optimal quantity, quality, and utility as a function of extraction method. Our comparative analysis clarifies the value of ATS in genomic analysis projects for population-based screening, diagnostics, molecular autopsy, medico-legal investigations, or multi-organ surveys of suspected mosaicisms. PMID:22796560
Nationwide reconnaissance of contaminants of emerging ...
When chemical or microbial contaminants are assessed for potential effect or possible regulation in ambient and drinking waters, a critical first step is determining if the contaminants occur and if they are at concentrations that may cause human or ecological health concerns. To this end, source and treated drinking water samples from 29 drinking water treatment plants (DWTPs) were analyzed as part of a two-phase study to determine whether chemical and microbial constituents, many of which are considered contaminants of emerging concern, were detectable in the waters. Of the 84 chemicals monitored in the 9 Phase I DWTPs, 27 were detected at least once in the source water, and 21 were detected at least once in treated drinking water. In Phase II, which was a broader and more comprehensive assessment, 247 chemical and microbial analytes were measured in 25 DWTPs, with 148 detected at least once in the source water, and 121 detected at least once in the treated drinking water. The frequency of detection was often related to the analyte's contaminant class, as pharmaceuticals and anthropogenic waste indicators tended to be infrequently detected and more easily removed during treatment, while per- and polyfluoroalkyl substances and inorganic constituents were both more frequently detected and, overall, more resistant to treatment. The data collected as part of this project will be used to help inform evaluation of unregulated contaminants in surface water, groundwate
Badiola, Katrina A.; Bird, Colin; Brocklesby, William S.; Casson, John; Chapman, Richard T.; Coles, Simon J.; Cronshaw, James R.; Fisher, Adam; Gloria, Danmar; Grossel, Martin C.; Hibbert, D. Brynn; Knight, Nicola; Mapp, Lucy K.; Marazzi, Luke; Matthews, Brian; Milsted, Andy; Minns, Russell S.; Mueller, Karl T.; Murphy, Kelly; Parkinson, Tim; Quinnell, Rosanne; Robinson, John S.; Robertson, Murray N.; Robins, Michael; Springate, Emma; Tizzard, Graham; Todd, Matthew H.; Williamson, Alice E.; Willoughby, Cerys; Yang, Erica; Ylioja, Paul M.
2015-01-01
Electronic Laboratory Notebooks (ELNs) are progressively replacing traditional paper books in both commercial research establishments and academic institutions. University researchers require specific features from ELNs, given the need to promote cross-institutional collaborative working, to enable the sharing of procedures and results, and to facilitate publication. The LabTrove ELN, which we use as our exemplar, was designed to be researcher-centric (i.e., aimed at the individual researcher's basic needs rather than at a specific institutional, subject, or disciplinary agenda, and able to be tailored because it is open source). LabTrove is being used in a heterogeneous set of academic laboratories, for a range of purposes, including analytical chemistry, X-ray studies, drug discovery, and a biomaterials project. Researchers use the ELN for recording experiments, preserving data collected, and for project coordination. This perspective article describes the experiences of those researchers from several viewpoints, demonstrating how a web-based open source electronic notebook can meet the diverse needs of academic researchers. PMID:29308130
ABSTRACT PRESENTATION--PHARMACEUTICALS AS ...
Pharmaceuticals comprise a large and diverse array of contaminants that can occur in the environment from the combined activities and actions of multitudes of individuals as well as from veterinary and agricultural use.
ION COMPOSITION ELUCIDATION (ICE) FOR ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
PHARMACEUTICALS AND PERSONAL CARE PRODUCTS ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
PHARMACEUTICAL AND PERSONAL CARE PRODUCTS IN ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
INTRODUCTION TO PHARMACEUTICALS AND PERSONAL ...
There is no abstract for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
OVERVIEW OF PHARMACEUTICALS AND PERSONAL ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
ORIGINS AND RAMIFICATIONS OF PHARMACEUTICALS & ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
PHARMACEUTICAL AND PERSONAL CARE PRODUCTS ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
PHARMACEUTICALS AND PERSONAL CARE PRODUCTS ...
There is no abstract for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
ORIGINS AND RAMIFICATIONS OF PHARMACEUTICALS ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
PHARMACEUTICALS AS ENVIRONMENTAL ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
MERCURY MEASUREMENTS USING DIRECT-ANALYZER ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
SLIDE PRESENTATION--PHARMACEUTICALS AS ...
While pharmaceuticals are ubiquitous trace contaminants in the environment, the types, concentrations, and relative abundances of individual residues will vary depending on the geographic locale and time of year, primarily a reflection of differing and varying prescribing and consumption practices.
SYNTHETIC FRAGRANCES IN THE ENVIRONMENT ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
ENVIRONMENTAL STEWARDSHIP OF PHARMACEUTICALS ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
INVESTIGATING ENVIRONMENTAL SINKS OF MACROLIDE ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
LEVELS OF SYNTHETIC MUSK COMPOUNDS IN ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under Contact field.
POTENTIAL CONCERNS/EFFECTS ON HUMAN AND ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
MEETING TODAY'S EMERGING CONTAMINANTS WITH ...
This presentation will explore the many facets of research and development for emerging contaminants within the USEPA's National Exposure Research Laboratories (Athens, Cincinnati, Las Vegas, and Research Triangle Park).
NON-REGULATED CONTAMINANTS: EMERGING ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
DETECTION OF ILLICIT DRUGS IN MUNICIPAL ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
PRESCRIBING FOR THE ENVIRONMENT ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
MONITORING SYNTHETIC MUSK COMPOUNDS IN ...
Synthetic musk compounds are manufactured as fragrance materials for consumer products and are consumed in very large quantities worldwide.
A NEW HIGH RESOLUTION MASS SPECTROMETRY ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
CORRELATION OF CHEMICAL MARKERS - NITRATE AND ...
Giving public water authorities another tool to monitor and measure levels of human waste contamination of waters simply and rapidly would enhance public protection.
MERCURY MEASUREMENTS USING DIRECT-ANALYZER ...
Under EPA's Water Quality Research Program, exposure studies are needed to determine how well control strategies and guidance are working. Consequently, reliable and convenient techniques that minimize waste production are of special interest. While traditional methods for determining mercury in solid samples involve the use of aggressive chemicals to dissolve the matrix and the use of other chemicals to properly reduce the mercury to the volatile elemental form, pyrolysis-based analyzers can be used by directly weighing the solid in a sampling boat and initiating the instrumental analysis for total mercury.
SPECIATION AND DETECTION OF ORGANOTINS FROM ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
NEW APPROACHES FOR TRACE ANALYSIS OF ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
ANTIBIOTICS IN THE ENVIRONMENT: LESS RECOGNIZED ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
A model for quantifying construction waste in projects according to the European waste list.
Llatas, C
2011-06-01
The new EU challenge is to recover 70% by weight of C&D waste by 2020. The literature reveals that one major barrier is the lack of data. Therefore, this paper presents a model which allows technicians to estimate C&D waste during the design stage in order to promote prevention and recovery. The types and quantities of construction waste (CW) are estimated and managed according to EU guidelines, by building elements and specifically for each project. The model allows detection of the source of the waste and the adoption of alternative procedures which eliminate hazardous waste and reduce CW. Likewise, it develops a systematic structure of the construction process, a waste classification system and analytical expressions which are based on factors. These factors depend on technology and represent a standard on site, which would allow a database of waste to be developed anywhere. A Spanish case study is covered. Factors were obtained by studying over 20 dwellings. The source and types of packaging waste, remains, soil and hazardous waste were estimated in detail and were compared with other studies. Results reveal that the model can be implemented in projects and that the chances of reducing and recovering C&D waste could be increased, well above the EU challenge. Copyright © 2011 Elsevier Ltd. All rights reserved.
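In factor-based models of this kind, the waste expected from each building element is, in general, the design-stage quantity of that element multiplied by an empirical waste factor. A minimal sketch of that arithmetic, with hypothetical element names and factor values (the paper's actual factors come from the studied dwellings):

    # Hedged sketch of a factor-based construction-waste estimate.
    # Quantities come from a design-stage bill of quantities; the waste
    # factors below are illustrative placeholders, not the paper's values.
    quantities = {"concrete_structure_m3": 120.0, "brick_wall_m2": 850.0, "packaging_units": 400.0}
    waste_factors = {"concrete_structure_m3": 0.025, "brick_wall_m2": 0.05, "packaging_units": 0.01}

    waste = {element: qty * waste_factors[element] for element, qty in quantities.items()}
    print(waste)                 # waste per building element
    print(sum(waste.values()))   # total estimated waste for the project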
Huppertz, Laura M; Kneisel, Stefan; Auwärter, Volker; Kempf, Jürgen
2014-02-01
Considering the vast variety of synthetic cannabinoids and herbal mixtures - commonly known as 'Spice' or 'K2' - on the market and the resulting increase of severe intoxications related to their consumption, there is a need in clinical and forensic toxicology for comprehensive, up-to-date screening methods. This project focused on developing and implementing an automated screening procedure for the detection of synthetic cannabinoids in serum using a liquid chromatography-ion trap-MS (LC-MS(n)) system and a spectral library-based approach, currently including 46 synthetic cannabinoids and 8 isotope-labelled analogues. In the process of method development, a high-temperature ESI source (IonBooster(TM), Bruker Daltonik) and its effects on the ionization efficiency of the investigated synthetic cannabinoids were evaluated and compared to a conventional ESI source. Despite their structural diversity, all investigated synthetic cannabinoids benefitted from high-temperature ionization by showing remarkably higher MS intensities compared to conventional ESI. The employed search algorithm matches retention time, MS and MS(2)/MS(3) spectra. With the utilization of the IonBooster source, limits for the automated detection comparable to cut-off values of routine MRM methods were achieved for the majority of analytes. Even compounds not identified when using a conventional ESI source were detected using the IonBooster source. LODs in serum range from 0.1 ng/ml to 0.5 ng/ml. The use of parent compounds as analytical targets offers the possibility of instantly adding new emerging compounds to the library and immediately applying the updated method to serum samples, allowing the rapid adaptation of the screening method to ongoing forensic or clinical requirements. The presented approach can also be applied to other specimens, such as oral fluid or hair, and herbal mixtures, and was successfully applied to authentic serum samples. Quantitative MRM results of samples with analyte concentrations above the determined LOD were confirmed as positive findings by the presented method. Copyright © 2014 John Wiley & Sons, Ltd.
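A library search of this kind typically declares a hit when the retention time falls within a tolerance and the acquired spectrum resembles the library spectrum. A minimal sketch under those assumptions (the tolerance, scoring function, and data layout are illustrative, not the vendor's algorithm):

    import math

    def cosine_similarity(spec_a, spec_b):
        """Cosine similarity between two spectra given as {m/z: intensity} dicts."""
        mzs = set(spec_a) | set(spec_b)
        dot = sum(spec_a.get(m, 0.0) * spec_b.get(m, 0.0) for m in mzs)
        na = math.sqrt(sum(v * v for v in spec_a.values()))
        nb = math.sqrt(sum(v * v for v in spec_b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def library_hit(rt_obs, spec_obs, entry, rt_tol=0.2, min_score=0.8):
        """Flag a hit when both retention time and MS2 spectrum match the library entry."""
        return (abs(rt_obs - entry["rt"]) <= rt_tol
                and cosine_similarity(spec_obs, entry["ms2"]) >= min_score)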
NASA Astrophysics Data System (ADS)
Chen, Jui-Sheng; Liu, Chen-Wuing; Liang, Ching-Ping; Lai, Keng-Hsin
2012-08-01
Multi-species advective-dispersive transport equations sequentially coupled with first-order decay reactions are widely used to describe the transport and fate of decay chain contaminants such as radionuclides, chlorinated solvents, and nitrogen. Although researchers have attempted to present various types of methods for analytically solving this transport equation system, the currently available solutions are mostly limited to an infinite or a semi-infinite domain. A generalized analytical solution for the coupled multi-species transport problem in a finite domain associated with an arbitrary time-dependent source boundary is not available in the published literature. In this study, we first derive generalized analytical solutions for this transport problem in a finite domain involving an arbitrary number of species subject to an arbitrary time-dependent source boundary. Subsequently, we adopt these derived generalized analytical solutions to obtain explicit analytical solutions for a special-case transport scenario involving an exponentially decaying Bateman-type time-dependent source boundary. We test the derived special-case solutions against the previously published coupled 4-species transport solution and the corresponding numerical solution with coupled 10-species transport to conduct the solution verification. Finally, we compare the new analytical solutions derived for a finite domain against the published analytical solutions derived for a semi-infinite domain to illustrate the effect of the exit boundary condition on coupled multi-species transport with an exponentially decaying source boundary. The results show noticeable discrepancies between the breakthrough curves of all the species in the immediate vicinity of the exit boundary obtained from the analytical solutions for a finite domain and a semi-infinite domain for the dispersion-dominated condition.
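The "Bateman-type" source referred to here follows the classic Bateman equations for a first-order decay chain. As a point of reference, a sketch of the Bateman solution for species n in a well-mixed system (assuming distinct decay constants and only the first species present initially); the paper's full advective-dispersive solutions reduce to this kind of expression at the source boundary:

    import math

    def bateman_nth(decay_consts, n1_initial, t):
        """Bateman solution N_n(t) for a linear first-order decay chain.

        decay_consts: distinct decay constants [lam_1, ..., lam_n]
        n1_initial:   initial amount of species 1 (all daughters start at zero)
        """
        lams = decay_consts
        coeff = n1_initial
        for lam in lams[:-1]:
            coeff *= lam
        total = 0.0
        for i in range(len(lams)):
            denom = 1.0
            for j in range(len(lams)):
                if j != i:
                    denom *= lams[j] - lams[i]
            total += math.exp(-lams[i] * t) / denom
        return coeff * total

    # Example: amount of the third member of a three-species chain at t = 10.
    print(bateman_nth([0.5, 0.2, 0.05], 100.0, 10.0))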
Learning Dilemmas in Undergraduate Student Independent Essays
ERIC Educational Resources Information Center
Wendt, Maria; Åse, Cecilia
2015-01-01
Essay-writing is generally viewed as the primary learning activity to foster independence and analytical thinking. In this article, we show that independent research projects do not necessarily lead to critical thinking. University-level education on conducting independent projects can, in several respects, counteract enhanced analytical skills.…
A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data
Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...
2016-01-01
Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft2 for over 50 building types at the national and sub-national level with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
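As a toy illustration of the kind of Bayesian fusion described here (a one-parameter conjugate-normal sketch, not the actual PDT model): an expert prior on occupancy for one building type is updated with survey observations, and the posterior retains uncertainty from both sources.

    import math

    def update_normal(prior_mean, prior_sd, obs, obs_sd):
        """Conjugate normal update: fuse an expert prior with survey observations."""
        prec = 1.0 / prior_sd**2 + len(obs) / obs_sd**2
        mean = (prior_mean / prior_sd**2 + sum(obs) / obs_sd**2) / prec
        return mean, math.sqrt(1.0 / prec)

    # Hypothetical numbers: expert judgment of ~2.5 people/1000 ft2 for offices
    # (sd 1.0), updated with five survey values measured to sd 0.5.
    print(update_normal(2.5, 1.0, [2.1, 2.8, 3.0, 2.4, 2.6], 0.5))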
Over the last 10 years the EPA has invested in analytic elements as a computational method used in public domain software supporting capture zone delineation for source water assessments and wellhead protection. The current release is called WhAEM2000 (wellhead analytic element ...
Raskob, Wolfgang; Schneider, Thierry; Gering, Florian; Charron, Sylvie; Zhelezniak, Mark; Andronopoulos, Spyros; Heriard-Dubreuil, Gilles; Camps, Johan
2015-04-01
The PREPARE project, which started in February 2013 and will end at the beginning of 2016, aims to close gaps that have been identified in nuclear and radiological preparedness in Europe following the first evaluation of the Fukushima disaster. Among others, the project will address the review of existing operational procedures for dealing with long-lasting releases and cross-border problems in radiation monitoring and food safety, and will further develop missing functionalities in decision support systems (DSS), ranging from improved source-term estimation and dispersion modelling to the inclusion of hydrological pathways for European water bodies. In addition, a so-called Analytical Platform will be developed, exploring the scientific and operational means to improve information collection, information exchange and the evaluation of such types of disasters. The tools developed within the project will be partly integrated into the two DSS ARGOS and RODOS. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
VISAGE Visualization for Integrated Satellite, Airborne and Ground-Based Data Exploration
NASA Technical Reports Server (NTRS)
Conover, Helen; Berendes, Todd; Naeger, Aaron; Maskey, Manil; Gatlin, Patrick; Wingo, Stephanie; Kulkarni, Ajinkya; Gupta, Shivangi; Nagaraj, Sriraksha; Wolff, David;
2017-01-01
The primary goal of the VISAGE project is to facilitate more efficient Earth Science investigations via a tool that can provide visualization and analytic capabilities for diverse coincident datasets. This proof-of-concept project will be centered around the GPM Ground Validation program, which provides a valuable source of intensive, coincident observations of atmospheric phenomena. The data are from a wide variety of ground-based, airborne and satellite instruments, with a wide diversity in spatial and temporal scales, variables, and formats, which makes these data difficult to use together. VISAGE will focus on "golden cases" where most ground instruments were in operation and multiple research aircraft sampled a significant weather event, ideally while the GPM Core Observatory passed overhead. The resulting tools will support physical process studies as well as satellite and model validation.
Crawford, C L; Hill, H H
2013-03-30
Radioactive nickel-63 ((63)Ni) ionization is the most common and widely used ion source for ion mobility spectrometry (IMS). Regulatory, financial, and operational concerns with this source have promoted recent development of non-radioactive sources, such as corona discharge ionization (CD), for stand-alone IMS systems. However, there has been no comparison in the literature of the negative ion species produced by all three sources considered here. This study compares the negative reactant and analyte ions produced by three sources on an ion mobility-mass spectrometer: conventional (63)Ni, CD, and secondary electrospray ionization (SESI). Results showed that (63)Ni and SESI produced the same reactant ion species, while CD produced only the nitrate monomer and dimer ions. The analyte ions produced by each ion source were the same, except that the CD source produced a different ion species for the explosive RDX than either the (63)Ni or SESI source. Accurate and reproducible reduced mobility (K0) values, including several values reported here for the first time, were found for each explosive with each ion source. Overall, the SESI source most closely reproduced the reactant ion species and analyte ion species profiles of (63)Ni, and may serve as a non-radioactive, robust, and flexible alternative to (63)Ni. Copyright © 2013 Elsevier B.V. All rights reserved.
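Reduced mobility values like those reported here are, in general, obtained by normalizing the measured drift-tube mobility to 273.15 K and 760 Torr. A minimal sketch of that standard calculation (the geometry and operating values are illustrative):

    def reduced_mobility(drift_len_cm, drift_volt, drift_time_s, pressure_torr, temp_k):
        """K0 in cm2 V-1 s-1: measured mobility K = L^2/(V*t), normalized to STP."""
        k = drift_len_cm**2 / (drift_volt * drift_time_s)
        return k * (pressure_torr / 760.0) * (273.15 / temp_k)

    # Illustrative numbers: 7 cm drift tube, 5 kV, 8 ms drift time, ambient conditions.
    print(reduced_mobility(7.0, 5000.0, 8e-3, 700.0, 298.0))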
Compact blackbody calibration sources for in-flight calibration of spaceborne infrared instruments
NASA Astrophysics Data System (ADS)
Scheiding, S.; Driescher, H.; Walter, I.; Hanbuch, K.; Paul, M.; Hartmann, M.; Scheiding, M.
2017-11-01
High-emissivity blackbodies are mandatory as calibration sources in infrared radiometers. Besides the requirements on high spectral emissivity and low reflectance, constraints regarding energy consumption, installation space and mass must be considered during instrument design. Cavity radiators provide an outstanding spectral emissivity at the price of the installation space and mass of the calibration source. Surface radiation sources are mainly limited by the spectral emissivity of the functional coating and the homogeneity of the temperature distribution. The effective emissivity of a "black" surface can be optimized by structuring the substrate with the aim of enlarging the ratio of the surface to its projection. Based on the experience of the Mercury Radiometer and Thermal Infrared Spectrometer (MERTIS) calibration source MBB3, the results of the surface structuring on the effective emissivity are described analytically and compared to the experimental performance. Different geometries are analyzed and the production methods are discussed. The high-emissivity temperature calibration source features emissivity values of 0.99 for wavelengths from 5 μm to 10 μm and larger than 0.95 for the spectral range from 10 μm to 40 μm.
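A common first-order way to see why structuring raises effective emissivity: each additional reflection inside a groove absorbs another fraction of the incident radiation, so a coating of emissivity eps approaches 1 - (1 - eps)^N after N surface interactions. A hedged sketch of that approximation (eps and N are illustrative; the paper's analysis is geometry-specific):

    def effective_emissivity(eps_coating, n_reflections):
        """First-order multiple-reflection estimate for a structured surface."""
        return 1.0 - (1.0 - eps_coating)**n_reflections

    for n in (1, 2, 3):
        print(n, effective_emissivity(0.9, n))  # 0.9 -> 0.99 -> 0.999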
Badal, Sunil P; Michalak, Shawn D; Chan, George C-Y; You, Yi; Shelley, Jacob T
2016-04-05
Plasma-based ambient desorption/ionization sources are versatile in that they enable direct ionization of gaseous samples as well as desorption/ionization of analytes from liquid and solid samples. However, ionization matrix effects, caused by competitive ionization processes, can worsen sensitivity or even inhibit detection altogether. The present study is focused on expanding the analytical capabilities of the flowing atmospheric-pressure afterglow (FAPA) source by exploring additional types of ionization chemistry. Specifically, it was found that the abundance and type of reagent ions produced by the FAPA source and, thus, the corresponding ionization pathways of analytes, can be altered by changing the source working conditions. High abundance of proton-transfer reagent ions was observed with relatively high gas flow rates and low discharge currents. Conversely, charge-transfer reagent species were most abundant at low gas flows and high discharge currents. A rather nonpolar model analyte, biphenyl, was found to significantly change its ionization pathway based on source operating parameters. Different analyte ions (e.g., MH(+) via proton transfer and M(+.) via charge transfer) were formed under distinct operating parameters, demonstrating two different operating regimes. These tunable ionization modes of the FAPA were used to enable or enhance detection of analytes which traditionally exhibit low sensitivity in plasma-based ADI-MS analyses. In one example, 2,2'-dichloroquaterphenyl, a compound difficult or impossible to detect with proton-transfer FAPA or direct analysis in real time (DART), was detected under charge-transfer FAPA conditions. Overall, this unique mode of operation increases the number and range of detectable analytes and has the potential to lessen ionization matrix effects in ADI-MS analyses.
NASA Technical Reports Server (NTRS)
Chaudhary, Aashish; Votava, Petr; Nemani, Ramakrishna R.; Michaelis, Andrew; Kotfila, Chris
2016-01-01
We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging of HPC and cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both HPC and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines of both the production process and the data products, and enable sharing results with the community. Our project is developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics or visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, where we are developing a new QA pipeline for the 25PB system.
Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Votava, P.; Nemani, R. R.; Michaelis, A.; Kotfila, C.
2016-12-01
We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging of HPC and cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both HPC and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines of both the production process and the data products, and enable sharing results with the community. Our project is developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics or visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, where we are developing a new QA pipeline for the 25PB system.
NASA Astrophysics Data System (ADS)
Ávila-Carrera, R.; Sánchez-Sesma, F. J.; Spurlin, James H.; Valle-Molina, C.; Rodríguez-Castellanos, A.
2014-09-01
An analytic formulation to understand the scattering, diffraction and attenuation of elastic waves in the neighborhood of fluid-filled wells is presented. An important, and not widely exploited, technique to carefully investigate wave propagation in exploration wells is the logging of sonic waveforms. Fundamental decisions and production planning in petroleum reservoirs are made by interpretation of such recordings. Nowadays, geophysicists and engineers face problems related to acquisition and interpretation under the complex conditions associated with conducting open-hole measurements. A crucial problem that directly affects the response of sonic logs is the eccentricity of the measuring tool with respect to the center of the borehole. Even with the employment of centralizers, this simple variation dramatically changes the physical conditions of wave propagation around the well. Recent works in the numerical field report advanced studies in modeling and simulation of acoustic wave propagation around wells, including complex heterogeneities and anisotropy. However, no analytical efforts have been made to formally understand wireline sonic logging measurements acquired with borehole-eccentered tools. In this paper, Graf's addition theorem was used to describe monopole sources in terms of solutions of the wave equation. The formulation was developed from the three-dimensional discrete wave-number method in the frequency domain. The cylindrical Bessel functions of the third kind and order zero were re-derived to obtain a simplified set of equations projected into a bi-dimensional plane-space for displacements and stresses. This new and condensed analytic formulation allows the straightforward calculation of all converted modes and their visualization in the time domain via Fourier synthesis. The main aim was to obtain spectral surfaces of transfer functions and synthetic seismograms that might be useful to understand the wave motion produced by the eccentricity of the source and to explain in detail the newly arising borehole propagation modes. Finally, time histories and amplitude spectra for relevant examples are presented and the validation of time traces using the spectral element method is reported.
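For reference, the form of Graf's addition theorem typically used to re-expand a source displaced from the borehole axis (here for the Hankel function of order zero, valid for r > d, with phi the angle between the two position vectors; this is the textbook identity, not the paper's full derivation):

    H_0^{(1)}\!\left(k\,\lvert \mathbf{r}-\mathbf{d}\rvert\right)
      \;=\; \sum_{n=-\infty}^{\infty} H_n^{(1)}(k r)\, J_n(k d)\, e^{\,i n \phi},
      \qquad r > d .

The displaced monopole field is thereby written as a sum of modes centered on the borehole axis, which is what allows the eccentered-tool problem to be handled with the same cylindrical machinery as the centered case.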
Social media for intelligence: practical examples of analysis for understanding
NASA Astrophysics Data System (ADS)
Juhlin, Jonas A.; Richardson, John
2016-05-01
Social media has become a dominating feature of modern life. Platforms like Facebook, Twitter, and Google have users all over the world. People from all walks of life use social media. For the intelligence services, social media is an element that cannot be ignored. It holds an immense amount of information, and its potential as a source of useful intelligence is enormous. Social media has been around long enough that most intelligence services recognize that it needs some form of attention. However, for the intelligence collector and analyst, several aspects must be uncovered in order to fully exploit social media for intelligence purposes. This paper will present Project Avatar, an experiment in obtaining effective intelligence from social media sources, and several emerging analytic techniques to expand the intelligence gathered from these sources.
Let's Talk Learning Analytics: A Framework for Implementation in Relation to Student Retention
ERIC Educational Resources Information Center
West, Deborah; Heath, David; Huijser, Henk
2016-01-01
This paper presents a dialogical tool for the advancement of learning analytics implementation for student retention in Higher Education institutions. The framework was developed as an outcome of a project commissioned and funded by the Australian Government's "Office for Learning and Teaching". The project took a mixed-method approach…
Web Analytics Reveal User Behavior: TTU Libraries' Experience with Google Analytics
ERIC Educational Resources Information Center
Barba, Ian; Cassidy, Ryan; De Leon, Esther; Williams, B. Justin
2013-01-01
Proper planning and assessment surveys of projects for academic library Web sites will not always be predictive of real world use, no matter how many responses they might receive. In this case, multiple-phase development, librarian focus groups, and patron surveys performed before implementation of such a project inaccurately overrated utility and…
Proactive Supply Chain Performance Management with Predictive Analytics
Stefanovic, Nenad
2014-01-01
Today's business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on a specialized metamodel which allows modelling of any supply chain configuration at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment. PMID:25386605
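As a generic illustration of the KPI-prediction step (synthetic data and a stock scikit-learn regressor; the paper's actual metamodel, features, and mining algorithms are its own):

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(0)
    # Hypothetical features: lead time, inventory level, forecast error, capacity use.
    X = rng.normal(size=(500, 4))
    # Hypothetical KPI (e.g., order fill rate) driven by those features plus noise.
    y = 0.5 * X[:, 0] - 0.3 * X[:, 1] + 0.2 * X[:, 2] ** 2 + rng.normal(0.0, 0.1, 500)

    model = GradientBoostingRegressor().fit(X[:400], y[:400])
    print("held-out R^2:", model.score(X[400:], y[400:]))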
ExaSAT: An exascale co-design tool for performance modeling
Unat, Didem; Chan, Cy; Zhang, Weiqun; ...
2015-02-09
One of the emerging challenges in designing HPC systems is understanding and projecting the requirements of exascale applications. In order to determine the performance consequences of different hardware designs, analytic models are essential because they can provide fast feedback to the co-design centers and chip designers without costly simulations. However, current attempts to analytically model program performance typically rely on the user manually specifying a performance model. Here we introduce the ExaSAT framework, which automates the extraction of parameterized performance models directly from source code using compiler analysis. The parameterized analytic model enables quantitative evaluation of a broad range of hardware design trade-offs and software optimizations on a variety of different performance metrics, with a primary focus on data movement as a metric. Finally, we demonstrate the ExaSAT framework's ability to perform deep code analysis of a proxy application from the Department of Energy Combustion Co-design Center to illustrate its value to the exascale co-design process. ExaSAT analysis provides insights into the hardware and software trade-offs and lays the groundwork for exploring a more targeted set of design points using cycle-accurate architectural simulators.
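The flavor of such data-movement-centric analytic models can be seen in a roofline-style bound (a deliberately simple sketch, not ExaSAT's extracted model): execution time is limited by whichever of compute throughput and memory traffic is slower.

    def roofline_time(flops, bytes_moved, peak_flops_per_s, bytes_per_s):
        """Lower-bound execution time from compute and bandwidth ceilings."""
        return max(flops / peak_flops_per_s, bytes_moved / bytes_per_s)

    # Illustrative kernel: 1e12 flops and 4e11 bytes of traffic on a machine
    # with 1e13 flop/s peak and 1e12 B/s memory bandwidth.
    print(roofline_time(1e12, 4e11, 1e13, 1e12))  # bandwidth-bound: 0.4 s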
Proactive supply chain performance management with predictive analytics.
Stefanovic, Nenad
2014-01-01
Today's business climate requires supply chains to be proactive rather than reactive, which demands a new approach that incorporates data mining predictive analytics. This paper introduces a predictive supply chain performance management model which combines process modelling, performance measurement, data mining models, and web portal technologies into a unique model. It presents the supply chain modelling approach based on a specialized metamodel which allows modelling of any supply chain configuration at different levels of detail. The paper also presents the supply chain semantic business intelligence (BI) model which encapsulates data sources and business rules and includes the data warehouse model with specific supply chain dimensions, measures, and KPIs (key performance indicators). Next, the paper describes two generic approaches for designing the KPI predictive data mining models based on the BI semantic model. KPI predictive models were trained and tested with a real-world data set. Finally, a specialized analytical web portal which offers collaborative performance monitoring and decision making is presented. The results show that these models give very accurate KPI projections and provide valuable insights into newly emerging trends, opportunities, and problems. This should lead to more intelligent, predictive, and responsive supply chains capable of adapting to the future business environment.
2010-04-01
analytical community. 5.1 Towards a Common Understanding of CD&E and CD&E Project Management. Recent developments within NATO have contributed to the ... project management purposes it is useful to distinguish four phases [P 21]: a) Preparation, Initiation and Structuring; b) Concept Development Planning ... examined in more detail below. While the NATO CD&E policy provides a benchmark for a comprehensive, disciplined management of CD&E projects, it may
Gamma-ray momentum reconstruction from Compton electron trajectories by filtered back-projection
Haefner, A.; Gunter, D.; Plimley, B.; ...
2014-11-03
Gamma-ray imaging utilizing Compton scattering has traditionally relied on measuring coincident gamma-ray interactions to map directional information of the source distribution. This coincidence requirement makes it an inherently inefficient process. We present an approach to gamma-ray reconstruction from Compton scattering that requires only a single electron-tracking detector, thus removing the coincidence requirement. From the Compton-scattered electron momentum distribution, our algorithm analytically computes the incident photon's correlated direction and energy distributions. Because this method maps the source energy and location, it is useful in applications where prior information about the source distribution is unknown. We demonstrate this method with electron tracks measured in a scientific Si charge-coupled device. While this method was demonstrated with electron tracks in a Si-based detector, it is applicable to any detector that can measure electron direction and energy, or equivalently the electron momentum. For example, it can increase the sensitivity to obtain energy and direction in gas-based systems that suffer from limited efficiency.
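For orientation, these are the standard Compton kinematics that any such single-electron reconstruction must invert (textbook relations, not the paper's filtered back-projection itself). The scattered electron carries kinetic energy T_e, with

    E' \;=\; \frac{E}{1 + \dfrac{E}{m_e c^2}\,(1 - \cos\theta)},
    \qquad T_e \;=\; E - E' ,

so a measurement of the electron's energy and direction constrains the incident photon to a cone whose opening angle follows from these relations; back-projecting those constraints from the measured electron-momentum distribution recovers the correlated source direction and energy.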
ERIC Educational Resources Information Center
Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.
2016-01-01
This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…
Krakow conference on low emissions sources: Proceedings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierce, B.L.; Butcher, T.A.
1995-12-31
The Krakow Conference on Low Emission Sources presented the information produced and analytical tools developed in the first phase of the Krakow Clean Fossil Fuels and Energy Efficiency Program. This phase included: field testing to provide quantitative data on emissions and efficiencies as well as on opportunities for building energy conservation; engineering analysis to determine the costs of implementing pollution control; and incentives analysis to identify actions required to create a market for equipment, fuels, and services needed to reduce pollution. Collectively, these Proceedings contain reports that summarize the above phase-one information, present the status of energy system management in Krakow, provide information on financing pollution control projects in Krakow and elsewhere, and highlight the capabilities and technologies of Polish and American companies that are working to reduce pollution from low emission sources. It is intended that the US reader will find in these Proceedings useful results and plans for control of pollution from low emission sources that are representative of heating systems in central and Eastern Europe. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.
NASA Astrophysics Data System (ADS)
Nielsen, Roger L.; Ustunisik, Gokce; Weinsteiger, Allison B.; Tepley, Frank J.; Johnston, A. Dana; Kent, Adam J. R.
2017-09-01
Quantitative models of petrologic processes require accurate partition coefficients. Our ability to obtain accurate partition coefficients is constrained by their dependence on pressure, temperature and composition, and on the experimental and analytical techniques we apply. The source and magnitude of error in experimental studies of trace element partitioning may go unrecognized if one examines only the processed published data. The most important sources of error are relict crystals and analyses of more than one phase in the analytical volume. Because we have typically published averaged data, identification of compromised data is difficult if not impossible. We addressed this problem by examining unprocessed data from plagioclase/melt partitioning experiments, comparing models based on those data with existing partitioning models, and evaluating the degree to which the partitioning models are dependent on the calibration data. We found that partitioning models are dependent on the calibration data in ways that result in erroneous model values, and that the error will be systematic and dependent on the value of the partition coefficient. In effect, use of different calibration datasets will result in partitioning models whose results are systematically biased, and one can arrive at different and conflicting conclusions depending on how a model is calibrated, defeating the purpose of applying the models. Ultimately this is an experimental data problem, which can be solved if we publish individual analyses (not averages) or use a projection method wherein an independent compositional constraint is used to identify and estimate the uncontaminated composition of each phase.
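A toy numerical illustration of the averaging problem described here (all numbers hypothetical): if some "crystal" analyses include a fraction of melt in the analytical volume, the apparent crystal/melt partition coefficient computed from averaged analyses is biased high relative to the true value, and only individual analyses allow the contamination to be recognized and screened out.

    import numpy as np

    rng = np.random.default_rng(1)
    d_true, c_melt = 0.05, 100.0          # hypothetical partition coefficient and melt concentration
    c_crys = d_true * c_melt              # uncontaminated crystal concentration
    f_melt = rng.uniform(0.0, 0.3, 20)    # melt fraction accidentally sampled in each analysis
    c_meas = (1.0 - f_melt) * c_crys + f_melt * c_melt

    print("true D:", d_true)
    print("apparent D from averaged analyses:", c_meas.mean() / c_melt)   # biased high
    print("least-contaminated single analysis:", c_meas.min() / c_melt)   # approaches true D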
NASA Astrophysics Data System (ADS)
Appel, Marius; Lahn, Florian; Buytaert, Wouter; Pebesma, Edzer
2018-04-01
Earth observation (EO) datasets are commonly provided as collections of scenes, where individual scenes represent a temporal snapshot and cover a particular region on the Earth's surface. Using these data in complex spatiotemporal modeling becomes difficult as soon as data volumes exceed a certain capacity or analyses include many scenes, which may spatially overlap and may have been recorded at different dates. In order to facilitate analytics on large EO datasets, we combine and extend the Geospatial Data Abstraction Library (GDAL) and the array-based data management and analytics system SciDB. We present an approach to automatically convert collections of scenes to multidimensional arrays and use SciDB to scale computationally intensive analytics. We evaluate the approach in three case studies on national-scale land use change monitoring with Landsat imagery, global empirical orthogonal function analysis of daily precipitation, and combining historical climate model projections with satellite-based observations. Results indicate that the approach can be used to represent various EO datasets and that analyses in SciDB scale well with available computational resources. To simplify analyses of higher-dimensional datasets such as climate model output, however, a generalization of the GDAL data model might be needed. All parts of this work have been implemented as open-source software and we discuss how this may facilitate open and reproducible EO analyses.
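The scene-to-array step rests on GDAL's ability to expose any supported scene as a georeferenced array. A minimal sketch of the reading side (the file name is a placeholder; loading the resulting array and its spatial/temporal coordinates into SciDB is the part the paper's tooling automates):

    from osgeo import gdal

    ds = gdal.Open("LC08_scene.tif")   # hypothetical Landsat scene file
    arr = ds.ReadAsArray()             # numpy array: (bands, rows, cols)
    gt = ds.GetGeoTransform()          # affine transform: pixel -> map coordinates

    # Map coordinates of a pixel (row, col), needed to place this scene
    # correctly inside a larger space-time array:
    row, col = 100, 200
    x = gt[0] + col * gt[1] + row * gt[2]
    y = gt[3] + col * gt[4] + row * gt[5]
    print(arr.shape, (x, y))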
ION COMPOSITION ELUCIDATION (ICE) OF IONS FROM ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
PHARMACEUTICALS AND PERSONAL CARE PRODUCTS ...
This presentation briefly summarizes some of what is known and not known about the occurrence of drugs in the environment, the potential for chronic effects on wildlife (and some instances of acute effects), the relevance of drug residues in drinking water to consumer risk perception, and actions that can be taken to reduce environmental exposure.
PHARMACEUTICALS & PERSONAL CARE PRODUCTS AS ...
Those chemical pollutants that are regulated under various international, federal, and state programs represent but a small fraction of the universe of chemicals that occur in the environment as a result of both natural processes and human influence.
US EPA, NERL-LAS VEGAS ACTIVITIES AND RESEARCH ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
Introduction to Pharmaceuticals and Personal Care Products ...
Those chemical pollutants that are regulated under various international, federal, and state programs represent but a small fraction of the universe of chemicals that occur in the environment as a result of both natural processes and human influence.
CHEMICAL MARKERS OF HUMAN WASTE ...
Giving public water authorities a tool to monitor and measure levels of human waste contamination of waters simply and rapidly would enhance public protection. This methodology, using both urobilin and azithromycin (or any other human-use pharmaceutical), could give public water authorities a rapid (24-hour) and definitive method for measuring human waste contamination.
LEVELS OF SYNTHETIC MUSKS COMPOUNDS IN AQUATIC ...
Synthetic musk compounds are consumer chemicals manufactured as fragrance materials. Due to their high worldwide usage and release, they frequently occur in the aquatic and marine environments. The U.S. EPA (ORD, Las Vegas) developed surface-water monitoring methodology and conducted a one-year monthly monitoring of synthetic musks in water and biota from Lake Mead (Nevada) as well as from combined sewage effluent streams feeding Lake Mead. Presented are an overview of the chemistry, the monitoring methodology, and the significance of synthetic musk compounds in the aquatic environment.
PRESENTATION ON PPCPS IN THE ENVIRONMENT: AN ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
PPCPS IN THE ENVIRONMENT: AN OVERVIEW OF THE ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
PHARMACEUTICALS IN THE ENVIRONMENT: A ...
This presentation briefly summarizes some of what is known and not known about the occurrence of drugs in the environment, the potential for effects on wildlife, the relevance of drug residues in drinking water to consumer risk perception, and actions that can be taken to reduce environmental exposure.
PRESENTED 04/05/2006: MERCURY MEASUREMENTS ...
While traditional methods for determining mercury in solid samples involve the use of aggressive chemicals to dissolve the matrix and the use of other chemicals to properly reduce the mercury to the volatile elemental form, pyrolysis-based analyzers can be used by directly weighing the solid in a sampling boat and initiating the instrumental analysis for total mercury.
FATE OF SYNTHETIC MUSK COMPOUNDS IN AN AQUATIC ...
To be presented is an overview of the chemistry, the monitoring methodology, and the statistical evaluation of concentrations obtained from the analysis of a suite of these compounds (e.g., Galaxolide®, musk xylene, and amino musk xylene) in different environmental compartments.
TELEPHONIC PRESENTATION: MERCURY ...
While traditional methods for determining mercury in solid samples involve the use of aggressive chemicals to dissolve the matrix and the use of other chemicals to properly reduce the mercury to the volatile elemental form, pyrolysis-based analyzers can be used by directly weighing the solid in a sampling boat and initiating the instrumental analysis for total mercury.
LEVELS OF SYNTHETIC MUSK COMPOUNDS IN ...
To be presented is an overview of the chemistry, the monitoring methodology, and the statistical evaluation of concentrations obtained from the analysis of a suite of compounds (e.g., Galaxolide®, musk xylene, and amino musk xylene) in an aquatic ecological site.
PCCPS IN THE ENVIRONMENT: AN OVERVIEW OF THE ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
PPCPS AS UBIQUITOUS POLLUTANTS FROM HEALTH AND ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
PRESENTATION FOR THE PPCPS IN THE ENVIRONMENT ...
There is no abstract available for this product. If further information is requested, please refer to the bibliographic citation and contact the person listed under the Contact field.
IPEDS Analytics: Delta Cost Project Database 1987-2010. Data File Documentation. NCES 2012-823
ERIC Educational Resources Information Center
Lenihan, Colleen
2012-01-01
The IPEDS Analytics: Delta Cost Project Database was created to make data from the Integrated Postsecondary Education Data System (IPEDS) more readily usable for longitudinal analyses. Currently spanning the period from 1987 through 2010, it has a total of 202,800 observations on 932 variables derived from the institutional characteristics,…
Assessment of Learning in Digital Interactive Social Networks: A Learning Analytics Approach
ERIC Educational Resources Information Center
Wilson, Mark; Gochyyev, Perman; Scalise, Kathleen
2016-01-01
This paper summarizes initial field-test results from data analytics used in the work of the Assessment and Teaching of 21st Century Skills (ATC21S) project, on the "ICT Literacy--Learning in digital networks" learning progression. This project, sponsored by Cisco, Intel and Microsoft, aims to help educators around the world enable…
A Big Data Analytics Methodology Program in the Health Sector
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony; Howell-Barber, H.
2016-01-01
The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…
Waska, R T
1999-01-01
Certain patients, through projective identification and splitting mechanisms, test the boundaries of the analytic situation. These patients are usually experiencing overwhelming paranoid-schizoid anxieties and view the object as ruthless and persecutory. Using a Kleinian perspective, the author advocates greater analytic flexibility with these difficult patients who seem unable to use the standard analytic environment. The concept of self-disclosure is examined, and the author discusses certain technical situations where self-disclosure may be helpful. (The Journal of Psychotherapy Practice and Research 1999; 8:225-233)
1987-09-01
A Goal Programming R&D Project Funding Model of the U.S. Army Strategic Defense Command Using the Analytic Hierarchy Process. Anderson, Steven M. Naval Postgraduate School, Monterey, CA, September 1987 (AD-A187 899).
Epp: A C++ EGSnrc user code for x-ray imaging and scattering simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lippuner, Jonas; Elbakri, Idris A.; Cui Congwu
2011-03-15
Purpose: Easy particle propagation (Epp) is a user code for the EGSnrc code package based on the C++ class library egspp. A main feature of egspp (and Epp) is the ability to use analytical objects to construct simulation geometries. The authors developed Epp to facilitate the simulation of x-ray imaging geometries, especially in the case of scatter studies. While direct use of egspp requires knowledge of C++, Epp requires no programming experience. Methods: Epp's features include calculation of dose deposited in a voxelized phantom and photon propagation to a user-defined imaging plane. Projection images of primary, single Rayleigh scattered, single Compton scattered, and multiple scattered photons may be generated. Epp input files can be nested, allowing for the construction of complex simulation geometries from more basic components. To demonstrate the imaging features of Epp, the authors simulate 38 keV x rays from a point source propagating through a water cylinder 12 cm in diameter, using both analytical and voxelized representations of the cylinder. The simulation generates projection images of primary and scattered photons at a user-defined imaging plane. The authors also simulate dose scoring in the voxelized version of the phantom in both Epp and DOSXYZnrc and examine the accuracy of Epp using the Kawrakow-Fippel test. Results: The results of the imaging simulations with Epp using voxelized and analytical descriptions of the water cylinder agree within 1%. The results of the Kawrakow-Fippel test suggest good agreement between Epp and DOSXYZnrc. Conclusions: Epp provides the user with useful features, including the ability to build complex geometries from simpler ones and the ability to generate images of scattered and primary photons. There is no inherent computational time saving arising from Epp, except for those arising from egspp's ability to use analytical representations of simulation geometries. Epp agrees with DOSXYZnrc in dose calculation, since they are both based on the well-validated standard EGSnrc radiation transport physics model.
NASA Technical Reports Server (NTRS)
Sweet, D. C.; Pincura, P. G.; Wukelic, G. E. (Principal Investigator)
1974-01-01
The author has identified the following significant results. During the first year of project effort, the team experimentally demonstrated and reported as significant results: the use of ERTS-1 imagery for mapping and inventorying strip-mined areas in southeastern Ohio; the potential of ERTS-1 imagery for water quality and coastal zone management in the Lake Erie region; and the extent to which ERTS-1 imagery could contribute to localized (metropolitan/urban), multicounty, and overall state land use needs. Significant research accomplishments were achieved in the technological development of manual and computerized methods to extract multi-feature as well as single-feature information from ERTS-1 data, as exemplified by the forestry transparency overlay. Fabrication of an image transfer device to superimpose ERTS-1 data onto existing maps and other data sources was also a significant analytical accomplishment.
Cosmic Dust Collection Facility: Scientific objectives and programmatic relations
NASA Technical Reports Server (NTRS)
Hoerz, Fred (Editor); Brownlee, D. E.; Bunch, T. E.; Grounds, D.; Grun, E.; Rummel, Y.; Quaide, W. L.; Walker, R. M.
1990-01-01
The science objectives are summarized for the Cosmic Dust Collection Facility (CDCF) on Space Station Freedom, and these objectives are related to ongoing science programs and mission planning within NASA. The purpose is to illustrate the potential of the CDCF project within the broad context of early solar system sciences, which emphasize the study of primitive objects in state-of-the-art analytical and experimental laboratories on Earth. Current knowledge about the sources of cosmic dust and their associated orbital dynamics is examined, and the results of modern microanalytical investigations of extraterrestrial dust particles collected on Earth are reviewed. Major areas of scientific inquiry and uncertainty are identified, and it is shown how CDCF will contribute to their solution. General facility and instrument concepts that need to be pursued are introduced, and the major development tasks needed to attain the scientific objectives of the CDCF project are identified.
Foundations of measurement and instrumentation
NASA Technical Reports Server (NTRS)
Warshawsky, Isidore
1990-01-01
The user of instrumentation is provided with an understanding of the factors that influence instrument performance, selection, and application, and of the methods of interpreting and presenting the results of measurements. Such understanding is prerequisite to the successful attainment of the best compromise among reliability, accuracy, speed, cost, and importance of the measurement operation in achieving the ultimate goal of a project. Subjects covered include dimensions; units; sources of measurement error; methods of describing and estimating accuracy; deduction and presentation of results through empirical equations, including the method of least squares; and experimental and analytical methods of determining the static and dynamic behavior of instrumentation systems, including the use of analogs.
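The method of least squares mentioned above is the workhorse for deducing empirical equations from calibration data; a minimal sketch with hypothetical sensor readings:

```python
# Sketch: fitting an empirical calibration equation by least squares,
# one of the data-reduction methods surveyed above. Data are hypothetical.
import numpy as np

# Hypothetical calibration readings: applied stimulus vs. gauge output.
x = np.array([0.0, 10.0, 20.0, 30.0, 40.0])   # applied stimulus
y = np.array([0.02, 1.98, 4.05, 5.97, 8.01])  # instrument response

# Fit y = a*x + b and report residuals as a crude accuracy estimate.
(a, b), res, *_ = np.polyfit(x, y, deg=1, full=True)
print(f"slope={a:.4f}, intercept={b:.4f}, SSE={res[0]:.4g}")
```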
NASA Astrophysics Data System (ADS)
Zhang, Qing-Kun; Wang, Lin; Li, Wei-Min; Gao, Wei-Wei
2015-12-01
The upgrade project of the Hefei Light Source storage ring is under way. In this paper, the broadband impedances of the resistive wall and the coated ceramic vacuum chamber are calculated using analytic formulas, the wake fields and impedances of the other designed vacuum chambers are simulated with the CST code, and a broadband impedance model is then obtained. Using theoretical formulas, longitudinal and transverse single-bunch instabilities are discussed. With the carefully designed vacuum chamber, we find that the thresholds of the beam instabilities are higher than the beam current goal. Supported by the Natural Science Foundation of China (11175182, 11175180).
Finding the forest in the trees. The challenge of combining diverse environmental data
NASA Technical Reports Server (NTRS)
1995-01-01
Development of analytical and functional guidelines to help researchers and technicians engaged in interdisciplinary research better plan and implement their supporting data management activities is addressed, with emphasis on projects that involve both geophysical and ecological issues. Six case studies were used to identify and understand problems associated with collecting, integrating, and analyzing environmental data from local to global spatial scales and over a range of temporal scales. These case studies were also used to elaborate on the common barriers to interfacing data of disparate sources and types. A number of lessons derived from the case studies are summarized and analyzed.
Electrospray ion source with reduced analyte electrochemistry
Kertesz, Vilmos [Knoxville, TN]; Van Berkel, Gary [Clinton, TN]
2011-08-23
An electrospray ion (ESI) source and method capable of ionizing an analyte molecule without oxidizing or reducing the analyte of interest. The ESI source can include an emitter having a liquid conduit, a working electrode having a liquid contacting surface, a spray tip, a secondary working electrode, and a charge storage coating covering partially or fully the liquid contacting surface of the working electrode. The liquid conduit, the working electrode and the secondary working electrode can be in liquid communication. The electrospray ion source can also include a counter electrode proximate to, but separated from, said spray tip. The electrospray ion source can also include a power system for applying a voltage difference between the working electrodes and a counter-electrode. The power system can deliver pulsed voltage changes to the working electrodes during operation of said electrospray ion source to minimize the surface potential of the charge storage coating.
Electrospray ion source with reduced analyte electrochemistry
Kertesz, Vilmos; Van Berkel, Gary J
2013-07-30
An electrospray ion (ESI) source and method capable of ionizing an analyte molecule without oxidizing or reducing the analyte of interest. The ESI source can include an emitter having a liquid conduit, a working electrode having a liquid contacting surface, a spray tip, a secondary working electrode, and a charge storage coating covering partially or fully the liquid contacting surface of the working electrode. The liquid conduit, the working electrode and the secondary working electrode can be in liquid communication. The electrospray ion source can also include a counter electrode proximate to, but separated from, said spray tip. The electrospray ion source can also include a power system for applying a voltage difference between the working electrodes and a counter-electrode. The power system can deliver pulsed voltage changes to the working electrodes during operation of said electrospray ion source to minimize the surface potential of the charge storage coating.
X-ray optics simulation and beamline design for the APS upgrade
NASA Astrophysics Data System (ADS)
Shi, Xianbo; Reininger, Ruben; Harder, Ross; Haeffner, Dean
2017-08-01
The upgrade of the Advanced Photon Source (APS) to a Multi-Bend Achromat (MBA) will increase the brightness of the APS by between two and three orders of magnitude. The APS upgrade (APS-U) project includes a list of feature beamlines that will take full advantage of the new machine, and many of the existing beamlines will also be upgraded to profit from this significant machine enhancement. Optics simulations are essential to the design and optimization of these new and existing beamlines. In this contribution, the simulation tools used and developed at the APS, ranging from analytical to numerical methods, are summarized. Three general optical layouts are compared in terms of their coherence control and focusing capabilities. The concept of zoom optics, where two sets of focusing elements (e.g., CRLs and KB mirrors) are used to provide variable beam sizes at a fixed focal plane, is optimized analytically. The effects of figure errors on the vertical spot size and on the local coherence along the vertical direction of the optimized design are investigated.
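The zoom-optics idea above can be illustrated with simple thin-lens relations: holding the positions of two focusing elements and the image plane fixed, varying the first focal length while refocusing with the second changes the overall magnification. A hedged sketch with hypothetical distances (real beamline designs rely on wave-optics simulation, not this toy geometry):

```python
# Sketch of the "zoom optics" concept: two thin focusing elements at fixed
# positions produce a variable demagnification at a fixed image plane by
# changing the focal length of the first element (e.g., the number of CRLs
# in a stack). All distances are hypothetical.
def thin_lens_image(p, f):
    """Image distance q for object distance p and focal length f."""
    return 1.0 / (1.0 / f - 1.0 / p)

src_to_l1, l1_to_l2, l2_to_focus = 30.0, 20.0, 10.0  # meters, hypothetical

for f1 in (6.0, 8.0, 10.0):           # vary the first element
    q1 = thin_lens_image(src_to_l1, f1)
    m1 = -q1 / src_to_l1              # first-stage magnification
    p2 = l1_to_l2 - q1                # intermediate image feeds element 2
    f2 = 1.0 / (1.0 / p2 + 1.0 / l2_to_focus)  # refocus onto fixed plane
    m2 = -l2_to_focus / p2
    print(f"f1={f1:4.1f} m -> f2={f2:5.2f} m, total magnification={m1*m2:+.3f}")
```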
Higher Order Mode Analysis of the SNS Superconducting Linac
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. Doleans; D. Jeon; S. Kim
2001-06-01
Higher order modes (HOMs) of monopole, dipole, quadrupole, and sextupole type in the β = 0.61 and β = 0.81 6-cell superconducting (SC) cavities for the Spallation Neutron Source (SNS) project have been found up to about 3 GHz, and their properties, such as R/Q and the possibility of trapping, have been characterized with manufacturing imperfections taken into account. The main issues raised by HOMs are beam instabilities (published separately) and HOM-induced power, especially from TM monopoles. The time structure of the SNS beam has three different scales of pulses: micro-pulse, midi-pulse, and macro-pulse. Each time structure generates resonances. When a mode is near one of these resonance frequencies, the induced voltage, and accordingly the resulting HOM power, can be large. To understand the effects of such a complex beam time structure on mode excitation and the resulting HOM power, analytic expressions are developed. With these analytic expressions, the induced HOM voltage and HOM power were calculated by assuming an external Q for each HOM.
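As a rough illustration of resonant HOM excitation by a periodic bunch train (not the analytic expressions developed in the paper), a steady-state phasor sum shows how the induced voltage grows when a mode lies near a beam harmonic; all parameter values below are hypothetical, not SNS values:

```python
# Toy model: each bunch of charge q leaves a wake voltage V_b = w*(R/Q)*q
# that decays with time constant tau = 2*Q_L/w and rotates in phase between
# bunches; the geometric phasor series gives the steady-state HOM voltage.
import cmath
import math

f_mode = 805.2e6      # HOM frequency [Hz], hypothetical
QL = 1.0e5            # loaded quality factor, hypothetical
RoQ = 10.0            # R/Q [Ohm], hypothetical
q = 5e-11             # bunch charge [C], hypothetical
f_bunch = 402.5e6     # bunch repetition frequency [Hz]

w = 2 * math.pi * f_mode
T = 1.0 / f_bunch
tau = 2 * QL / w
Vb = w * RoQ * q                          # single-bunch induced voltage

r = cmath.exp(-T / tau) * cmath.exp(1j * w * T)
V_ss = Vb / (1 - r)                       # steady-state phasor sum
P = abs(V_ss) ** 2 / (2 * RoQ * QL)       # dissipated HOM power estimate
print(f"|V_ss| = {abs(V_ss):.3e} V, HOM power ~ {P:.3e} W")
```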
Risk analysis for renewable energy projects due to constraints arising
NASA Astrophysics Data System (ADS)
Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.
2016-02-01
Starting from the European Union (EU) binding target of 20% renewable energy in final energy consumption by 2020, this article illustrates the identification of risks in the implementation of wind energy projects in Romania, which can have complex technical, social, and administrative implications. In the specific projects analyzed in this paper, critical bottlenecks in the future wind power supply chain were identified, along with the time periods over which they may reasonably arise. Renewable energy technologies face a number of constraints that delay the scaling-up of their production process, their transport process, the equipment reliability, and so on, so implementing these types of projects requires a complex, specialized team whose coordination also involves specific risks. The research team applied an analytical risk approach to identify the major risks encountered within a wind farm project developed in isolated regions of Romania with different particularities, configured for different geographical areas (hill and mountain locations in Romania). Identification of the major risks was based on a conceptual model set up for the entire project implementation process, throughout which the specific constraints of the process were identified. Integration risks were examined in an empirical study based on the HAZOP (Hazard and Operability) method. The discussion analyzes our results in the context of implementing renewable energy projects in Romania and creates a framework for assessing the energy supply of any entity from renewable sources.
ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress
NASA Technical Reports Server (NTRS)
Kempler, Steven
2015-01-01
The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; to facilitate collaborations to better understand the cross-usage of heterogeneous datasets and to provide accommodating data analytics expertise, now and as the needs evolve into the future; and to identify gaps that, once filled, will further collaborative activities. Objectives: provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth science data analytics; bring in guest speakers to describe external efforts and further teach us about the broader use of data analytics; perform activities that compile use cases generated from specific community needs to cross-analyze heterogeneous data, compile sources of analytics tools (in particular, to satisfy the needs of the above data users), examine gaps between needs and sources, examine gaps between needs and community expertise, and document the specific data analytics expertise needed to perform Earth science data analytics; and seek graduate data analytics/data science student internship opportunities.
NASA Astrophysics Data System (ADS)
Hagemann, B.; Feldmann, F.; Panfilov, M.; Ganzer, L.
2015-12-01
The change from fossil to renewable energy sources demands an increasing amount of storage capacity for electrical energy. A promising technological solution is the storage of hydrogen in the subsurface. Hydrogen can be produced by electrolysis using excess electrical energy and subsequently converted back into electricity by fuel cells or engine generators. The development of this technology starts with adding small amounts of hydrogen to the high-pressure natural gas grid and continues with the creation of pure underground hydrogen storage sites. The feasibility of hydrogen storage in depleted gas reservoirs is investigated in the lighthouse project H2STORE, financed by the German Ministry for Education and Research. The joint research project has members from the University of Jena, the Clausthal University of Technology, the GFZ Potsdam, and the French National Center for Scientific Research in Nancy. The six subprojects are based on laboratory experiments, numerical simulations, and analytical work covering mineralogical, geochemical, physico-chemical, sedimentological, microbiological, and gas mixing processes in reservoir and cap rocks. The focus of this presentation is the numerical modeling of underground hydrogen storage. A mathematical model was developed that describes the coupled hydrodynamic and microbiological effects involved; the bio-chemical reaction rates depend on the kinetics of microbial growth, which is induced by the injection of hydrogen. The model has been implemented numerically on the basis of the open source code DuMuX. A field case study based on a real German gas reservoir was performed to investigate the mixing of hydrogen with residual gases and to discover the consequences of the bio-chemical reactions.
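The microbial growth kinetics mentioned above are commonly modeled with Monod-type rate laws; a minimal sketch of hydrogen consumption by a growing microbial population, with hypothetical parameters rather than H2STORE values:

```python
# Sketch of the bio-chemical side of a coupled hydro-bio model: microbial
# growth on injected H2 following Monod kinetics, integrated with a simple
# explicit Euler loop. All parameter values are hypothetical.
mu_max = 0.5      # max specific growth rate [1/day]
Ks = 0.1          # half-saturation constant [mol/m^3]
Y = 1.0e11        # cell yield per mol H2 consumed [cells/mol]

h2, cells = 5.0, 1.0e9     # initial H2 [mol/m^3] and cell density
dt, t_end = 0.01, 30.0     # time step and horizon [days]

t = 0.0
while t < t_end:
    mu = mu_max * h2 / (Ks + h2)       # Monod growth rate
    growth = mu * cells * dt
    h2 = max(h2 - growth / Y, 0.0)     # H2 consumed by growth
    cells += growth
    t += dt
print(f"after {t_end} d: H2 = {h2:.3f} mol/m^3, cells = {cells:.3e}")
```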
Ichihara, Kiyoshi; Ozarda, Yesim; Barth, Julian H; Klee, George; Shimizu, Yoshihisa; Xia, Liangyu; Hoffmann, Mariza; Shah, Swarup; Matsha, Tandi; Wassung, Janette; Smit, Francois; Ruzhanskaya, Anna; Straseski, Joely; Bustos, Daniel N; Kimura, Shogo; Takahashi, Aki
2017-04-01
The intent of this study, based on a global multicenter study of reference values (RVs) for serum analytes, was to explore biological sources of variation (SVs) of the RVs among 12 countries around the world. As described in the first part of this paper, RVs of 50 major serum analytes from 13,396 healthy individuals living in 12 countries were obtained. Analyzed in this study were 23 clinical chemistry analytes and 8 analytes measured by immunoturbidimetry. Multiple regression analysis was performed for each gender, country by country and analyte by analyte, with four major SVs (age, BMI, and levels of drinking and smoking) as a fixed set of explanatory variables. For analytes with skewed distributions, log-transformation was applied. The association of each source of variation with the RVs was expressed as the partial correlation coefficient (r_p). Obvious gender and age-related changes in the RVs were observed for many analytes, almost consistently between countries. Compilation of age-related variations of the RVs after adjusting for between-country differences revealed patterns peculiar to each analyte. Judged from the r_p, BMI-related changes were observed for many nutritional and inflammatory markers in almost all countries. However, the slope of the linear regression of BMI vs. RV differed greatly among countries for some analytes. Alcohol- and smoking-related changes were observed less conspicuously, and in a limited number of analytes. The features of sex, age, alcohol, and smoking-related changes in the RVs of the analytes were largely comparable worldwide. The finding of differences in BMI-related changes among countries for some analytes is quite relevant to understanding ethnic differences in susceptibility to nutritionally related diseases.
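A minimal sketch of the partial-correlation measure r_p used above: correlate the RV with one source of variation after regressing out the remaining covariates. The data here are simulated, not the study's reference values:

```python
# Sketch: partial correlation of a reference value (RV) with BMI,
# controlling for age, computed as the correlation of OLS residuals.
import numpy as np

rng = np.random.default_rng(0)
n = 500
age = rng.uniform(20, 70, n)
bmi = rng.normal(25, 4, n)
rv = 1.0 + 0.02 * age + 0.05 * bmi + rng.normal(0, 0.5, n)  # simulated RV

def residualize(y, covariates):
    """Residuals of y after OLS on the covariates (plus intercept)."""
    X = np.column_stack([np.ones(len(y))] + covariates)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

r_p = np.corrcoef(residualize(rv, [age]), residualize(bmi, [age]))[0, 1]
print(f"partial correlation of RV with BMI, controlling age: {r_p:.3f}")
```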
Isotope-ratio-monitoring gas chromatography-mass spectrometry: methods for isotopic calibration
NASA Technical Reports Server (NTRS)
Merritt, D. A.; Brand, W. A.; Hayes, J. M.
1994-01-01
In trial analyses of a series of n-alkanes, precise determinations of 13C contents were based on isotopic standards introduced by five different techniques, and the results were compared. Specifically, organic-compound standards were coinjected with the analytes and carried through chromatography and combustion with them; or CO2 was supplied from a conventional inlet and mixed with the analyte in the ion source; or CO2 was supplied from an auxiliary mixing volume and transmitted to the source without interruption of the analyte stream. Additionally, two techniques were investigated in which the analyte stream was diverted and CO2 standards were placed on a near-zero background. All methods provided accurate results. Where applicable, methods not involving interruption of the analyte stream provided the highest performance (σ = 0.00006 at.% 13C, or 0.06‰, for 250 pmol C as CO2 reaching the ion source), but great care was required. Techniques involving diversion of the analyte stream were immune to interference from coeluting sample components and still provided high precision (0.0001 ≤ σ ≤ 0.0002 at.%, or 0.1‰ ≤ σ ≤ 0.2‰).
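The precision figures above mix atom-percent and per-mil units; the conversion between delta notation and atom percent 13C is a short calculation (using a commonly cited VPDB ratio; the example delta values are arbitrary):

```python
# Sketch: delta-13C (per mil vs. VPDB) -> atom percent 13C.
R_VPDB = 0.011180  # 13C/12C of the VPDB standard (commonly cited value)

def delta_to_atom_percent(delta_permil):
    r = R_VPDB * (1.0 + delta_permil / 1000.0)  # sample 13C/12C ratio
    return 100.0 * r / (1.0 + r)                # fraction of C that is 13C

for d in (-30.0, -29.9):  # a 0.1 per-mil difference...
    print(f"delta = {d:6.1f} per mil -> {delta_to_atom_percent(d):.5f} at.%")
# ...corresponds to roughly 0.0001 at.%, consistent with the pairing of
# units in the precision figures quoted above.
```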
Negotiating Story Entry: A Micro-Analytic Study of Storytelling Projection in English and Japanese
ERIC Educational Resources Information Center
Yasui, Eiko
2011-01-01
This dissertation offers a micro-analytic study of the use of language and body during storytelling in American English and Japanese conversations. Specifically, I focus on its beginning and explore how a story is "projected." A beginning of an action or activity is where an incipient speaker negotiates the floor with co-participants; they…
Analytical model of tilted driver–pickup coils for eddy current nondestructive evaluation
NASA Astrophysics Data System (ADS)
Cao, Bing-Hua; Li, Chao; Fan, Meng-Bao; Ye, Bo; Tian, Gui-Yun
2018-03-01
A driver-pickup probe possesses better sensitivity and flexibility due to the individual optimization of each coil, and it is frequently found in eddy current (EC) array probes. In this work, a tilted, non-coaxial driver-pickup probe above a multilayered conducting plate is analytically modeled using spatial transformation for eddy current nondestructive evaluation. The core of the formulation is obtaining the projection of the magnetic vector potential (MVP) from the driver coil onto the vector along the tilted pickup coil, which is divided into two key steps. The first step is to project the MVP along the pickup coil onto a horizontal plane, and the second is to build the relationship between the projected MVP and the MVP along the driver coil. Afterwards, an analytical model for the case of a layered plate is established with the reflection and transmission theory of electromagnetic fields. The calculated values from the resulting model indicate good agreement with those from the finite element model (FEM) and experiments, which validates the developed analytical model. Project supported by the National Natural Science Foundation of China (Grant Nos. 61701500, 51677187, and 51465024).
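The first projection step described above is, at its core, vector geometry: resolving a field component along a tilted coil direction. A generic sketch follows (plain rotation geometry with made-up field values, not the paper's full multilayer model):

```python
# Sketch: project a field vector (e.g., the magnetic vector potential from
# the driver coil) onto the axis of a pickup coil tilted by (theta, phi).
import numpy as np

def tilted_axis(theta, phi):
    """Unit vector of a coil axis tilted by theta from z, azimuth phi."""
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

A_driver = np.array([0.0, 1.2e-6, 3.4e-6])  # hypothetical MVP components [Wb/m]
n_pickup = tilted_axis(np.deg2rad(15), np.deg2rad(40))

# Component of the driver field along the tilted pickup direction.
A_proj = A_driver @ n_pickup
print(f"projected MVP component: {A_proj:.3e} Wb/m")
```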
Signals: Applying Academic Analytics
ERIC Educational Resources Information Center
Arnold, Kimberly E.
2010-01-01
Academic analytics helps address the public's desire for institutional accountability with regard to student success, given the widespread concern over the cost of higher education and the difficult economic and budgetary conditions prevailing worldwide. Purdue University's Signals project applies the principles of analytics widely used in…
Sedimentary Geothermal Feasibility Study: October 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustine, Chad; Zerpa, Luis
The objective of this project is to analyze the feasibility of commercial geothermal projects using numerical reservoir simulation, considering a sedimentary reservoir with low permeability that requires productivity enhancement. A commercial thermal reservoir simulator (STARS, from the Computer Modeling Group, CMG) is used in this work for numerical modeling. In the first stage of this project (FY14), a hypothetical numerical reservoir model was developed and validated against an analytical solution. The following model parameters were considered to obtain an acceptable match between the numerical and analytical solutions: grid block size, time step, and reservoir areal dimensions, the latter related to boundary effects on the numerical solution. Systematic model runs showed that insufficient grid sizing generates numerical dispersion that causes the numerical model to underestimate the thermal breakthrough time compared to the analytic model. As grid sizing is decreased, the model results converge on a solution. Likewise, insufficient reservoir model area introduces boundary effects in the numerical solution that cause the model results to differ from the analytical solution.
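The grid-size effect described above can be reproduced in a toy 1D analogue: a first-order upwind scheme smears an advected thermal front, so coarse grids signal breakthrough too early, converging toward the sharp-front solution as the grid is refined. This sketch is illustrative only, not the STARS model:

```python
# Sketch: numerical dispersion of an advected cold front on an upwind grid,
# and its effect on the apparent thermal breakthrough time. Parameters are
# hypothetical; the analytic sharp-front breakthrough at x = L/2 is t = 50.
import numpy as np

def breakthrough_time(nx, L=100.0, v=1.0, t_end=80.0):
    dx = L / nx
    dt = 0.4 * dx / v                 # CFL-stable time step
    T = np.ones(nx)                   # initial reservoir temperature
    t, x_obs = 0.0, int(0.5 * nx)     # observation point at mid-length
    while t < t_end:
        T[1:] -= v * dt / dx * (T[1:] - T[:-1])  # upwind advection
        T[0] = 0.0                    # cold injected fluid
        t += dt
        if T[x_obs] < 0.99:           # 1% temperature drop = breakthrough
            return t
    return None

for nx in (25, 100, 400, 1600):
    print(f"nx={nx:5d}: breakthrough at t = {breakthrough_time(nx):.2f}")
```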
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.
Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists rely in addition on a wide and disparate array of ancillary and open source information, including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation the uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1000 ft2 for over 50 building types at the national and sub-national level, with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, carrying out Bayesian analytics, and visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
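A minimal sketch of the kind of Bayesian harmonization PDT performs, reduced to a conjugate normal update that combines an expert prior with survey observations; all numbers are illustrative, not actual PDT inputs:

```python
# Sketch: fuse an expert prior on ambient occupancy (people per 1000 sq ft)
# with sparse survey estimates via a normal-normal conjugate update,
# retaining uncertainty in the posterior standard deviation.
import math

prior_mean, prior_sd = 2.0, 1.0        # expert judgment for a building type
obs = [2.8, 3.1, 2.5]                  # hypothetical survey estimates
obs_sd = 0.8                           # assumed survey noise

post_prec = 1 / prior_sd**2 + len(obs) / obs_sd**2
post_mean = (prior_mean / prior_sd**2 + sum(obs) / obs_sd**2) / post_prec
post_sd = math.sqrt(1 / post_prec)
print(f"posterior occupancy: {post_mean:.2f} +/- {post_sd:.2f} per 1000 ft^2")
```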
Analytical Chemistry Laboratory Progress Report for FY 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Boparai, A.S.; Bowers, D.L.
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaption of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.
Methods for the behavioral, educational, and social sciences: an R package.
Kelley, Ken
2007-11-01
Methods for the Behavioral, Educational, and Social Sciences (MBESS; Kelley, 2007b) is an open source package for R (R Development Core Team, 2007b), an open source statistical programming language and environment. MBESS implements methods that are not widely available elsewhere, yet are especially helpful for the idiosyncratic techniques used within the behavioral, educational, and social sciences. The major categories of functions are those that relate to confidence interval formation for noncentral t, F, and chi-square parameters, confidence intervals for standardized effect sizes (which require noncentral distributions), and sample size planning from the power analytic and accuracy in parameter estimation perspectives. In addition, MBESS contains collections of other functions that should be helpful to substantive researchers and methodologists. MBESS is a long-term project that will continue to be updated and expanded so that important methods can continue to be made available to researchers in the behavioral, educational, and social sciences.
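For readers outside R, the noncentrality-based confidence intervals MBESS provides can be sketched in Python by inverting the noncentral-t CDF; the sample statistics below are hypothetical:

```python
# Sketch: confidence interval for a standardized effect size (Cohen's d)
# via the noncentral t distribution, the approach MBESS implements in R.
from scipy.stats import nct
from scipy.optimize import brentq

t_obs, n = 3.2, 25          # one-sample t statistic and sample size
df, alpha = n - 1, 0.05

def nc_limit(p):
    # Find the noncentrality nc with P(T <= t_obs | df, nc) = p.
    return brentq(lambda nc: nct.cdf(t_obs, df, nc) - p, -50, 50)

nc_lo, nc_hi = nc_limit(1 - alpha / 2), nc_limit(alpha / 2)
d_lo, d_hi = nc_lo / n**0.5, nc_hi / n**0.5
print(f"95% CI for Cohen's d: ({d_lo:.3f}, {d_hi:.3f})")
```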
xQuake: A Modern Approach to Seismic Network Analytics
NASA Astrophysics Data System (ADS)
Johnson, C. E.; Aikin, K. E.
2017-12-01
While seismic networks have expanded over the past few decades, and social needs for accurate and timely information have increased dramatically, approaches to the operational needs of both global and regional seismic observatories have been slow to adopt new technologies. This presentation introduces the xQuake system, which provides a fresh approach to seismic network analytics based on complexity theory and an adaptive architecture of streaming connected microservices, as diverse data (picks, beams, and other data) flow into a final, curated catalog of events. The foundation for xQuake is the xGraph (executable graph) framework, essentially a self-organizing graph database. An xGraph instance provides both the analytics and the data storage capabilities at the same time. Much of the analytics, such as simulated annealing in the detection process and an evolutionary programming approach to event evolution, draws from the recent GLASS 3.0 seismic associator developed by and for the USGS National Earthquake Information Center (NEIC). In some respects xQuake is reminiscent of the Earthworm system, in that it comprises processes interacting through store-and-forward rings; this is not surprising, as the first author was the lead architect of the original Earthworm project when it was known as "Rings and Things". While Earthworm components can easily be integrated into the xGraph processing framework, the architecture and analytics are more current (e.g., using a Kafka broker for store-and-forward rings). The xQuake system is being released under an unrestricted open source license to encourage and enable seismic community support in further development of its capabilities.
100-B/C Target Analyte List Development for Soil
DOE Office of Scientific and Technical Information (OSTI.GOV)
R.W. Ovink
2010-03-18
This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.
Application of Fuzzy Analytic Hierarchy Process to Building Research Teams
NASA Astrophysics Data System (ADS)
Dąbrowski, Karol; Skrzypek, Katarzyna
2016-03-01
Building teams has a fundamental impact on the execution of research and development projects. The teams appointed for the needs of given projects draw on individuals from both inside and outside the organization. Knowledge is not only a product available on the market but also an intangible resource affecting an organization's internal and external processes. It is therefore vitally important for businesses and scientific research facilities to effectively manage knowledge within project teams. The article presents a proposal to use the Fuzzy AHP (Analytic Hierarchy Process) and ANFIS (Adaptive Neuro Fuzzy Inference System) methods in building working groups for R&D projects on the basis of employees' skills.
Using Google Analytics to evaluate the impact of the CyberTraining project.
McGuckin, Conor; Crowley, Niall
2012-11-01
A focus on results and impact should be at the heart of every project's approach to research and dissemination. This article discusses the potential of Google Analytics (GA; http://google.com/analytics) as an effective resource for measuring the impact of academic research output and understanding the geodemographics of users of specific Web 2.0 content (e.g., intervention and prevention materials, health promotion and advice). This article presents the results of GA analyses as a resource used in measuring the impact of the EU-funded CyberTraining project, which provided a well-grounded, research-based training manual on cyberbullying for trainers through the medium of a Web-based eBook (www.cybertraining-project.org). The training manual includes review information on cyberbullying, its nature and extent across Europe, analyses of current projects, and resources for trainers working with the target groups of pupils, parents, teachers, and other professionals. Results illustrate the promise of GA as an effective tool for measuring the impact of academic research and project output, with real potential for tracking and understanding intra- and intercountry regional variations in the uptake of prevention and intervention materials, thus enabling attention to be focused precisely on those regions.
Waska, Robert T.
1999-01-01
Certain patients, through projective identification and splitting mechanisms, test the boundaries of the analytic situation. These patients are usually experiencing overwhelming paranoid-schizoid anxieties and view the object as ruthless and persecutory. Using a Kleinian perspective, the author advocates greater analytic flexibility with these difficult patients who seem unable to use the standard analytic environment. The concept of self-disclosure is examined, and the author discusses certain technical situations where self-disclosure may be helpful. (The Journal of Psychotherapy Practice and Research 1999; 8:225–233) PMID:10413442
Den Hartog, Emiel A; Havenith, George
2010-01-01
For wearers of protective clothing in radiation environments, no quantitative guidelines are available for the effect of a radiative heat load on heat exchange. Under the European Union funded project ThermProtect, an analytical effort was defined to address the issue of radiative heat load while wearing protective clothing. Because much information from thermal manikin experiments in thermal radiation environments became available within the ThermProtect project, these sets of experimental data were used to verify the analytical approach. The analytical approach provided a good prediction of the heat loss in the manikin experiments: 95% of the variance was explained by the model. The model has not yet been validated at high radiative heat loads and neglects some physical properties of the radiation emissivity. Still, the analytical model is pragmatic and may be useful for practical implementation in protective clothing standards for moderate thermal radiation environments.
Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP)
The Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP) provides guidance for the planning, implementation and assessment phases of projects that require laboratory analysis of radionuclides.
A two-dimensional analytical model of vapor intrusion involving vertical heterogeneity.
Yao, Yijun; Verginelli, Iason; Suuberg, Eric M
2017-05-01
In this work, we present an analytical chlorinated vapor intrusion (CVI) model that can estimate source-to-indoor-air concentration attenuation by simulating the two-dimensional (2-D) vapor concentration profile in vertically heterogeneous soils overlying a homogeneous vapor source. The analytical solution describing the 2-D soil gas transport was obtained by applying a modified Schwarz-Christoffel mapping method. A partial field validation showed that the developed model provides results (especially in terms of indoor emission rates) in line with the measured data from a case involving a building overlying a layered soil. In further testing, it was found that the new analytical model can very closely replicate the results of three-dimensional (3-D) numerical models at steady state in scenarios involving layered soils overlying homogeneous groundwater sources. By contrast, with a two-layer approach (capillary fringe and vadose zone) as employed in the EPA implementation of the Johnson and Ettinger model, the spatially and temporally averaged indoor concentrations in the case of groundwater sources can be higher than the ones estimated by the numerical model by up to two orders of magnitude. In short, the model proposed in this work is an easy-to-use tool that can simulate the subsurface soil gas concentration in layered soils overlying a homogeneous vapor source while keeping the simplicity of an analytical approach that requires much less computational effort.
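As context for the layered-soil aspect, a standard series-resistance relation for one-dimensional diffusion through stacked soil layers (a textbook simplification, not the 2-D Schwarz-Christoffel formulation of this paper) gives the effective diffusivity, alongside the attenuation factor that such models ultimately report:

    \[
    D_{\mathrm{eff}} \;=\; \frac{\sum_i L_i}{\sum_i L_i / D_i},
    \qquad
    \alpha \;=\; \frac{C_{\mathrm{indoor}}}{C_{\mathrm{source}}},
    \]

where \(L_i\) and \(D_i\) are the thickness and effective vapor diffusivity of layer \(i\).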
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1993 (October 1992 through September 1993). This annual report is the tenth for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has research programs in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require development or modification of methods and adaptation of techniques to obtain useful analytical data. The ACL is administratively within the Chemical Technology Division (CMT), its principal ANL client, but provides technical support for many of the technical divisions and programs at ANL. The ACL has four technical groups (Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis) which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL.
Labour Market Driven Learning Analytics
ERIC Educational Resources Information Center
Kobayashi, Vladimer; Mol, Stefan T.; Kismihók, Gábor
2014-01-01
This paper briefly outlines a project about integrating labour market information in a learning analytics goal-setting application that provides guidance to students in their transition from education to employment.
Increasing the value of geospatial informatics with open approaches for Big Data
NASA Astrophysics Data System (ADS)
Percivall, G.; Bermudez, L. E.
2017-12-01
Open approaches to big data provide geoscientists with new capabilities to address problems of unmatched size and complexity. Consensus approaches for Big Geo Data have been addressed in multiple international workshops and testbeds organized by the Open Geospatial Consortium (OGC) in the past year. Participants came from government (NASA, ESA, USGS, NOAA, DOE); research (ORNL, NCSA, IU, JPL, CRIM, RENCI); industry (ESRI, Digital Globe, IBM, rasdaman); standards (JTC 1/NIST); and open source software communities. Results from the workshops and testbeds are documented in Testbed reports and a White Paper published by the OGC. The White Paper identifies the following set of use cases. Collection and Ingest: remote sensed data processing; data stream processing. Prepare and Structure: SQL and NoSQL databases; data linking; feature identification. Analytics and Visualization: spatial-temporal analytics; machine learning; data exploration. Modeling and Prediction: integrated environmental models; urban 4D models. Open implementations were developed in the Arctic Spatial Data Pilot using Discrete Global Grid Systems (DGGS) and in Testbeds using WPS and ESGF to publish climate predictions. Further development activities to advance open implementations of Big Geo Data include the following. Open Cloud Computing: avoid vendor lock-in through API interoperability and application portability. Open Source Extensions: implement geospatial data representations in projects from Apache, LocationTech, and OSGeo; investigate parallelization strategies for N-dimensional spatial data. Geospatial Data Representations: schemas to improve processing and analysis using geospatial concepts (features, coverages, DGGS); use geospatial encodings like NetCDF and GeoPackage. Big Linked Geodata: use linked data methods scaled to big geodata. Analysis Ready Data: support "download as last resort" and "analytics as a service"; promote elements common to "datacubes."
O'Reilly-Shah, Vikas; Mackey, Sean
2016-06-03
Background: We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Objective: Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Methods: Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. Results: The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. Conclusions: The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes. PMID:27261155
Big data analytics for the Future Circular Collider reliability and availability studies
NASA Astrophysics Data System (ADS)
Begy, Volodimir; Apollonio, Andrea; Gutleber, Johannes; Martin-Marquez, Manuel; Niemi, Arto; Penttinen, Jussi-Pekka; Rogova, Elena; Romero-Marin, Antonio; Sollander, Peter
2017-10-01
Responding to the European Strategy for Particle Physics update 2013, the Future Circular Collider study explores scenarios of circular frontier colliders for the post-LHC era. One branch of the study assesses industrial approaches to model and simulate the reliability and availability of the entire particle collider complex based on the continuous monitoring of CERN’s accelerator complex operation. The modelling is based on an in-depth study of the CERN injector chain and LHC, and is carried out as a cooperative effort with the HL-LHC project. The work so far has revealed that a major challenge is obtaining accelerator monitoring and operational data with sufficient quality, to automate the data quality annotation and calculation of reliability distribution functions for systems, subsystems and components where needed. A flexible data management and analytics environment that permits integrating the heterogeneous data sources, the domain-specific data quality management algorithms and the reliability modelling and simulation suite is a key enabler to complete this accelerator operation study. This paper describes the Big Data infrastructure and analytics ecosystem that has been put in operation at CERN, serving as the foundation on which reliability and availability analysis and simulations can be built. This contribution focuses on data infrastructure and data management aspects and presents case studies chosen for its validation.
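As a minimal illustration of the reliability and availability quantities such a study estimates (all numbers below are hypothetical, and the CERN analysis derives full reliability distribution functions rather than point estimates), consider this Python sketch:

    # Point-estimate sketch: mean time between failures (MTBF), mean time
    # to repair (MTTR), and steady-state availability. Hypothetical data.
    failure_hours = [120.0, 340.0, 95.0, 410.0]   # observed times to failure
    repair_hours = [4.0, 12.0, 6.0, 8.0]          # observed repair durations

    mtbf = sum(failure_hours) / len(failure_hours)
    mttr = sum(repair_hours) / len(repair_hours)
    availability = mtbf / (mtbf + mttr)

    print(f"MTBF = {mtbf:.1f} h, MTTR = {mttr:.1f} h, "
          f"availability = {availability:.3f}")

The data-quality challenge named in the abstract enters precisely here: the failure and repair samples must first be extracted and annotated from heterogeneous monitoring sources before any such statistic is meaningful.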
Performance measures for evaluating multi-state projects.
DOT National Transportation Integrated Search
2011-09-01
"Freight transportation projects require an analytic process that considers the impacts of geographic and industry distribution of project : benefits, intermodal impacts, and reliability, as well as the traditional benefits of time savings, safety en...
Rye, Robert O.; Johnson, Craig A.; Landis, Gary P.; Hofstra, Albert H.; Emsbo, Poul; Stricker, Craig A.; Hunt, Andrew G.; Rusk, Brian G.
2010-01-01
Principal functions of the U.S. Geological Survey (USGS) Mineral Resources Program are providing assessments of the location, quantity, and quality of undiscovered mineral deposits, and predicting the environmental impacts of exploration and mine development. The mineral and environmental assessments of domestic deposits are used by planners and decisionmakers to improve the stewardship of public lands and public resources. Assessments of undiscovered mineral deposits on a global scale reveal the potential availability of minerals to the United States and other countries that manufacture goods imported to the United States. These resources are of fundamental relevance to national and international economic and security policy in our globalized world economy. Performing mineral and environmental assessments requires that predictions be made of the likelihood of undiscovered deposits. The predictions are based on geologic and geoenvironmental models that are constructed for the diverse types of mineral deposits from detailed descriptions of actual deposits and detailed understanding of the processes that formed them. Over the past three decades the understanding of ore-forming processes has benefited greatly from the integration of laboratory-based geochemical tools with field observations and other data sources. Under the aegis of the Evolution of Ore Deposits and Technology Transfer Project (referred to hereinafter as the Project), a 5-year effort that terminated in 2008, the Mineral Resources Program provided state-of-the-art analytical capabilities to support applications of several related geochemical tools to ore-deposit-related studies. The analytical capabilities and scientific approaches developed within the Project have wide applicability within Earth-system science. For this reason the Project Laboratories represent a valuable catalyst for interdisciplinary collaborations of the type that should be formed in the coming years for the United States to meet its natural-resources and natural-science needs. This circular presents an overview of the Project. Descriptions of the Project laboratories are given first including descriptions of the types of chemical or isotopic analyses that are made and the utility of the measurements. This is followed by summaries of select measurements that were carried out by the Project scientists. The studies are grouped by science direction. Virtually all of them were collaborations with USGS colleagues or with scientists from other governmental agencies, academia, or the private sector.
The Ophidia framework: toward cloud-based data analytics for climate change
NASA Astrophysics Data System (ADS)
Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni
2015-04-01
The Ophidia project is a research effort on big data analytics addressing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model, and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases regarding the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects will be presented during the talk. In the former (EUBrazilCC), the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter (OFIDIA), the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among other tasks, parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. Moreover, a cloud-based release tested with OpenNebula is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.
NASA Astrophysics Data System (ADS)
Leif, Robert C.; Spidlen, Josef; Brinkman, Ryan R.
2008-02-01
Introduction: The International Society for Analytical Cytology, ISAC, is developing a new combined flow and image Analytical Cytometry Standard (ACS). This standard needs to serve both the research and clinical communities. The clinical medicine and clinical research communities need to exchange information with hospital and other clinical information systems. Methods: 1) Prototype the standard by creating CytometryML and a RAW format for binary data. 2) Join the ISAC Data Standards Task Force. 3) Create essential project documentation. 4) Cooperate with other groups by assisting in the preparation of the DICOM Supplement 122: Specimen Module and Pathology Service-Object Pair Classes. Results: CytometryML has been created and serves as a prototype and source of experience for the following: the Analytical Cytometry Standard (ACS) 1.0, the ACS container, Minimum Information about a Flow Cytometry Experiment (MIFlowCyt), and Requirements for a Data File Standard Format to Describe Flow Cytometry and Related Analytical Cytology Data. These requirements provide a means to judge the appropriateness of design elements and to develop tests for the final ACS. The requirements include providing the information required for understanding and reproducing a cytometry experiment or clinical measurement, and a single standard for both flow and digital microscopic cytometry. Schemas proposed by other members of the ISAC Data Standards Task Force (e.g., Gating-ML) have been independently validated and integrated with CytometryML. The use of netCDF as an element of the ACS container has been proposed by others, and a method for its use is suggested.
ANALYTICAL CHEMISTRY DIVISION ANNUAL PROGRESS REPORT FOR PERIOD ENDING DECEMBER 31, 1961
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1962-02-01
Research and development progress is reported on analytical instrumentation, dissolver-solution analyses, special research problems, reactor projects analyses, x-ray and spectrochemical analyses, mass spectrometry, optical and electron microscopy, radiochemical analyses, nuclear analyses, inorganic preparations, organic preparations, ionic analyses, infrared spectral studies, anodization of sector coils for the Analog II Cyclotron, quality control, process analyses, and the Thermal Breeder Reactor Projects Analytical Chemistry Laboratory. (M.C.G.)
2006-07-27
The goal of this project was to develop analytical and computational tools to make vision a viable sensor for ... We have proposed the framework of stereoscopic segmentation, in which multiple images of the same objects are jointly processed to extract geometry. (vision.ucla.edu, July 27, 2006)
ERIC Educational Resources Information Center
Garofalo, James; Hindelang, Michael J.
The purpose of the document is to identify ways in which National Crime Survey (NCS) data can be used by criminal justice researchers and programs. The report provides an overview of the Application of Victimization Survey Results Project, describes the analytic reports compiled by the project staff, and cites the kinds of systematic information…
KNMI DataLab experiences in serving data-driven innovations
NASA Astrophysics Data System (ADS)
Noteboom, Jan Willem; Sluiter, Raymond
2016-04-01
Climate change research and innovations in weather forecasting rely more and more on (Big) data. Besides increasing data volumes from traditional sources (such as observation networks, radars, and satellites), the use of open data, crowd-sourced data, and the Internet of Things (IoT) is emerging. To deploy these sources of data optimally in our services and products, KNMI has established a DataLab to serve data-driven innovations in collaboration with public and private sector partners. Big data management, data integration, data analytics including machine learning, and data visualization techniques play an important role in the DataLab. Cross-domain data-driven innovations that arise from public-private collaborative projects and research programmes can be explored, experimented with, and/or piloted by the KNMI DataLab. Furthermore, advice can be requested on (Big) data techniques and data sources. In support of collaborative (Big) data science activities, scalable environments are offered with facilities for data integration, data analysis, and visualization. In addition, Data Science expertise is provided directly or from a pool of internal and external experts. At the EGU conference, experiences gained and best practices in operating the KNMI DataLab to optimally serve data-driven innovations for weather and climate applications are presented.
Consonni, Stefano; Giugliano, Michele; Massarutto, Antonio; Ragazzi, Marco; Saccani, Cesare
2011-01-01
This paper describes the context, the basic assumptions, and the main findings of a joint research project aimed at identifying the optimal breakdown between material recovery and energy recovery from municipal solid waste (MSW) in the framework of integrated waste management systems (IWMS). The project was carried out from 2007 to 2009 by five research groups at Politecnico di Milano, the Universities of Bologna and Trento, and Bocconi University (Milan), with funding from the Italian Ministry of Education, University and Research (MIUR). Since the optimization of IWMSs by analytical methods is practically impossible, the search for the most attractive strategy was carried out by comparing a number of relevant recovery paths from the point of view of mass and energy flows, technological features, environmental impact, and economics. The main focus has been on mature processes applicable to MSW in Italy and Europe. Results show that, contrary to a rather widespread opinion, increasing the source separation level (SSL) has very marginal effects on energy efficiency. What does generate very significant variations in energy efficiency is scale, i.e., the size of the waste-to-energy (WTE) plant. The mere value of the SSL is inadequate to qualify the recovery system. The energy and environmental outcome of recovery depends not only on "how much" source separation is carried out, but rather on "how" a given SSL is reached.
Measuring research progress in photovoltaics
NASA Technical Reports Server (NTRS)
Jackson, B.; Mcguire, P.
1986-01-01
The role and some results of the project analysis and integration function in the Flat-plate Solar Array (FSA) Project are presented. Activities included supporting the decision-making process, preparation of plans for project direction, setting goals for project activities, measuring progress within the project, and the development and maintenance of analytical models.
"Know Your Well" A Groundwater Quality Project to Inform Students and Well-Owners
NASA Astrophysics Data System (ADS)
Olson, C.; Snow, D.; Samal, A.; Ray, C.; Kreifels, M.
2017-12-01
Over 15 million U.S. households rely on private, household wells for drinking water, and these sources are not protected under the Safe Drinking Water Act. Data on private well water quality are slowly being collected and evaluated from a number of different agencies, sources, and projects. A new project is designed both to train high school students and to help assess the quality of water from rural domestic wells in Nebraska. This "crowd sourced" program engages high school agricultural education programs, FFA chapters, and science classes, with students sampling and testing water from rural domestic wells in 12 districts across the state. Students and teachers from selected schools were trained through multiple school visits, both in the classroom and in the field. Classroom visits were used to introduce topics such as water quality and groundwater, and testing methods for specific analytes. During the field visit, students were exposed to field techniques, the importance of accuracy in data collection, and the factors that might influence the water in sampled wells. High school students learn to sample and test water independently. Leadership and initiative are developed through the program, and many students experience the enlightenment that comes with citizen science. A customized mobile app was developed for ease of data entry and visualization, and data were uploaded to a secure website where information was stored and compared to laboratory tests of the same measurements. General water quality parameters, including pH, electrical conductivity, and major anions, are tested in the field and laboratory, as well as environmental contaminants such as arsenic, uranium, pesticides, and bacteria. Test kits provided to each class were used by the students to measure selected parameters, and duplicate water samples were analyzed at a university laboratory. Five high schools are involved in the project during its first year. Nitrate, bacteria, and pesticides represent major concerns for private well owners across the U.S., and preliminary results indicate that nitrate concentrations can range up to 70 mg/L, while detections of bacteria and traces of pesticide residues are consistent with other studies. This project will help both high school students and private well owners become better informed about water quality in Nebraska.
Wang, Hui; Xu, Yanan; Shi, Hongli
2018-03-15
Metal artifacts severely degrade CT image quality in clinical diagnosis and are difficult to remove, especially beam hardening artifacts. Metal artifact reduction (MAR) methods based on prior images are the most frequently used. However, most prior images contain considerable misclassification caused by the absence of prior information, such as the spectrum distribution of the X-ray beam source, especially when multiple or large metal objects are present. This work aims to identify a more accurate prior image to improve image quality. The proposed method includes four steps. First, the metal image is segmented by thresholding an initial image, and the metal traces are identified in the initial projection data using the forward projection of the metal image. Second, the accurate absorbent model of the metal image is calculated according to the spectrum distribution of the X-ray beam source and the energy-dependent attenuation coefficients of the metal. Third, a new metal image is reconstructed by a general analytical reconstruction algorithm such as filtered back projection (FBP). The prior image is obtained by segmenting the difference image between the initial image and the new metal image into air, tissue, and bone. Fourth, the initial projection data are normalized by dividing them, pixel by pixel, by the projection data of the prior image. The final corrected image is obtained by interpolation, denormalization, and reconstruction. Several clinical images with dental fillings and knee prostheses were used to evaluate the proposed algorithm against the normalized metal artifact reduction (NMAR) and linear interpolation (LI) methods. The results demonstrate that the artifacts were reduced efficiently by the proposed method, which obtains an exact prior image using the prior information about the X-ray beam source and the energy-dependent attenuation coefficients of the metal. As a result, better performance in reducing beam hardening artifacts can be achieved. Moreover, the process is rather simple and requires little extra calculation; it has advantages over other algorithms when multiple and/or large implants are included.
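The normalization idea shared by NMAR and the proposed method can be sketched as follows. This is a minimal illustration with placeholder array shapes; the forward projection, prior segmentation, and FBP steps named above would come from a CT reconstruction toolbox and are outside this sketch.

    import numpy as np

    def normalized_mar(proj, proj_prior, metal_trace):
        """Sketch of normalization-based metal artifact reduction.

        proj        : measured sinogram (views x detectors)
        proj_prior  : forward projection of the segmented prior image
        metal_trace : boolean mask of metal-affected sinogram bins
        Returns a corrected sinogram; the final image would come from
        an analytical reconstruction (e.g., FBP) of this sinogram.
        """
        eps = 1e-6
        # 1) Normalize the measured data by the prior's projections,
        #    flattening the sinogram so interpolation errors shrink.
        norm = proj / (proj_prior + eps)
        # 2) Replace metal-affected bins by linear interpolation along
        #    each detector row of the normalized data.
        for v in range(norm.shape[0]):
            row, mask = norm[v], metal_trace[v]
            if mask.any() and (~mask).any():
                idx = np.arange(row.size)
                row[mask] = np.interp(idx[mask], idx[~mask], row[~mask])
        # 3) Denormalize to restore the original scale.
        return norm * (proj_prior + eps)

The paper's contribution, on this reading, is in how the prior image is built (spectrum-aware absorbent model of the metal), not in the normalization mechanics sketched here.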
The Human is the Loop: New Directions for Visual Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander; Hossain, Shahriar H.; Ramakrishnan, Naren
2014-01-28
Visual analytics is the science of marrying interactive visualizations and analytic algorithms to support exploratory knowledge discovery in large datasets. We argue for a shift from a ‘human in the loop’ philosophy for visual analytics to a ‘human is the loop’ viewpoint, where the focus is on recognizing analysts’ work processes, and seamlessly fitting analytics into that existing interactive process. We survey a range of projects that provide visual analytic support contextually in the sensemaking loop, and outline a research agenda along with future challenges.
NASA Technical Reports Server (NTRS)
King, R. B.; Fordyce, J. S.; Antoine, A. C.; Leibecki, H. F.; Neustadter, H. E.; Sidik, S. M.
1976-01-01
Concentrations of 60 chemical elements in the airborne particulate matter were measured at 16 sites in Cleveland, OH over a 1 year period during 1971 and 1972 (45 to 50 sampling days). Analytical methods used included instrumental neutron activation, emission spectroscopy, and combustion techniques. Uncertainties in the concentrations associated with the sampling procedures, the analytical methods, the use of several analytical facilities, and samples with concentrations below the detection limits are evaluated in detail. The data are discussed in relation to other studies and source origins. The trace constituent concentrations as a function of wind direction are used to suggest a practical method for air pollution source identification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, F; Park, J; Barraclough, B
2016-06-15
Purpose: To develop an efficient and accurate independent dose calculation algorithm with a simplified analytical source model for the quality assurance and safe delivery of Flattening Filter Free (FFF)-IMRT on an Elekta Versa HD. Methods: The source model consisted of a point source and a 2D bivariate Gaussian source, respectively modeling the primary photons and the combined effect of head scatter, monitor chamber backscatter, and the collimator exchange effect. The in-air fluence was first calculated by back-projecting the edges of the beam defining devices onto the source plane and integrating the visible source distribution. The effect of the rounded MLC leaf end, tongue-and-groove, and interleaf transmission was taken into account in the back-projection. The in-air fluence was then modified with a fourth-degree polynomial modeling the cone-shaped dose distribution of FFF beams. The planar dose distribution was obtained by convolving the in-air fluence with a dose deposition kernel (DDK) consisting of the sum of three 2D Gaussian functions. The parameters of the source model and the DDK were commissioned using measured in-air output factors (Sc) and cross beam profiles, respectively. A novel method was used to eliminate the volume averaging effect of ion chambers in determining the DDK. Planar dose distributions of five head-and-neck FFF-IMRT plans were calculated and compared against measurements performed with a 2D diode array (MapCHECK™) to validate the accuracy of the algorithm. Results: The proposed source model predicted Sc for both 6MV and 10MV with an accuracy better than 0.1%. With a stringent gamma criterion (2%/2mm/local difference), the passing rate of the FFF-IMRT dose calculation was 97.2±2.6%. Conclusion: The removal of the flattening filter represents a simplification of the head structure which allows the use of a simpler source model for very accurate dose calculation. The proposed algorithm offers an effective way to ensure the safe delivery of FFF-IMRT.
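One plausible parameterization of the dose deposition kernel described above (the abstract specifies a sum of three 2D Gaussians but not their exact form; the normalization and the names \(A_i\), \(\sigma_i\) below are assumptions) is

    \[
    K(x,y) \;=\; \sum_{i=1}^{3} \frac{A_i}{2\pi\sigma_i^2}\,
    \exp\!\left(-\frac{x^2+y^2}{2\sigma_i^2}\right),
    \qquad
    D(x,y) \;=\; (\Phi * K)(x,y),
    \]

where \(\Phi\) is the modified in-air fluence, \(*\) denotes 2-D convolution, and the amplitudes \(A_i\) and widths \(\sigma_i\) are fitted during commissioning against measured cross beam profiles.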
Retention of minority participants in clinical research studies.
Keller, Colleen S; Gonzales, Adelita; Fleuriet, K Jill
2005-04-01
Recruitment of minority participants for clinical research studies has been the topic of several analytical works. Yet retention of participants, most notably minority and underserved populations, is less reported and understood, even though these populations have elevated health risks. This article describes two related, intervention-based formative research projects in which researchers used treatment theory to address issues of recruitment and retention of minority women participants in an exercise program to reduce obesity. Treatment theory incorporates a model of health promotion that allows investigators to identify and control sources of extraneous variables. The authors' research demonstrates that treatment theory can improve retention of minority women participants by considering critical inputs, mediating processes, and substantive participant characteristics in intervention design.
Sensitivity and systematics of calorimetric neutrino mass experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nucciotti, A.; Cremonesi, O.; Ferri, E.
2009-12-16
A large calorimetric neutrino mass experiment using thermal detectors is expected to play a crucial role in the challenge of directly assessing the neutrino mass. We discuss and compare here two approaches for estimating the experimental sensitivity of such an experiment. The first method uses an analytic formulation and readily yields a close estimate over a wide range of experimental configurations. The second method is based on a Monte Carlo technique and is more precise and reliable. The Monte Carlo approach is then exploited to study some sources of systematic uncertainty peculiar to calorimetric experiments. Finally, the tools are applied to investigate the optimal experimental configuration of the MARE project.
NASA Astrophysics Data System (ADS)
Deuerlein, Jochen; Meyer-Harries, Lea; Guth, Nicolai
2017-07-01
Drinking water distribution networks are part of critical infrastructures and are exposed to a number of different risks. One of them is the risk of unintended or deliberate contamination of the drinking water within the pipe network. Over the past decade research has focused on the development of new sensors that are able to detect malicious substances in the network and early warning systems for contamination. In addition to the optimal placement of sensors, the automatic identification of the source of a contamination is an important component of an early warning and event management system for security enhancement of water supply networks. Many publications deal with the algorithmic development; however, only little information exists about the integration within a comprehensive real-time event detection and management system. In the following the analytical solution and the software implementation of a real-time source identification module and its integration within a web-based event management system are described. The development was part of the SAFEWATER project, which was funded under FP 7 of the European Commission.
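One common formulation of source identification (a sketch of the general matching idea, not necessarily the analytical solution implemented in SAFEWATER) compares measured sensor responses against precomputed unit-injection responses for each candidate source node:

    import numpy as np

    # Columns of A: precomputed sensor responses to a unit injection at
    # each candidate source node (from a network water-quality simulation).
    # m: measured sensor pattern. All values here are hypothetical.
    A = np.array([[0.9, 0.1, 0.0],
                  [0.3, 0.8, 0.1],
                  [0.0, 0.2, 0.7]])
    m = np.array([0.85, 0.35, 0.05])

    # Score each candidate by the least-squares residual of the best
    # nonnegative scaling of its unit response.
    scales = np.maximum(A.T @ m / np.einsum('ij,ij->j', A, A), 0.0)
    residuals = np.linalg.norm(A * scales - m[:, None], axis=0)
    print("most likely source node:", int(np.argmin(residuals)))

A real-time module would additionally handle time-varying hydraulics and sensor noise, which is where the integration with the event management system described above becomes essential.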
Integration of SAR and DEM data: Geometrical considerations
NASA Technical Reports Server (NTRS)
Kropatsch, Walter G.
1991-01-01
General principles for integrating data from different sources are derived from the experience of registering SAR images with digital elevation model (DEM) data. The integration consists of establishing geometrical relations between the data sets that allow us to accumulate information from both data sets for any given object point (e.g., elevation, slope, backscatter of ground cover, etc.). Since the geometries of the two data sets are completely different, they cannot be compared on a pixel-by-pixel basis. The presented approach detects instances of higher level features in both data sets independently and performs the matching at the high level. Besides the efficiency of this general strategy, it further allows the integration of additional knowledge sources: world knowledge and sensor characteristics are also useful sources of information. The SAR features layover and shadow can be detected easily in SAR images. An analytical method to find such regions in a DEM additionally needs the parameters of the flight path of the SAR sensor and the range projection model. The generation of the SAR layover and shadow maps is summarized and new extensions to this method are proposed.
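For the shadow part, a minimal 1-D sketch along a range line gives the flavor of the DEM-side computation (hypothetical far-field geometry; a real implementation must also handle layover via the slant-range projection and the actual flight path):

    import math

    def shadow_mask(elev, dx, look_angle_deg):
        """Mark DEM samples hidden from a distant side-looking radar.

        elev: terrain heights along the range direction (sensor side first)
        dx:   ground spacing between samples
        look_angle_deg: radar off-nadir look angle
        """
        # For a distant sensor, rays arrive at a fixed depression angle.
        depression = math.radians(90.0 - look_angle_deg)
        tan_dep = math.tan(depression)
        shadowed = [False] * len(elev)
        # Sweep away from the sensor, tracking the shadow-boundary ray
        # cast by the highest terrain seen so far.
        horizon = elev[0]
        for i in range(1, len(elev)):
            horizon -= dx * tan_dep   # the boundary ray descends with range
            if elev[i] < horizon:
                shadowed[i] = True
            else:
                horizon = elev[i]     # terrain re-emerges into illumination
        return shadowed

    print(shadow_mask([10.0, 50.0, 20.0, 15.0, 40.0], dx=10.0,
                      look_angle_deg=55.0))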
Mid- and long-term debris environment projections using the EVOLVE and CHAIN models
NASA Astrophysics Data System (ADS)
Eichler, Peter; Reynolds, Robert C.
1995-06-01
Results of debris environment projections are of great importance for evaluating the necessity and effectiveness of debris mitigation measures. EVOLVE and CHAIN are two models for debris environment projections that have been developed independently using different conceptual approaches. A comparison of results from these two models therefore provides a means of validating the debris environment projections they have made. EVOLVE is a model that requires mission model projections to describe future space operations; these projections include launch date, mission orbit altitude and inclination, mission duration, vehicle size and mass, and classification as an object capable of experiencing breakup from on-board stored energy. EVOLVE describes the orbital debris environment by the orbital elements of the objects in the environment. CHAIN is an analytic model that bins the debris environment in size and altitude rather than following the orbit evolution of individual debris fragments. The altitude/size bins are coupled by the initial spreading of fragments by collisions and the subsequent orbital decay behavior. A set of test cases covering a variety of space usage scenarios has been defined for the two models. In this paper, a comparison of the results is presented and sources of disagreement are identified and discussed. One major finding is that despite differences in the results of the two models, the basic tendencies of the environment projections are independent of modeled uncertainties, leading to the demand for debris mitigation measures: explosion suppression and de-orbit of rocket bodies and payloads after mission completion.
Three-dimensional analytical solutions of the atmospheric diffusion equation with multiple sources and height-dependent wind speed and eddy diffusivities are derived in a systematic fashion. For homogeneous Neumann (total reflection), Dirichlet (total adsorpti...
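The governing equation being solved, in its standard steady-state advection-diffusion form with height-dependent wind speed and eddy diffusivities (a textbook statement of the problem; the paper's exact notation is not given in the abstract), is

    \[
    u(z)\,\frac{\partial c}{\partial x}
    \;=\;
    \frac{\partial}{\partial y}\!\left(K_y(z)\,\frac{\partial c}{\partial y}\right)
    +\frac{\partial}{\partial z}\!\left(K_z(z)\,\frac{\partial c}{\partial z}\right)
    +\;S,
    \]

where \(c\) is the concentration, \(u(z)\) the wind speed, \(K_y(z)\) and \(K_z(z)\) the eddy diffusivities, and \(S\) the (multiple) source term; the Neumann and Dirichlet boundary conditions named above close the problem at the ground and the top of the domain.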
Hardcastle, Chris D; Harris, Joel M
2015-08-04
The ability of a vesicle membrane to preserve a pH gradient, while allowing for diffusion of neutral molecules across the phospholipid bilayer, can provide the isolation and preconcentration of ionizable compounds within the vesicle interior. In this work, confocal Raman microscopy is used to observe (in situ) the pH-gradient preconcentration of compounds into individual optically trapped vesicles that provide sub-femtoliter collectors for small-volume samples. The concentration of analyte accumulated in the vesicle interior is determined relative to a perchlorate-ion internal standard, preloaded into the vesicle along with a high-concentration buffer. As a guide to the experiments, a model for the transfer of analyte into the vesicle based on acid-base equilibria is developed to predict the concentration enrichment as a function of source-phase pH and analyte concentration. To test the concept, the accumulation of benzyldimethylamine (BDMA) was measured within individual 1 μm phospholipid vesicles having a stable initial pH that is 7 units lower than the source phase. For low analyte concentrations in the source phase (100 nM), a concentration enrichment into the vesicle interior of (5.2 ± 0.4) × 10^5 was observed, in agreement with the model predictions. Detection of BDMA from a 25 nM source-phase sample was demonstrated, a noteworthy result for an unenhanced Raman scattering measurement. The developed model accurately predicts the falloff of enrichment (and measurement sensitivity) at higher analyte concentrations, where the transfer of greater amounts of BDMA into the vesicle titrates the internal buffer and decreases the pH gradient. The predictable calibration response over 4 orders of magnitude in source-phase concentration makes it suitable for quantitative analysis of ionizable compounds from small-volume samples. The kinetics of analyte accumulation are relatively fast (∼15 min) and are consistent with the rate of transfer of a polar aromatic molecule across a gel-phase phospholipid membrane.
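The acid-base equilibrium model can be illustrated with the standard ion-trapping relation for a weak base, in its simplest form (the relation below is the textbook limit, and the pKa and pH values are assumptions for illustration, not values from the paper):

    def enrichment(pKa, pH_out, pH_in):
        # Ideal ion trapping of a weak base: only the neutral form crosses
        # the membrane, so at equilibrium the total (neutral + protonated)
        # concentrations ratio as below.
        return (1 + 10 ** (pKa - pH_in)) / (1 + 10 ** (pKa - pH_out))

    # Hypothetical values: a benzylamine-type base (pKa assumed ~9) and a
    # vesicle interior 7 pH units below the source phase.
    print(f"predicted enrichment: {enrichment(9.0, 9.0, 2.0):.1e}")

The falloff at high analyte loading reported above corresponds to pH_in drifting upward as the internal buffer is titrated, which shrinks the (pKa − pH_in) term and hence the enrichment.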
ThinkHazard!: an open-source, global tool for understanding hazard information
NASA Astrophysics Data System (ADS)
Fraser, Stuart; Jongman, Brenden; Simpson, Alanna; Nunez, Ariel; Deparday, Vivien; Saito, Keiko; Murnane, Richard; Balog, Simone
2016-04-01
Rapid and simple access to added-value natural hazard and disaster risk information is a key issue for various stakeholders of the development and disaster risk management (DRM) domains. Accessing available data often requires specialist knowledge of heterogeneous data, which are often highly technical and can be difficult for non-specialists in DRM to find and exploit. Thus, availability, accessibility and processing of these information sources are crucial issues, and an important reason why many development projects suffer significant impacts from natural hazards. The World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR) is currently developing a new open-source tool to address this knowledge gap: ThinkHazard! The main aim of the ThinkHazard! project is to develop an analytical tool dedicated to facilitating improvements in knowledge and understanding of natural hazards among non-specialists in DRM. It also aims at providing users with relevant guidance and information on handling the threats posed by the natural hazards present in a chosen location. Furthermore, all aspects of this tool will be open and transparent, in order to give users enough information to understand its operational principles. In this presentation, we will explain the technical approach behind the tool, which translates state-of-the-art probabilistic natural hazard data into understandable hazard classifications and practical recommendations. We will also demonstrate the functionality of the tool, and discuss limitations from a scientific as well as an operational perspective.
NASA Astrophysics Data System (ADS)
Martin, D.; Shallcross, D.; Nickless, G.; White, I.
2005-12-01
Transport, dispersion, and the ultimate fate of pollutants have very important implications for the environment at the urban, regional, and global scales. Localised emissions of both man-made and naturally produced pollutants can directly and indirectly impact the health of inhabitants. The DAPPLE (Dispersion of Air Pollutants and their Penetration into the Local Environment) consortium consists of six universities and takes a multidisciplinary approach to studying relatively small-scale urban atmospheric dispersion. Wind tunnel modelling studies, computational fluid dynamics simulations, fieldwork studies using tracers, and dispersion modelling were all carried out in an attempt to achieve this. In this paper we report on tracer dispersion experiments carried out in May 2003 and June 2004. These involved the release of various perfluorocarbon (PFC) tracers centred on Marylebone Road in London. These compounds are inert, non-reactive, and have a very low atmospheric background concentration with little variability. These properties make them ideal atmospheric tracers, and this, combined with an ultra-sensitive analytical technique (sample pre-concentration on carbon-based adsorbents followed by detection with negative ion chemical ionization mass spectrometry), makes very small release amounts feasible. The source-receptor relationship is studied for various source and receptor positions and distances. Source-receptor relationships for both rooftop and indoor positions were evaluated as part of the project. Results of concurrent meteorological measurements are also presented, as well as comparisons with a number of simple dispersion models.
Setting Learning Analytics in Context: Overcoming the Barriers to Large-Scale Adoption
ERIC Educational Resources Information Center
Ferguson, Rebecca; Macfadyen, Leah P.; Clow, Doug; Tynan, Belinda; Alexander, Shirley; Dawson, Shane
2014-01-01
A core goal for most learning analytic projects is to move from small-scale research towards broader institutional implementation, but this introduces a new set of challenges because institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires explicit and…
External Standards or Standard Addition? Selecting and Validating a Method of Standardization
NASA Astrophysics Data System (ADS)
Harvey, David T.
2002-05-01
A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method, and obtain a practical experience in the difference between performing an external standardization and a standard addition.
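For concreteness, a minimal sketch of the standard-addition calculation such an experiment builds toward (hypothetical data; assumes a linear response and negligible dilution on spiking):

    # Standard addition: fit signal vs. added standard concentration and
    # extrapolate to the x-intercept; its magnitude is the analyte
    # concentration in the matrix-matched sample. Hypothetical data.
    added = [0.0, 1.0, 2.0, 3.0]      # added standard, ppm
    signal = [0.20, 0.32, 0.44, 0.56]

    n = len(added)
    mean_x = sum(added) / n
    mean_y = sum(signal) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(added, signal)) \
            / sum((x - mean_x) ** 2 for x in added)
    intercept = mean_y - slope * mean_x

    c_analyte = intercept / slope      # magnitude of the x-intercept
    print(f"analyte concentration: {c_analyte:.2f} ppm")

Because the standards are measured in the same matrix as the sample, the slope already includes any matrix effect, which is exactly the distinction from external standardization that the experiment is designed to teach.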
Planning for Low End Analytics Disruptions in Business School Curricula
ERIC Educational Resources Information Center
Rienzo, Thomas; Chen, Kuanchin
2018-01-01
Analytics is getting a great deal of attention in both industrial and academic venues. Organizations of all types are becoming more serious about transforming data from a variety of sources into insight, and analytics is the key to that transformation. Academic institutions are rapidly responding to the demand for analytics talent, with hundreds…
Wang, Quanxin; Sporns, Olaf; Burkhalter, Andreas
2012-01-01
Much of the information used for visual perception and visually guided actions is processed in complex networks of connections within the cortex. To understand how this works in the normal brain and to determine the impact of disease, mice are promising models. In primate visual cortex, information is processed in a dorsal stream specialized for visuospatial processing and guided action and a ventral stream for object recognition. Here, we traced the outputs of 10 visual areas and used quantitative graph analytic tools of modern network science to determine, from the projection strengths in 39 cortical targets, the community structure of the network. We found a high density of the cortical graph that exceeded that previously shown in monkey. Each source area showed a unique distribution of projection weights across its targets (i.e. connectivity profile) that was well-fit by a lognormal function. Importantly, the community structure was strongly dependent on the location of the source area: outputs from medial/anterior extrastriate areas were more strongly linked to parietal, motor and limbic cortex, whereas lateral extrastriate areas were preferentially connected to temporal and parahippocampal cortex. These two subnetworks resemble dorsal and ventral cortical streams in primates, demonstrating that the basic layout of cortical networks is conserved across species. PMID:22457489
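A hedged sketch of the kind of weighted community detection described (toy area names and weights only; the paper's analysis used measured projection strengths from 10 source areas across 39 cortical targets):

    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # Toy weighted cortical graph: edges carry projection strengths.
    G = nx.Graph()
    G.add_weighted_edges_from([
        ("V1", "AL", 0.8), ("V1", "PM", 0.7),
        ("AL", "TEa", 0.6), ("PM", "RSP", 0.5),
        ("AL", "PM", 0.1),  # weak cross-stream link
    ])

    # Weighted modularity maximization recovers putative processing streams.
    communities = greedy_modularity_communities(G, weight="weight")
    for i, c in enumerate(communities):
        print(f"module {i}: {sorted(c)}")

On such a graph, the strong within-stream weights pull the medial/anterior and lateral groups into separate modules, mirroring the dorsal/ventral split reported above.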
1987-02-20
... Bacteriology; 8 years professional experience; served as Project Health and Safety Officer. Duane R. Boline: Ph.D. in Analytical Chemistry; M.S. in Chemistry; B.S.E. in Physical Science; 18 years professional experience; served as Project Quality Assurance Officer. Complete biographical data: ... University, 1962; M.S., Chemistry, Emporia State University, 1965; Ph.D., Analytical Chemistry, Kansas State University, 1975.
Pavement Performance : Approaches Using Predictive Analytics
DOT National Transportation Integrated Search
2018-03-23
Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indictors that include pavement distress,...
Project Management in NASA: The system and the men
NASA Technical Reports Server (NTRS)
Pontious, R. H.; Barnes, L. B.
1973-01-01
An analytical description of the NASA project management system is presented with emphasis on the human element. The NASA concept of project management, program managers, and the problems and strengths of the NASA system are discussed.
Muehlwald, S; Buchner, N; Kroh, L W
2018-03-23
Because of the high number of possible pesticide residues and their chemical complexity, it is necessary to develop methods which cover a broad range of pesticides. In this work, a qualitative multi-screening method for pesticides was developed using HPLC-ESI-Q-TOF. 110 pesticides were chosen for the creation of a personal compound database and library (PCDL). The MassHunter Qualitative Analysis software from Agilent Technologies was used to identify the analytes. The software parameter settings were optimised to produce a low number of false positive as well as false negative results. The method was validated for 78 selected pesticides. However, the validation criteria were not fulfilled for 45 analytes. Due to this result, investigations were started to elucidate reasons for the low detectability. It could be demonstrated that the three main causes of the signal suppression were the co-eluting matrix (matrix effect), the low sensitivity of the analyte in standard solution, and the fragmentation of the analyte in the ion source (in-source collision-induced dissociation). In this paper different examples are discussed showing that the impact of these three causes is different for each analyte. For example, it is possible that an analyte with low signal intensity and an intense fragmentation in the ion source is detectable in a difficult matrix, whereas an analyte with a high sensitivity and a low fragmentation is not detectable in a simple matrix. Additionally, it could be shown that in-source fragments are a helpful tool for an unambiguous identification.
Current projects in Pre-analytics: where to go?
Sapino, Anna; Annaratone, Laura; Marchiò, Caterina
2015-01-01
The current clinical practice of tissue handling and sample preparation is multifaceted and lacks strict standardisation: this scenario leads to significant variability in the quality of clinical samples. Poor tissue preservation has a detrimental effect, leading to morphological artefacts, hampering the reproducibility of immunocytochemical and molecular diagnostic results (protein expression, DNA gene mutations, RNA gene expression) and affecting research outcomes with irreproducible gene expression and post-transcriptional data. Altogether, this limits the opportunity to share and pool national databases into European common databases. At the European level, standardization of pre-analytical steps is just at the beginning, and issues regarding bio-specimen collection and management are still debated. A joint (public-private) project on the standardization of tissue handling in pre-analytical procedures has recently been funded in Italy with the aim of proposing novel approaches to the neglected issue of pre-analytical procedures. In this chapter, we will show how investing in pre-analytics may impact both public health problems and practical innovation in solid tumour processing.
Health Informatics for Neonatal Intensive Care Units: An Analytical Modeling Perspective
Mench-Bressan, Nadja; McGregor, Carolyn; Pugh, James Edward
2015-01-01
The effective use of data within intensive care units (ICUs) has great potential to create new cloud-based health analytics solutions for disease prevention or earlier detection of condition onset. The Artemis project aims to achieve these goals in the area of neonatal ICUs (NICU). In this paper, we propose an analytical model for the Artemis cloud project, which will be deployed at McMaster Children's Hospital in Hamilton. We collect not only physiological data but also data from the infusion pumps attached to NICU beds. Using the proposed analytical model, we predict the amount of storage, memory, and computation power required for the system. Capacity planning and tradeoff analysis become more accurate and systematic when the proposed analytical model is applied. Numerical results are obtained using real inputs acquired from McMaster Children's Hospital and a pilot deployment of the system at The Hospital for Sick Children (SickKids) in Toronto. PMID:27170907
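The storage and throughput sizing that such a model produces reduces to simple rate arithmetic; here is a back-of-the-envelope sketch in which every parameter value is an illustrative assumption, not a figure from the Artemis deployment.

# Illustrative NICU data-volume estimate; all parameters are assumed.
beds = 40                 # monitored NICU beds
streams_per_bed = 6       # physiological + infusion-pump data streams
sample_rate_hz = 1.0      # readings per second per stream
bytes_per_sample = 16     # timestamped reading
seconds_per_day = 86_400

daily = beds * streams_per_bed * sample_rate_hz * bytes_per_sample * seconds_per_day
print(f"~{daily / 1e9:.2f} GB/day raw, ~{daily * 365 / 1e12:.2f} TB/year")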
Pilot testing of SHRP 2 reliability data and analytical products: Washington. [supporting datasets
DOT National Transportation Integrated Search
2014-01-01
The Washington site used the reliability guide from Project L02, analysis tools for forecasting reliability and estimating impacts from Project L07, Project L08, and Project C11 as well as the guide on reliability performance measures from the Projec...
40 CFR 141.402 - Ground water source microbial monitoring and analytical methods.
Code of Federal Regulations, 2013 CFR
2013-07-01
... for the presence of E. coli, enterococci, or coliphage: Analytical Methods for Source Water Monitoring... Microbiology, 62:3881-3884. 10 EPA Method 1601: Male-specific (F+) and Somatic Coliphage in Water by Two-step... 20460. 11 EPA Method 1602: Male-specific (F+) and Somatic Coliphage in Water by Single Agar Layer (SAL...
40 CFR 141.402 - Ground water source microbial monitoring and analytical methods.
Code of Federal Regulations, 2014 CFR
2014-07-01
... for the presence of E. coli, enterococci, or coliphage: Analytical Methods for Source Water Monitoring... Microbiology, 62:3881-3884. 10 EPA Method 1601: Male-specific (F+) and Somatic Coliphage in Water by Two-step... 20460. 11 EPA Method 1602: Male-specific (F+) and Somatic Coliphage in Water by Single Agar Layer (SAL...
Humidity Effects on Fragmentation in Plasma-Based Ambient Ionization Sources
NASA Astrophysics Data System (ADS)
Newsome, G. Asher; Ackerman, Luke K.; Johnson, Kevin J.
2016-01-01
Post-plasma ambient desorption/ionization (ADI) sources are fundamentally dependent on surrounding water vapor to produce protonated analyte ions. There are two reports of humidity effects on ADI spectra. However, it is unclear whether humidity will affect all ADI sources and analytes, and by what mechanism humidity affects spectra. Flowing atmospheric pressure afterglow (FAPA) ionization and direct analysis in real time (DART) mass spectra of various surface-deposited and gas-phase analytes were acquired at ambient temperature and pressure across a range of observed humidity values. A controlled humidity enclosure around the ion source and mass spectrometer inlet was used to create programmed humidity and temperatures. The relative abundance and fragmentation of molecular adduct ions for several compounds consistently varied with changing ambient humidity and also were controlled with the humidity enclosure. For several compounds, increasing humidity decreased protonated molecule and other molecular adduct ion fragmentation in both FAPA and DART spectra. For others, humidity increased fragment ion ratios. The effects of humidity on molecular adduct ion fragmentation were caused by changes in the relative abundances of different reagent protonated water clusters and, thus, a change in the average difference in proton affinity between an analyte and the population of water clusters. Control of humidity in ambient post-plasma ion sources is needed to create spectral stability and reproducibility.
NASA Astrophysics Data System (ADS)
Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min
2017-09-01
The Halbach-type hollow cylindrical permanent magnet array (HCPMA) is a volume-compact and energy-conserving field source that has attracted intense interest in many practical applications. Here, using the complex variable integration method based on the Biot-Savart law (including current distributions inside the body and on the surfaces of the magnet), we derive analytical field solutions for an ideal multipole HCPMA in the entire space, including the interior of the magnet. The analytic field expression inside the array material is used to construct an analytic demagnetization function, with which we can explain the origin of demagnetization phenomena in HCPMAs by taking into account an ideal magnetic hysteresis loop with finite coercivity. These analytical field expressions and demagnetization functions provide deeper insight into the nature of such permanent magnet array systems and offer guidance in designing optimized array systems.
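For orientation, the best-known special case of such solutions is the ideal dipole (k = 2) Halbach cylinder, whose bore field depends only on the remanence and the radius ratio; the general multipole and interior expressions derived in the paper are not reproduced here:

\[ B_{\text{bore}} = B_r \ln\!\left(\frac{r_o}{r_i}\right), \]

where \(B_r\) is the remanence of the magnet material and \(r_i\), \(r_o\) are the inner and outer radii of the array.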
Analytical approach of laser beam propagation in the hollow polygonal light pipe.
Zhu, Guangzhi; Zhu, Xiao; Zhu, Changhong
2013-08-10
An analytical method is developed for the light distribution on the output end of a hollow n-sided polygonal light pipe illuminated by a light source with a Gaussian distribution. Mirror transformation matrices and a special algorithm for removing void virtual images are created to acquire the location and direction vector of each effective virtual image on the entrance plane. The analytical method is verified by Monte Carlo ray tracing. At the same time, four typical cases are discussed. The analytical results indicate that the uniformity of the light distribution varies with the structural and optical parameters of the hollow n-sided polygonal light pipe and the Gaussian light source. The analytical approach will be useful for designing and choosing hollow n-sided polygonal light pipes, especially for high-power laser beam homogenization techniques.
Sampling probe for microarray read out using electrospray mass spectrometry
Van Berkel, Gary J.
2004-10-12
An automated electrospray based sampling system and method for analysis obtains samples from surface array spots having analytes. The system includes at least one probe, the probe including an inlet for flowing at least one eluting solvent to respective ones of a plurality of spots and an outlet for directing the analyte away from the spots. An automatic positioning system is provided for translating the probe relative to the spots to permit sampling of any spot. An electrospray ion source having an input fluidicly connected to the probe receives the analyte and generates ions from the analyte. The ion source provides the generated ions to a structure for analysis to identify the analyte, preferably being a mass spectrometer. The probe can be a surface contact probe, where the probe forms an enclosing seal along the periphery of the array spot surface.
Dynamic mobility applications analytical needs assessment.
DOT National Transportation Integrated Search
2012-07-01
Dynamic Mobility Applications Analytical Needs Assessment was a one-year project (July 2011 to July 2012) to develop a strategy for assessing the potential impact of twenty-eight applications for improved mobility across national transportation syste...
Field Sampling and Selecting On-Site Analytical Methods for Explosives in Soil
The purpose of this issue paper is to provide guidance to Remedial Project Managers regarding field sampling and on-site analytical methods for detecting and quantifying secondary explosive compounds in soils.
Monitoring IACP samples and construction of a centralized data base
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walsh, D.B.; Ray, D.B.; Simonson, J.
1991-01-01
The Integrated Air Cancer Project (IACP) is a multiyear US EPA research program established to develop and evaluate methods required to identify the principal airborne carcinogens, determine emission sources, and improve the estimate of comparative human cancer risk. The first major field study, designed to examine a residential wood combustion airshed, was conducted in Boise, Idaho during the 1986-1987 winter heating season. The second major field study, conducted in Roanoke, Virginia during 1988-1989, examined residential oil heating and wood combustion. Motor vehicle emissions were considered a major combustion-product contributor in both airsheds. This paper describes two critical components of the project. The first component is the sample custody and tracking of the samples before analysis. The second component describes the data management of the sample field data (e.g., sample site, time, date, flow rate) as well as the analytical data (e.g., mutagenicity, particle concentrations) for the environmental samples.
Exploring middle school students' use of inscriptions in project-based science classrooms
NASA Astrophysics Data System (ADS)
Wu, Hsin-Kai; Krajcik, Joseph S.
2006-09-01
This study explores seventh graders' use of inscriptions in a teacher-designed project-based science unit. To investigate students' learning practices during the 8-month water quality unit, we collected multiple sources of data (e.g., classroom video recordings, student artifacts, and teacher interviews) and employed analytical methods that drew from a naturalistic approach. The findings showed that throughout the unit, provided with the teachers' scaffold and social, conceptual, and material resources, the seventh graders were able to use various inscriptions (e.g., digital pictures, Web pages, and models) to demonstrate meaningful inscriptional practices such as creating and using inscriptions to make arguments, to represent conceptual understandings, and to engage in thoughtful discussions. Inscriptions and associated practices provided students with experiences and understandings about certain ways to organize, transform, and link data or scientific ideas. However, when constructing inscriptions, students did not consider how the inscriptions could serve certain reasoning purposes. In addition, more scaffolds were needed to help students use multiple inscriptions to make a coherent argument.
NASA Astrophysics Data System (ADS)
Daly, Aoife; Streeton, Noëlle L. W.
2017-06-01
A technique for non-invasive dendrochronological analysis of oak was developed for archaeological material, using an industrial CT scanner. Since 2013, this experience has been extended within the scope of the research project 'After the Black Death: Painting and Polychrome Sculpture in Norway'. The source material for the project is a collection of late-medieval winged altarpieces, shrines, polychrome sculpture, and fragments from Norwegian churches, which are owned by the Museum of Cultural History, University of Oslo. The majority cannot be sampled, and many are too large to fit into the CT scanner. For these reasons, a combined approach was adopted, utilizing CT scanning where possible, but preceded by an 'exposed-wood' imaging technique. Both non-invasive techniques have yielded reliable results, and CT scanning has confirmed the reliability of the imaging technique alone. This paper presents the analytical methods, along with results from two of the 13 objects under investigation. Results for reliable dates and provenances provide new foundations for historical interpretations.
MIT CSAIL and Lincoln Laboratory Task Force Report
2016-08-01
projects have been very diverse, spanning several areas of CSAIL concentration, including robotics, big data analytics, wireless communications, computing architectures and...to machine learning systems and algorithms, such as recommender systems, and "Big Data" analytics. Advanced computing architectures broadly refer to
1985-06-28
1984 to April 1985 included installation of 27 monitor wells and 11 ..., collection of sediment samples from surface soil, shallow...modifying a sampling and analytical program that addresses the requirements of the project. If project requirements necessitate different quality...reagent blank and at least five (5) different concentrations of the analyte. A modification of the method of Hubaux and Vos will be used to determine...
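The Hubaux and Vos method cited above derives decision and detection limits from the prediction band of a linear calibration. A minimal sketch of that logic with synthetic data (ordinary least squares and homoscedastic errors assumed; not the project's actual procedure):

# Hubaux-Vos-style detection limit from a linear calibration line.
# Synthetic data and a simplified band formula; illustrative only.
import numpy as np
from scipy import stats

x = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 16.0])          # concentrations
y = 0.05 + 0.5 * x + np.random.default_rng(1).normal(0, 0.05, x.size)

n = x.size
slope, intercept = np.polyfit(x, y, 1)
s = np.sqrt(np.sum((y - (intercept + slope * x))**2) / (n - 2))
t = stats.t.ppf(0.95, n - 2)                            # one-sided 95%
band = t * s * np.sqrt(1 + 1/n + x.mean()**2 / np.sum((x - x.mean())**2))

y_crit = intercept + band                     # upper prediction bound of the blank
x_det = (y_crit - intercept + band) / slope   # approximate Hubaux-Vos limit
print(f"decision signal: {y_crit:.3f}, detection limit: {x_det:.3f}")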
Thermal Performance Analysis of a Geologic Borehole Repository
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reagin, Lauren
2016-08-16
The Brazilian Nuclear Research Institute (IPEN) proposed a design for the disposal of Disused Sealed Radioactive Sources (DSRS) based on the IAEA Borehole Disposal of Sealed Radioactive Sources (BOSS) design that would allow the entirety of Brazil's inventory of DSRS to be disposed of in a single borehole. The proposed IPEN design allows for 170 waste packages (WPs) containing DSRS (such as Co-60 and Cs-137) to be stacked on top of each other inside the borehole. The primary objective of this work was to evaluate the thermal performance of a conservative approach to the IPEN proposal with the equivalent of two WPs and two different inside configurations using Co-60 as the radioactive heat source. The current WP configuration (heterogeneous) for the IPEN proposal has 60% of the WP volume occupied by a nuclear radioactive heat source and the remaining 40% as vacant space. The second configuration (homogeneous) considered for this project has 100% of the WP volume occupied by a nuclear radioactive heat source. The computational models for the thermal analyses of the WP configurations with the Co-60 heat source considered three different cooling mechanisms (conduction, radiation, and convection) and the effect of mesh size on the results of the thermal analysis. The analyses yielded maximum temperatures inside the WPs for both WP configurations and various mesh sizes. The heterogeneous WP considered the cooling mechanisms of conduction, convection, and radiation. The temperature results from the heterogeneous WP analysis suggest that the model is cooled predominantly by conduction, with the effects of radiation and natural convection on cooling being negligible. From the thermal analysis comparing the two WP configurations, the results suggest that either WP configuration could be used for the design. The mesh-sensitivity results verify the meshes used, and the results obtained from the thermal analyses were nearly independent of mesh size. The results from the computational case and the analytically calculated case for the homogeneous WP in benchmarking were almost identical, which indicates that the computational approach used here was successfully verified by the analytical solution.
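The analytical benchmark for the homogeneous package is presumably of the classic conduction form for a cylinder with uniform volumetric heating; a hedged reconstruction of that standard result (the report's exact expression is not reproduced):

\[ T(r) = T_s + \frac{q'''\,(R^2 - r^2)}{4k}, \qquad T_{\max} = T(0) = T_s + \frac{q''' R^2}{4k}, \]

where \(q'''\) is the volumetric heat generation rate of the Co-60 source region, \(R\) the package radius, \(k\) the thermal conductivity, and \(T_s\) the surface temperature.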
Usmanov, Dilshadbek T; Yu, Zhan; Chen, Lee Chuin; Hiraoka, Kenzo; Yamabe, Shinichi
2016-02-01
In this work, a low-pressure air dielectric-barrier discharge (DBD) ion source, using a capillary with an inner diameter of 0.115 mm and a length of 12 mm and applicable to miniaturized mass spectrometers, was developed. The analytes, trinitrotoluene (TNT), 1,3,5-trinitroperhydro-1,3,5-triazine (RDX), 1,3,5,7-tetranitroperhydro-1,3,5,7-tetrazocine (HMX), pentaerythritol tetranitrate (PETN), nitroglycerine (NG), hexamethylene triperoxide diamine (HMTD), caffeine, cocaine and morphine, introduced through the capillary, were ionized by a low-pressure air DBD. The ion source pressure was changed by using various sizes of the ion sampling orifice. The signal intensities of the analytes showed marked pressure dependence. TNT was detected with higher sensitivity at lower pressure, but vice versa for the other analytes. For all analytes, a marked signal enhancement was observed when a grounded cylindrical mesh electrode was installed in the DBD ion source. Among the nine analytes, RDX, HMX, NG and PETN could be detected as cluster ions [analyte + NO3]- even at low pressure and at high temperature up to 180 °C. This indicates that these cluster ions are stable enough to survive under the present experimental conditions. The unexpectedly high stabilities of these cluster ions were verified by density functional theory calculations. Copyright © 2016 John Wiley & Sons, Ltd.
Source-term development for a contaminant plume for use by multimedia risk assessment models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whelan, Gene; McDonald, John P.; Taira, Randal Y.
1999-12-01
Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.
Adamson, J; Newton, J; Steffey, B; Cai, J; Adamovics, J; Oldham, M; Chino, J; Craciunescu, O
2012-06-01
To determine the characteristics of a new commercially available CT-compatible LDR Tandem and Ovoid (T&O) applicator using 3D dosimetry. We characterized source attenuation through the asymmetric gold shielding in the buckets by measuring dose with diode and 3D dosimetry and compared to an analytical line-integral calculation. For 3D dosimetry, a cylindrical PRESAGE dosimeter (9.5 cm diameter, 9.2 cm height) with a central 6 mm channel bored for source placement was scanned with the Duke Large field of view Optical CT-Scanner (DLOS) before and after delivering a nominal 7.7 Gy at a distance of 1 cm using a Cs-137 source loaded in the bucket. The optical CT scan time lasted approximately 15 minutes, during which 720 projections were acquired at 0.5° increments, and a 3D dose distribution was reconstructed with a 0.5 mm³ isotropic voxel size. The 3D dose distribution was applied to a CT-based T&O implant to determine the effect of ovoid shielding on the dose delivered to ICRU 38 Point A as well as the D2cc of the bladder, rectum, bowel, and sigmoid. Dose transmission through the gold shielding at a radial distance of 1-3 cm from the midplane of the source was 86.6%, 86.1%, and 87.0% for the analytical calculation, diode, and 3D dosimetry, respectively. For the gold shielding of the bucket, dose transmission calculated using the 3D dosimetry measurement was found to be lowest at oblique angles from the bucket, with a minimum of ∼51%. For the patient case, attenuation from the buckets led to a decrease in average Point A dose of ∼4% and decreases in D2cc to the bladder, rectum, sigmoid, and bowel of 2%, 15%, 2%, and 7%, respectively. The measured 3D dose distribution provided unique insight into the dosimetry and shielding characteristics of the investigated applicator; the technique can be applied to commissioning of other brachytherapy applicators. John Adamovics is the owner of Heuris Pharma LLC. Partially supported by NIH Grant R01 CA100835-01. © 2012 American Association of Physicists in Medicine.
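The analytical line-integral calculation referred to above is, in its usual form, exponential attenuation along the ray through the shield, which also explains the oblique-angle minimum: for a shield of thickness t traversed at angle θ from the normal, the path length grows as t/cos θ. As a worked check, 86.6% transmission at normal incidence implies μt = -ln(0.866) ≈ 0.14, and a transmission of ∼51% corresponds to a path roughly 4.7 times longer (-ln(0.51) ≈ 0.67). This is a standard-form reconstruction, not the authors' exact formula:

\[ T(\theta) = \exp\!\left(-\int_{\text{ray}} \mu\, d\ell\right) \approx e^{-\mu t/\cos\theta}. \]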
NASA Technical Reports Server (NTRS)
Lueck, Dale E.; Captain, Janine E.; Gibson, Tracy L.; Peterson, Barbara V.; Berger, Cristina M.; Levine, Lanfang
2008-01-01
The RESOLVE project requires an analytical system to identify and quantitate the volatiles released from a lunar drill core sample as it is crushed and heated to 150 °C. The expected gases and their range of concentrations were used to assess gas chromatography (GC) and mass spectrometry (MS), along with specific analyzers, for use on this potential lunar lander. The ability of these systems to accurately quantitate water and hydrogen in an unknown matrix led to the selection of a small MEMS commercial process GC for use in this project. The modification, development, and testing of this instrument for the specific needs of the project are covered.
NASA Technical Reports Server (NTRS)
Ambur, Manjula Y.; Yagle, Jeremy J.; Reith, William; McLarney, Edward
2016-01-01
In 2014, a team of researchers, engineers, and information technology specialists at NASA Langley Research Center developed a Big Data Analytics and Machine Intelligence Strategy and Roadmap as part of Langley's Comprehensive Digital Transformation Initiative, with the goal of identifying the goals, objectives, initiatives, and recommendations needed to develop near-, mid- and long-term capabilities for data analytics and machine intelligence in aerospace domains. Since that time, significant progress has been made in developing pilots and projects in several research, engineering, and scientific domains by following the original strategy of collaboration between mission support organizations, mission organizations, and external partners from universities and industry. This report summarizes the work to date in Data Intensive Scientific Discovery, Deep Content Analytics, and Deep Q&A projects, as well as the progress made in collaboration, outreach, and education. Recommendations for continuing this success into future phases of the initiative are also made.
Dual metal gate tunneling field effect transistors based on MOSFETs: A 2-D analytical approach
NASA Astrophysics Data System (ADS)
Ramezani, Zeinab; Orouji, Ali A.
2018-01-01
A 2-D analytical drain-current model of novel Dual Metal Gate Tunneling Field Effect Transistors based on MOSFETs (DMG-TFET) is presented in this paper. The proposed tunneling FET is derived from a MOSFET structure by employing an additional electrode in the source region with an appropriate work function to induce holes in the N+ source region, thereby converting it into a P+ source region. The electric field is derived and used to obtain the drain current by analytically integrating the band-to-band tunneling generation rate over the tunneling region, based on the potential profile obtained by solving Poisson's equation. Through this model, the effects of the thin-film thickness and gate voltage on the potential and electric field, and the effect of the thin-film thickness on the tunneling current, can be studied. To validate the model, the analytical results were compared with simulations from the SILVACO ATLAS device simulator, and good agreement was found.
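The band-to-band tunneling generation rate in such models is typically Kane's local formula; a hedged sketch of the construction described above (A and B are material-dependent parameters, and the exact form used in the paper may differ):

\[ G_{\mathrm{BTB}}(E) = A\,\frac{E^{2}}{\sqrt{E_g}}\,\exp\!\left(-\frac{B\,E_g^{3/2}}{E}\right), \qquad I_D = q\!\int_{\text{tunneling region}} G_{\mathrm{BTB}}\big(E(x,y)\big)\,dx\,dy, \]

where \(E(x,y)\) is the electric field obtained from the 2-D Poisson solution and \(E_g\) is the band gap.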
Advanced Elemental and Isotopic Characterization of Atmospheric Aerosols
NASA Astrophysics Data System (ADS)
Shafer, M. M.; Schauer, J. J.; Park, J.
2001-12-01
Recent sampling and analytical developments advanced by the project team enable detailed elemental and isotopic fingerprinting of extremely small masses of atmospheric aerosols. Historically, this type of characterization was rarely achieved due to limitations in analytical sensitivity and a lack of awareness concerning the potential for contamination. However, with the introduction of 3rd and 4th generation ICP-MS instrumentation and the application of state-of-the-art "clean" techniques, quantitative analysis of over 40 elements in sub-milligram samples can be realized. When coupled with an efficient and validated solubilization method, ICP-MS approaches provide distinct advantages in comparison with traditional methods: greatly enhanced detection limits, improved accuracy, and isotope resolution capability, to name a few. Importantly, the ICP-MS approach can readily be integrated with techniques that enable phase differentiation and chemical speciation information to be acquired. For example, selective chemical leaching can provide data on the association of metals with major phase components, and on the oxidation state of certain metals. Critical information on metal-ligand stability can be obtained when electrochemical techniques, such as adsorptive cathodic stripping voltammetry (ACSV), are applied to these same extracts. Our research group is applying these techniques in a broad range of research projects to better understand the sources and distribution of trace metals in particulate matter in the atmosphere. Using examples from our research, including recent Pb and Sr isotope ratio work on Asian aerosols, we will illustrate the capabilities and applications of these new methods.
Analytical methods for gelatin differentiation from bovine and porcine origins and food products.
Nhari, Raja Mohd Hafidz Raja; Ismail, Amin; Che Man, Yaakob B
2012-01-01
The use of gelatin in food products has been widely debated for several years, with concerns centering on the source of the gelatin, religion, and health. As a result, various analytical methods have been introduced and developed to determine whether a gelatin is made from porcine or bovine sources. The analytical methods comprise a diverse range of equipment and techniques, including spectroscopy, chemical precipitation, chromatography, and immunochemical methods. Each technique can differentiate gelatins to a certain extent, with its own advantages and limitations. This review provides an overview of the analytical methods available for differentiation of bovine and porcine gelatin and of gelatin in food products, so that new method development can be established. © 2011 Institute of Food Technologists®
ERIC Educational Resources Information Center
Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin
2016-01-01
In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…
Influence of consumers' cognitive style on results from projective mapping.
Varela, Paula; Antúnez, Lucía; Berget, Ingunn; Oliveira, Denize; Christensen, Kasper; Vidal, Leticia; Naes, Tormod; Ares, Gastón
2017-09-01
Projective mapping (PM), one of the most holistic product-profiling methods, is increasingly being used to uncover consumers' perception of products and packages. Assessors rely on a process of synthesis for evaluating product information, which determines the relative importance of the perceived characteristics they use for mapping products. Individual differences are expected, as participants are not instructed on the characteristics to consider when evaluating the degree of difference among samples, generating different perceptual spaces. Individual differences in cognitive style can affect synthesis processes and thus assessors' perception of similarities and differences among samples. In this study, the influence of cognitive style on the results of PM was explored. Two consumer studies were performed, one aimed at describing intrinsic sensory characteristics of chocolate-flavoured milk and the other looking into extrinsic characteristics (package only) of blueberry yogurts. Consumers completed the wholistic-analytic module of the extended Verbal Imagery Cognitive Styles Test & Extended Cognitive Style Analysis-Wholistic Analytic Test to characterize their cognitive style. Differences between wholistic and analytic consumers in how they evaluated samples using projective mapping were found in both studies. Analytic consumers separated the samples more in the PM perceptual space than wholistic consumers, showing greater discriminating ability. This may come from a deeper analysis of the samples, from both intrinsic and extrinsic points of view. From a sensory (intrinsic) perspective, analytic consumers relied on more sensory characteristics, while wholistic consumers mainly discriminated samples according to sweetness and bitterness/chocolate flavour. In the extrinsic study, however, even if analytic consumers discriminated more between packs, they described the products using similar words in the descriptive step. One important recommendation coming from this study is the need to consider higher dimensions in the interpretation of projective mapping tasks, as the first dimensions could underestimate the complexity of the perceptual space; currently, most applications of PM consider only two dimensions, which may not uncover the perception of specific groups of consumers. Copyright © 2017 Elsevier Ltd. All rights reserved.
Hatch, Joseph R.; Bullock, John H.; Finkelman, Robert B.
2006-01-01
In 1999, the USGS initiated the National Coal Quality Inventory (NaCQI) project to address a need for quality information on coals that will be mined during the next 20-30 years. At the time this project was initiated, the publicly available USGS coal quality data was based on samples primarily collected and analyzed between 1973 and 1985. The primary objective of NaCQI was to create a database containing comprehensive, accurate and accessible chemical information on the quality of mined and prepared United States coals and their combustion byproducts. This objective was to be accomplished through maintaining the existing publicly available coal quality database, expanding the database through the acquisition of new samples from priority areas, and analysis of the samples using updated coal analytical chemistry procedures. Priorities for sampling include those areas where future sources of compliance coal are federally owned. This project was a cooperative effort between the U.S. Geological Survey (USGS), State geological surveys, universities, coal burning utilities, and the coal mining industry. Funding support came from the Electric Power Research Institute (EPRI) and the U.S. Department of Energy (DOE).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kertesz, Vilmos; Van Berkel, Gary J
2011-01-01
Analyte electrolysis using a repetitively pulsed high-voltage ion source was investigated and compared to that using a regular, continuously operating direct-current high-voltage ion source in electrospray ionization mass spectrometry. The extent of analyte electrolysis was explored as a function of the length and frequency of the high-voltage pulse using the model compound reserpine in positive ion mode. Using +5 kV as the maximum high-voltage amplitude, reserpine was oxidized to its 2-, 4-, 6- and 8-electron oxidation products when direct-current high voltage was employed. In contrast, when using a pulsed high voltage, oxidation of reserpine was eliminated by employing the appropriate high-voltage pulse length and frequency. This effect was caused by inefficient mass transport of the analyte to the electrode surface during the duration of the high-voltage pulse and the subsequent relaxation of the emitter electrode/electrolyte interface during the time period when the high voltage was turned off. This mode of ESI source operation allows analyte electrolysis to be quickly and simply switched on or off electronically via a change in voltage pulse variables.
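The mass-transport argument can be made semi-quantitative with the diffusion-limited (Cottrell) current for a potential step, offered here as an illustration of the scaling rather than the authors' analysis:

\[ i(t) = \frac{n F A C \sqrt{D}}{\sqrt{\pi t}}, \qquad \delta(t) \approx \sqrt{\pi D t}, \]

so for pulses short compared with the time needed for the diffusion layer \(\delta\) to develop, little reserpine reaches the emitter surface and its oxidation is suppressed; between pulses the depleted interface relaxes.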
A new traffic model with a lane-changing viscosity term
NASA Astrophysics Data System (ADS)
Ko, Hung-Tang; Liu, Xiao-He; Guo, Ming-Min; Wu, Zheng
2015-09-01
In this paper, a new continuum traffic flow model is proposed, with a lane-changing source term in the continuity equation and a lane-changing viscosity term in the acceleration equation. Based on previous literature, the source term addresses the impact of the speed difference and density difference between adjacent lanes, which provides better precision for free lane-changing simulation; the viscosity term turns lane-changing behavior into a "force" that may influence the speed distribution. Using a flux-splitting scheme for the model discretization, two cases are investigated numerically. The case under a homogeneous initial condition shows that the numerical results of our model agree well with the analytical ones; the case with a small initial disturbance shows that our model can simulate the evolution of perturbations, including propagation, dissipation, the cluster effect and the stop-and-go phenomenon. Project supported by the National Natural Science Foundation of China (Grant Nos. 11002035 and 11372147) and the Hui-Chun Chin and Tsung-Dao Lee Chinese Undergraduate Research Endowment (Grant No. CURE 14024).
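A hedged reconstruction of the model's general structure (the paper's exact source and viscosity terms are not reproduced here) is a continuity equation with a lane-changing source S and a relaxation-type acceleration equation with a viscosity term:

\[ \partial_t \rho + \partial_x(\rho v) = S(\Delta\rho, \Delta v), \qquad \partial_t v + v\,\partial_x v = \frac{V_e(\rho) - v}{\tau} + \nu\,\partial_{xx} v, \]

where S depends on the density and speed differences between adjacent lanes, \(V_e(\rho)\) is the equilibrium speed, \(\tau\) a relaxation time, and \(\nu\) the lane-changing viscosity coefficient.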
DuVernet, Amy M; Dierdorff, Erich C; Wilson, Mark A
2015-09-01
Work analysis is fundamental to designing effective human resource systems. The current investigation extends previous research by identifying the differential effects of common design decisions, purposes, and organizational contexts on the data generated by work analyses. The effects of 19 distinct factors that span choices of descriptor, collection method, rating scale, and data source, as well as project purpose and organizational features, are explored. Meta-analytic results cumulated from 205 articles indicate that many of these variables hold significant consequences for work analysis data. Factors pertaining to descriptor choice, collection method, rating scale, and the purpose for conducting the work analysis each showed strong associations with work analysis data. The source of the work analysis information and the organizational context in which it was conducted displayed fewer relationships. Findings can be used to inform the choices work analysts make about methodology and postcollection evaluations of work analysis information. (c) 2015 APA, all rights reserved.
NASA Technical Reports Server (NTRS)
Winikka, C. C.; Schumann, H. H.
1975-01-01
Utilization of new sources of statewide remote sensing data, taken from high-altitude aircraft and from spacecraft, is discussed, along with incorporation of information extracted from these sources into ongoing land and resources management programs in Arizona. Statewide cartographic applications of remote sensor data taken by NASA high-altitude aircraft include the development of a statewide semi-analytic control network, the production of nearly 1900 orthophotoquads (image maps) that are coincident in scale and area with the U.S. Geological Survey (USGS) 7.5-minute topographic quadrangle map series, and satellite image maps of Arizona produced from Landsat multispectral scanner imagery. These cartographic products are utilized for a wide variety of experimental and operational earth resources applications. Applications of the imagery, image maps, and derived information discussed include: soils and geologic mapping projects, water resources investigations, land use inventories, environmental impact studies, highway route locations and mapping, vegetation cover mapping, wildlife habitat studies, power plant siting studies, statewide delineation of irrigated cropland, position determination of drilling sites, pictorial geographic bases for thematic mapping, and court exhibits.
Perspectives on bioanalytical mass spectrometry and automation in drug discovery.
Janiszewski, John S; Liston, Theodore E; Cole, Mark J
2008-11-01
The use of high speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process from early stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree. This includes both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skillsets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.
RUPTURES IN THE ANALYTIC SETTING AND DISTURBANCES IN THE TRANSFORMATIONAL FIELD OF DREAMS.
Brown, Lawrence J
2015-10-01
This paper explores some implications of Bleger's (1967, 2013) concept of the analytic situation, which he views as comprising the analytic setting and the analytic process. The author discusses Bleger's idea of the analytic setting as the depositary for projected painful aspects of either the analyst or the patient, or both; affects that are then rendered as nonprocess. In contrast, the contents of the analytic process are subject to an incessant process of transformation (Green 2005). The author goes on to enumerate various components of the analytic setting: the nonhuman, the object-relational, and the analyst's "person" (including mental functioning). An extended clinical vignette is offered as an illustration. © 2015 The Psychoanalytic Quarterly, Inc.
Physics of cosmological cascades and observable properties
NASA Astrophysics Data System (ADS)
Fitoussi, T.; Belmont, R.; Malzac, J.; Marcowith, A.; Cohen-Tanugi, J.; Jean, P.
2017-04-01
TeV photons from extragalactic sources are absorbed in the intergalactic medium and initiate electromagnetic cascades. These cascades offer a unique tool to probe the properties of the universe at cosmological scales. We present a new Monte Carlo code dedicated to the physics of such cascades. This code has been tested against both published results and analytical approximations, and is made publicly available. Using this numerical tool, we investigate the main cascade properties (spectrum, halo extension and time delays), and study in detail their dependence on the physical parameters (extragalactic magnetic field, extragalactic background light, source redshift, source spectrum and beaming emission). The limitations of analytical solutions are emphasized. In particular, analytical approximations account only for the first generation of photons and higher branches of the cascade tree are neglected.
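The characteristic energy scale of the reprocessed emission follows from two standard steps: pair production of a TeV photon on the extragalactic background light yields electrons with \(E_e \approx E_{\gamma_0}/2\), which inverse-Compton scatter CMB photons to \(E_{\mathrm{IC}} \approx \tfrac{4}{3}\gamma^2 \varepsilon_{\mathrm{CMB}}\). A worked order-of-magnitude check (standard cascade physics, not a result specific to this code):

\[ E_{\mathrm{IC}} \approx \frac{4}{3}\left(\frac{E_{\gamma_0}/2}{m_e c^2}\right)^{2} \varepsilon_{\mathrm{CMB}} \approx 0.8\ \mathrm{GeV}\left(\frac{E_{\gamma_0}}{1\ \mathrm{TeV}}\right)^{2}, \]

taking \(\varepsilon_{\mathrm{CMB}} \approx 6.3\times10^{-4}\) eV for the mean CMB photon energy.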
NHEXAS PHASE I REGION 5 STUDY--METALS IN DUST ANALYTICAL RESULTS
This data set includes analytical results for measurements of metals in 1,906 dust samples. Dust samples were collected to assess potential residential sources of dermal and inhalation exposures and to examine relationships between analyte levels in dust and in personal and bioma...
Controlling the spectral shape of nonlinear Thomson scattering with proper laser chirping
Rykovanov, S. G.; Geddes, C. G. R.; Schroeder, C. B.; ...
2016-03-18
Effects of nonlinearity in Thomson scattering of a high intensity laser pulse from electrons are analyzed. Analytic expressions for laser pulse shaping in frequency (chirping) are obtained which control spectrum broadening for high laser pulse intensities. These analytic solutions allow prediction of the spectral form and required laser parameters to avoid broadening. Results of analytical and numerical calculations agree well. The control over the scattered radiation bandwidth allows narrow bandwidth sources to be produced using high scattering intensities, which in turn greatly improves scattering yield for future x- and gamma-ray sources.
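The shape of the required chirp can be motivated from the nonlinear redshift of the scattered light: for normalized laser amplitude a(t), the on-axis backscattered frequency scales as \(\omega_s \propto \omega_L(t)/(1 + a(t)^2/2)\), so holding \(\omega_s\) fixed across the pulse suggests a chirp of the form below. This is a hedged paraphrase of the standard compensation argument, not the paper's exact prescription:

\[ \omega_s(t) \simeq \frac{4\gamma^2\,\omega_L(t)}{1 + a(t)^2/2} \quad\Rightarrow\quad \omega_L(t) = \omega_0\left(1 + \frac{a(t)^2}{2}\right). \]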
LOVE CANAL MONITORING PROGRAM. VOLUME 1
This report summarizes the prime contractor activities during the monitoring phase of the Love Canal project. Since GCA Corporation was only responsible for data collection, no analytical results appear in this report. The program involved a multifaceted sampling and analytical e...
Knowledge Transfer among Projects Using a Learn-Forget Model
ERIC Educational Resources Information Center
Tukel, Oya I.; Rom, Walter O.; Kremic, Tibor
2008-01-01
Purpose: The purpose of this paper is to analyze the impact of learning in a project-driven organization and demonstrate analytically how the learning, which takes place during the execution of successive projects, and the forgetting that takes place during the dormant time between the project executions, can impact performance and productivity in…
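Learn-forget models of this type typically pair a Wright-style learning curve with regression toward the initial performance level during dormant periods; a generic hedged form (not necessarily the authors' exact formulation):

\[ T_n = T_1\, n^{-b}, \qquad \hat{T} = T_n + (T_1 - T_n)\left(1 - e^{-\lambda d}\right), \]

where \(T_n\) is the time for the n-th repetition, b the learning exponent, d the dormant time between project executions, and \(\lambda\) a forgetting rate.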
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thakur, Gautam S; Bhaduri, Budhendra L; Piburn, Jesse O
Geospatial intelligence has traditionally relied on the use of archived and unvarying data for planning and exploration purposes. In consequence, the tools and methods that are architected to provide insight and generate projections rely only on such datasets. Although this approach has proven effective in several cases, such as land-use identification and route mapping, it has severely restricted the ability of researchers to incorporate current information in their work. This approach is inadequate in scenarios requiring real-time information to act and adjust in ever-changing dynamic environments, such as evacuation and rescue missions. In this work, we propose PlanetSense, a platform for geospatial intelligence that is built to harness the existing power of archived data and add to that the dynamics of real-time streams, seamlessly integrated with sophisticated data mining algorithms and analytics tools for generating operational intelligence on the fly. The platform has four main components: i) GeoData Cloud, a data architecture for storing and managing disparate datasets; ii) a mechanism to harvest real-time streaming data; iii) a data analytics framework; and iv) presentation and visualization through a web interface and RESTful services. Using two case studies, we underpin the necessity of our platform in modeling ambient population and building occupancy at scale.
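The presentation layer described in component iv) follows a familiar pattern: analytics exposed over RESTful endpoints. A minimal sketch of that pattern with invented endpoint and function names (not PlanetSense's actual API):

# Minimal RESTful analytics endpoint in the style described above.
# The endpoint path and lookup function are invented for illustration.
from flask import Flask, jsonify

app = Flask(__name__)

def estimate_ambient_population(region: str) -> dict:
    """Placeholder for a query against the analytics framework."""
    return {"region": region, "ambient_population": 12_345}  # stub value

@app.route("/api/v1/population/<region>")
def population(region):
    return jsonify(estimate_ambient_population(region))

if __name__ == "__main__":
    app.run(port=8080)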
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hudson, B; Beller, H; Bartel, C M
This project was designed to investigate the important but virtually unstudied topic of the subsurface transport and fate of Endocrine Disrupting Compounds (EDCs) when treated wastewater is used for landscape irrigation (non-potable water reuse). Although potable water reuse was outside the scope of this project, the investigation clearly has relevance to such water recycling practices. The target compounds, which are discussed in the following section and include EDCs such as 4-nonylphenol (NP) and 17β-estradiol, were studied not only because of their potential estrogenic effects on receptors but also because they can be useful as tracers of wastewater residue in groundwater. Since the compounds were expected to occur at very low (part per trillion) concentrations in groundwater, highly selective and sensitive analytical techniques had to be developed for their analysis. This project assessed the distributions of these compounds in wastewater effluents and groundwater, and examined their fate in laboratory soil columns simulating the infiltration of treated wastewater into an aquifer (e.g., as could occur during irrigation of a golf course or park with non-potable treated water). Bioassays were used to determine the estrogenic activity present in effluents and groundwater, and the results were correlated with those from chemical analysis. In vitro assays for estrogenic activity were employed to provide an integrated measure of the estrogenic potency of environmental samples without requiring knowledge or measurement of all bioactive compounds in the samples. For this project, the Las Positas Golf Course (LPGC) in the City of Livermore provided an ideal setting. Since 1978, irrigation of this area with treated wastewater has dominated the overall water budget. For a variety of reasons, a group of 10 monitoring wells was installed to evaluate wastewater impacts on the local groundwater. Additionally, these wells were regularly monitored for tritium (³H). Overall volumes of irrigation water have been recorded along with total flows through the Livermore Water Reclamation Plant (LWRP). The Environmental Protection Department at LLNL has carefully monitored ³H effluent leaving the laboratory for many years. For two years preceding the initiation of this project, Grayson and Hudson, working with LWRP staff, had demonstrated that these data could be used to accurately calculate the ³H concentration in the applied irrigation water as a function of time. This was accomplished by performing two carefully monitored tritium releases from LLNL and following the ³H through the LWRP. Combining these data with our ability to age-date groundwater using the ³H-³He age-dating technique, it was possible to determine both the age of the water and the degree of dilution from other water sources. This information was critical in the evaluation of observed concentrations of trace organic compounds from wastewater. The project included the following tasks: (1) Develop a conceptual model for Las Positas Golf Course (LPGC) irrigation that integrates existing meteorological, hydrologic, and environmental monitoring data. (2) Develop analytical methods (involving solid-phase extraction and isotope dilution LC/MS/MS) for the specific and sensitive measurement of target EDCs. (3) Develop a bioassay for estrogenic activity for application to effluent and groundwater samples. (4) Perform detailed hydrological evaluation of groundwater taken from LPGC. (5) Characterize the source term for target EDCs in wastewater. (6) Evaluate the utility of EDCs as source tracers for groundwater contamination.
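The ³H-³He technique mentioned above dates a water parcel from the tritiogenic ³He that accumulates as ³H decays; the standard apparent-age formula (corrections for excess air and terrigenic helium omitted) is:

\[ t = \frac{t_{1/2}}{\ln 2}\,\ln\!\left(1 + \frac{[\,^{3}\mathrm{He}_{\mathrm{trit}}\,]}{[\,^{3}\mathrm{H}\,]}\right), \qquad t_{1/2} \approx 12.3\ \text{yr}. \]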
Flat-plate solar array project. Volume 8: Project analysis and integration
NASA Technical Reports Server (NTRS)
Mcguire, P.; Henry, P.
1986-01-01
Project Analysis and Integration (PA&I) performed planning and integration activities to support management of the various Flat-Plate Solar Array (FSA) Project R&D activities. Technical and economic goals were established by PA&I for each R&D task within the project to coordinate the thrust toward the National Photovoltaic Program goals. A sophisticated computer modeling capability was developed to assess technical progress toward meeting the economic goals. These models included a manufacturing facility simulation, a photovoltaic power station simulation and a decision-aid model incorporating uncertainty. This family of analysis tools was used to track the progress of the technology and to explore the effects of alternative technical paths. Numerous studies conducted by PA&I signaled the achievement of milestones or were the foundation of major FSA project and national program decisions. The most important PA&I activities during the project history are summarized. The PA&I planning function and its relation to project direction are discussed, and important analytical models developed by PA&I for its analysis and assessment activities are reviewed.
CERTS Microgrid Laboratory Test Bed - PIER Final Project Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eto, Joseph H.; Eto, Joseph H.; Lasseter, Robert
2008-07-25
The objective of the CERTS Microgrid Laboratory Test Bed project was to enhance the ease of integrating small energy sources into a microgrid. The project accomplished this objective by developing and demonstrating three advanced techniques, collectively referred to as the CERTS Microgrid concept, that significantly reduce the level of custom field engineering needed to operate microgrids consisting of small generating sources. The techniques comprising the CERTS Microgrid concept are: 1) a method for effecting automatic and seamless transitions between grid-connected and islanded modes of operation; 2) an approach to electrical protection within the microgrid that does not depend on high fault currents; and 3) a method for microgrid control that achieves voltage and frequency stability under islanded conditions without requiring high-speed communications. The techniques were demonstrated at a full-scale test bed built near Columbus, Ohio and operated by American Electric Power. The testing fully confirmed earlier research that had been conducted initially through analytical simulations, then through laboratory emulations, and finally through factory acceptance testing of individual microgrid components. The islanding and resynchronization method met all Institute of Electrical and Electronics Engineers 1547 and power quality requirements. The electrical protection system was able to distinguish between normal and faulted operation. The controls were found to be robust under all conditions, including difficult motor starts. The results from these tests are expected to lead to additional testing of enhancements to the basic techniques at the test bed to improve the business case for microgrid technologies, as well as to field demonstrations involving microgrids that incorporate one or more of the CERTS Microgrid concepts.
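Communication-free voltage and frequency regulation of the kind described in technique 3) is commonly realized as droop control, in which each source derives its setpoints from local measurements alone. A minimal sketch of conventional P-f / Q-V droop with illustrative gains (a generic pattern, not the CERTS or AEP implementation):

# Conventional droop control: setpoints computed from local power
# measurements only, so no high-speed communications are required.
# All gains and nominal values are illustrative assumptions.
def droop_setpoints(p_kw, q_kvar, f0=60.0, v0=480.0,
                    p0=0.0, q0=0.0, m_f=0.0005, m_v=0.05):
    f_set = f0 - m_f * (p_kw - p0)      # P-f droop (Hz per kW)
    v_set = v0 - m_v * (q_kvar - q0)    # Q-V droop (V per kvar)
    return f_set, v_set

print(droop_setpoints(p_kw=50.0, q_kvar=10.0))  # -> (59.975, 479.5)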
Determinants of project success
NASA Technical Reports Server (NTRS)
Murphy, D. C.; Baker, B. N.; Fisher, D.
1974-01-01
The interactions of numerous project characteristics, with particular reference to project performance, were studied. Determinants of success are identified along with the accompanying implications for client organization, parent organization, project organization, and future research. Variables are selected which are found to have the greatest impact on project outcome, and the methodology and analytic techniques to be employed in identification of those variables are discussed.
New approaches for metabolomics by mass spectrometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vertes, Akos
Small molecules constitute a large part of the world around us, including fossil and some renewable energy sources. Solar energy harvested by plants and bacteria is converted into energy-rich small molecules on a massive scale. Some of the worst contaminants of the environment and compounds of interest for national security also fall in the category of small molecules. The development of large-scale metabolomic analysis methods lags behind the state of the art established for genomics and proteomics. This is commonly attributed to the diversity of molecular classes included in a metabolome. Unlike nucleic acids and proteins, metabolites do not have standard building blocks, and, as a result, their molecular properties exhibit a wide spectrum. This impedes the development of dedicated separation and spectroscopic methods. Mass spectrometry (MS) is a strong contender in the quest for a quantitative analytical tool with extensive metabolite coverage. Although various MS-based techniques are emerging for metabolomics, many of these approaches include extensive sample preparation that makes large-scale studies resource intensive and slow. New ionization methods are redefining the range of analytical problems that can be solved using MS. This project developed new approaches for the direct analysis of small molecules in unprocessed samples, as well as pushed the limits of ultratrace analysis in volume-limited complex samples. The project resulted in techniques that enabled metabolomics investigations with enhanced molecular coverage, as well as the study of cellular response to stimuli at the single-cell level. Effectively, individual cells became reaction vessels in which we followed the response of a complex biological system to external perturbation. We established two new analytical platforms for the direct study of metabolic changes in cells and tissues following external perturbation. For this purpose we developed a novel technique, laser ablation electrospray ionization (LAESI), for metabolite profiling of functioning cells and tissues. The technique was based on microscopic sampling of biological specimens by mid-infrared laser ablation followed by electrospray ionization of the plume and MS analysis. The two main shortcomings of this technique had been limited specificity, due to the lack of a separation step, and limited molecular coverage, especially for nonpolar chemical species. To improve specificity and the coverage of the metabolome, we implemented the LAESI ion source on a mass spectrometer with ion mobility separation (IMS). In this system, the gas-phase ions produced by the LAESI source were first sorted according to their collisional cross sections in a mobility cell. These separated ion packets were then subjected to MS analysis. By combining the atmospheric-pressure ionization with IMS, we improved the metabolite coverage. Further enhancement of the nonpolar metabolite coverage resulted from the combination of laser ablation with vacuum UV irradiation of the ablation plume. Our results indicated that this new ionization modality provided improved detection for neutral and nonpolar compounds. Based on rapid progress in photonics, we introduced another novel ion source that utilizes the interaction of a laser pulse with silicon nanopost arrays (NAPA). In these nanophotonic ion sources, the structural features are commensurate with the wavelength of the laser light, and the enhanced interaction results in high ion yields.
This ultrasensitive analytical platform enabled the MS analysis of single yeast cells. We extended these NAPA studies from yeast to other microorganisms, including green algae (Chlamydomonas reinhardtii), which capture energy from sunlight on a massive scale. Combining cellular perturbations, e.g., through environmental changes, with the newly developed single-cell analysis methods enabled us to follow dynamic changes induced in the cells. In effect, we were able to use individual cells as a “laboratory,” approaching the long-standing goal of establishing a “lab-in-a-cell.” Model systems for these studies included cells of cyanobacteria (Anabaena), yeast (Saccharomyces cerevisiae), green algae (C. reinhardtii) and Arabidopsis thaliana.
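For context, the collision-cross-section sorting described above is conventionally governed by the Mason-Schamp relation; this is the standard textbook expression, quoted here for reference rather than taken from the report:

    K = \frac{3q}{16N}\sqrt{\frac{2\pi}{\mu k_B T}}\,\frac{1}{\Omega}, \qquad \mu = \frac{mM}{m+M}

where K is the ion mobility, q the ion charge, N the buffer-gas number density, T the gas temperature, μ the reduced mass of the ion (mass m) and gas molecule (mass M), and Ω the collision cross section; the drift time through a mobility cell of length L under field E is then t_d = L/(KE).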
NASA Astrophysics Data System (ADS)
Shan, Zhendong; Ling, Daosheng
2018-02-01
This article develops an analytical solution for transient wave propagation from a cylindrical P-wave line source in a semi-infinite elastic solid covered by a fluid layer. The analytical solution is presented in a simple closed form in which each term represents a transient physical wave. The Scholte equation is derived, through which the Scholte wave velocity can be determined; the Scholte wave is the wave that propagates along the interface between the fluid and the solid. To develop the analytical solution, the wave fields in the fluid and solid are defined, their analytical solutions in the Laplace domain are derived using the boundary and interface conditions, and the solutions are then decomposed into series form according to the power-series expansion method. Each term of the series solution has a clear physical meaning and represents a transient wave path. Finally, by applying Cagniard's method and the convolution theorem, the analytical solutions are transformed into the time domain. Numerical examples are provided to illustrate some interesting features in the fluid layer, at the interface, and in the semi-infinite solid. When the P-wave velocity in the fluid is higher than that in the solid, two head waves in the solid, one head wave in the fluid and a Scholte wave at the interface are observed for the cylindrical P-wave line source.
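As background on the inversion step, Cagniard's method can be summarized generically (a standard statement of the technique, not the paper's exact expressions): each Laplace-domain term takes the form

    \bar{u}(r, z, s) = \int_{\Gamma} A(p)\, e^{-s\,[p r + \eta(p) z]}\, dp, \qquad \eta(p) = \sqrt{1/c^2 - p^2}

and the contour Γ is deformed so that the phase is real and equal to time, p r + η(p) z = t; evaluating the integrand along that path yields each time-domain arrival in closed form, one term per wave path (direct, head, and interface waves).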
USGS Blind Sample Project: monitoring and evaluating laboratory analytical quality
Ludtke, Amy S.; Woodworth, Mark T.
1997-01-01
The U.S. Geological Survey (USGS) collects and disseminates information about the Nation's water resources. Surface- and ground-water samples are collected and sent to USGS laboratories for chemical analyses. The laboratories identify and quantify the constituents in the water samples. Random and systematic errors occur during sample handling, chemical analysis, and data processing. Although all errors cannot be eliminated from measurements, the magnitude of their uncertainty can be estimated and tracked over time. Since 1981, the USGS has operated an independent, external quality-assurance project called the Blind Sample Project (BSP). The purpose of the BSP is to monitor and evaluate the quality of laboratory analytical results through the use of double-blind quality-control (QC) samples. The information provided by the BSP assists the laboratories in detecting and correcting problems in the analytical procedures. The information also can aid laboratory users in estimating the extent to which laboratory errors contribute to the overall errors in their environmental data.
NASA Technical Reports Server (NTRS)
Tavana, Madjid
1995-01-01
The evaluation and prioritization of Engineering Support Requests (ESRs) is a particularly difficult task at the Kennedy Space Center (KSC) Shuttle Project Engineering Office. This difficulty is due to the complexities inherent in the evaluation process and the lack of structured information. The evaluation process must consider a multitude of relevant pieces of information concerning Safety, Supportability, O&M Cost Savings, Process Enhancement, Reliability, and Implementation. Various analytical and normative models developed in the past have helped decision makers at KSC utilize large volumes of information in the evaluation of ESRs. The purpose of this project is to build on the existing methodologies and develop a multiple-criteria decision support system that captures the decision maker's beliefs through a series of sequential, rational, and analytical processes. The model utilizes the Analytic Hierarchy Process (AHP), subjective probabilities, the entropy concept, and the Maximize Agreement Heuristic (MAH) to enhance the decision maker's intuition in evaluating a set of ESRs.
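To illustrate the AHP step named in this abstract, here is a minimal sketch (hypothetical criteria and judgments, not the KSC model): the priority weights are the principal eigenvector of a pairwise-comparison matrix, checked with Saaty's consistency ratio.

    import numpy as np

    def ahp_priorities(pairwise):
        """Principal-eigenvector priorities and consistency ratio for an AHP matrix."""
        A = np.asarray(pairwise, dtype=float)
        n = A.shape[0]
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                   # normalized priority weights
        ci = (eigvals[k].real - n) / (n - 1)           # consistency index
        ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # Saaty's random index
        cr = ci / ri if ri else 0.0                    # consistency ratio (< 0.1 is acceptable)
        return w, cr

    # Hypothetical comparison of three criteria: Safety vs. Cost Savings vs. Reliability
    A = [[1, 3, 5],
         [1/3, 1, 2],
         [1/5, 1/2, 1]]
    weights, cr = ahp_priorities(A)
    print(weights, cr)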
NASA Astrophysics Data System (ADS)
Ramezani, Zeinab; Orouji, Ali A.
2017-08-01
This paper proposes and investigates a double-gate (DG) MOSFET that emulates a tunnel field-effect transistor (M-TFET). The device behaves as a tunneling field-effect transistor through work-function engineering: in the proposed structure, in addition to the main gate, another gate is placed over the source region, with zero applied voltage and a proper work function, to convert the source region from N+ to P+. We examine the impact of varying the source-gate work function and the source doping on the device parameters. The simulation results indicate that the M-TFET is well suited to switching applications. We also present a two-dimensional analytical potential model of the proposed structure, obtained by solving Poisson's equation in the x and y directions; differentiating the potential profile then yields the electric field. To validate the model, we compare the analytical results against simulations with the SILVACO ATLAS device simulator.
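Analytic potential models for double-gate devices of this kind typically start from the 2-D Poisson equation with a parabolic ansatz in the transverse direction; the following is a generic sketch of that common approach, not necessarily the authors' exact derivation:

    \frac{\partial^2 \phi}{\partial x^2} + \frac{\partial^2 \phi}{\partial y^2} = \frac{q N_A}{\varepsilon_{Si}}, \qquad \phi(x, y) \approx c_0(x) + c_1(x)\, y + c_2(x)\, y^2

where the coefficients c_0, c_1, c_2 are fixed by the oxide-interface boundary conditions at the two gates, and the lateral electric field follows as E_x = -\partial\phi/\partial x.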
Diagnosis, referral, and rehabilitation within the Fairfax Alcohol Safety Action Project, 1974.
DOT National Transportation Integrated Search
1975-01-01
This report is a combination of Analytic Study #5 (Diagnosis and Referral) and Analytic Study #6 (Rehabilitation). Data concerning these countermeasures are presented together because of their very close relationship within the Fairfax ASAP. Both the...
Analytical Chemistry Laboratory
NASA Technical Reports Server (NTRS)
Anderson, Mark
2013-01-01
The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis, and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development, and various technical tasks, as well as Cal Tech.
Project Summary. ANALYTICAL ELEMENT MODELING OF COASTAL AQUIFERS
Four topics were studied concerning the modeling of groundwater flow in coastal aquifers with analytic elements: (1) practical experience was obtained by constructing a groundwater model of the shallow aquifers below the Delmarva Peninsula USA using the commercial program MVAEM; ...
MEETING DATA QUALITY OBJECTIVES WITH INTERVAL INFORMATION
Immunoassay test kits are promising technologies for measuring analytes under field conditions. Frequently, these field-test kits report the analyte concentrations as falling in an interval between minimum and maximum values. Many project managers use field-test kits only for scr...
Richter, Janine; Fettig, Ina; Philipp, Rosemarie; Jakubowski, Norbert
2015-07-01
Tributyltin is listed as one of the priority substances in the European Water Framework Directive (WFD). Despite its decreasing input into the environment, it is still present and has to be monitored. In the European Metrology Research Programme project ENV08, a sensitive and reliable analytical method compliant with the WFD was developed to quantify this environmental pollutant at a very low limit of quantification. With the development of such a primary reference method for tributyltin, the project helped to improve the quality and comparability of monitoring data. An overview of the project aims and potential analytical tools is given.
Advanced Small Modular Reactor Economics Status Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, Thomas J.
2014-10-01
This report describes the data collection work performed for an advanced small modular reactor (AdvSMR) economics analysis activity at the Oak Ridge National Laboratory. The methodology development and analytical results are described in separate, stand-alone documents as listed in the references. The economics analysis effort for the AdvSMR program combines the technical and fuel-cycle aspects of advanced (non-light-water-reactor [LWR]) reactors with the market and production aspects of SMRs. This requires the collection, analysis, and synthesis of multiple unrelated and potentially high-uncertainty data sets from a wide range of data sources. Further, the nature of both economic and nuclear technology analysis requires at least a minor attempt at prediction and prognostication, and the far-term horizon for deployment of advanced nuclear systems introduces more uncertainty. Energy market uncertainty, especially in the electricity market, results from the interplay of commodity prices, demand fluctuation, and generation competition, as is easily seen in deregulated markets. Depending on current or projected values for any of these factors, the economic attractiveness of any power plant construction project can change yearly or quarterly. For long-lead construction projects such as nuclear power plants, this uncertainty generates an implied and inherent risk for potential nuclear power plant owners and operators. The uncertainty in nuclear reactor and fuel-cycle costs is in some respects better understood and quantified than the energy market uncertainty. The LWR-based fuel cycle has a long commercial history to use as its basis for cost estimation, and current activities in LWR construction provide a reliable baseline for estimates for similar efforts. However, for advanced systems, the estimates and their associated uncertainties are based on forward-looking assumptions for performance after the system has been built and has achieved commercial operation. Advanced fuel materials and fabrication costs have large uncertainties based on complexities of operation, such as contact-handled versus remote-handled fuel fabrication, or commodity availability. Thus, this analytical work makes a good-faith effort to quantify uncertainties and provide qualifiers, caveats, and explanations for the sources of these uncertainties. The overall result is that this work assembles the necessary information and establishes the foundation for future analyses using more precise data as nuclear technology advances.
An ion source for radiofrequency-pulsed glow discharge time-of-flight mass spectrometry
NASA Astrophysics Data System (ADS)
González Gago, C.; Lobo, L.; Pisonero, J.; Bordel, N.; Pereiro, R.; Sanz-Medel, A.
2012-10-01
A Grimm-type glow discharge (GD) has been designed and constructed as an ion source for pulsed radiofrequency GD spectrometry coupled to an orthogonal time-of-flight mass spectrometer. Pulse shapes of argon species and analytes were studied as a function of the discharge conditions using a new in-house ion source (UNIOVI GD), and the results were compared with a previous design (PROTOTYPE GD). Different behavior and shapes of the pulse profiles were observed for the two sources, particularly for the plasma-gas ionic species detected. In the most analytically relevant region (the afterglow), signals for 40Ar+ with the new design were negligible, while maximum intensity was reached earlier in time for 41(ArH)+ than with the PROTOTYPE GD. Moreover, while the maximum 40Ar+ signals measured along the pulse period were similar in both sources, 41(ArH)+ and 80(Ar2)+ signals tended to be noticeably higher with the PROTOTYPE chamber. The UNIOVI GD design was shown to be adequate for sensitive direct analysis of solid samples, offering linear calibration graphs and good crater shapes. Limits of detection (LODs) are of the same order of magnitude for both sources, although the UNIOVI source provides slightly better LODs for analytes with masses slightly higher than that of 41(ArH)+.
Shelley, Jacob T.; Wiley, Joshua S.; Hieftje, Gary M.
2011-01-01
The advent of ambient desorption/ionization mass spectrometry has resulted in a strong interest in ionization sources that are capable of direct analyte sampling and ionization. One source that has enjoyed increasing interest is the Flowing Atmospheric-Pressure Afterglow (FAPA). FAPA has been proven capable of directly desorbing/ionizing samples in any phase (solid, liquid, or gas) and with impressive limits of detection (<100 fmol). The FAPA was also shown to be less affected by competitive-ionization matrix effects than other plasma-based sources. However, the original FAPA design exhibited substantial background levels, cluttered background spectra in the negative-ion mode, and significant oxidation of aromatic analytes, which ultimately compromised analyte identification and quantification. In the present study, a change in the FAPA configuration from a pin-to-plate to a pin-to-capillary geometry was found to vastly improve performance. Background signals in positive- and negative-ionization modes were reduced by 89% and 99%, respectively. Additionally, the capillary anode strongly reduced the amount of atomic oxygen that could cause oxidation of analytes. Temperatures of the gas stream that interacts with the sample, which heavily influences desorption capabilities, were compared between the two sources by means of IR thermography. The performance of the new FAPA configuration is evaluated through the determination of a variety of compounds in positive- and negative-ion mode, including agrochemicals and explosives. A detection limit of 4 amol was found for the direct determination of the agrochemical ametryn, and appears to be spectrometer-limited. The ability to quickly screen for analytes in bulk liquid samples with the pin-to-capillary FAPA is also shown. PMID:21627097
Battaglia, Maurizio; Hill, D.P.
2009-01-01
Joint measurements of ground deformation and micro-gravity changes are an indispensable component of any volcano monitoring strategy. A number of analytical mathematical models are available in the literature that can be used to fit geodetic data and infer source location, depth and density. Bootstrap statistical methods allow estimation of the range of the inferred parameters. Although analytical models often assume that the crust is elastic, homogeneous and isotropic, they can take into account different source geometries, the influence of topography, and gravity background noise. The careful use of analytical models, together with high-quality data sets, can produce valuable insights into the nature of the deformation/gravity source. Here we present a review of various modeling methods, and use the historical unrest at Long Valley caldera (California) from 1982 to 1999 to illustrate the practical application of analytical modeling and bootstrap methods to constrain the source of unrest. A key question is whether the unrest at Long Valley since the late 1970s can be explained without calling upon an intrusion of magma. The answer, apparently, is no. Our modeling indicates that the inflation source is a slightly tilted prolate ellipsoid (dip angle between 91° and 105°) at a depth of 6.5 to 7.9 km beneath the caldera resurgent dome, with an aspect ratio between 0.44 and 0.60, a volume change from 0.161 to 0.173 km³ and a density of 1241 to 2093 kg/m³. The larger uncertainty of the density estimate reflects the higher noise of the gravity measurements. These results are consistent with the intrusion of silicic magma with a significant amount of volatiles beneath the caldera resurgent dome. © 2008 Elsevier B.V.
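The bootstrap step mentioned in this abstract can be sketched generically (a hypothetical residual-resampling scheme, not the authors' code): resample the misfit residuals, refit the source model, and take percentiles of the refitted parameters as confidence ranges.

    import numpy as np

    def bootstrap_ci(fit, predict, x, y, n_boot=1000, seed=0):
        """Residual-resampling bootstrap confidence intervals for model parameters.

        fit(x, y) -> parameter vector; predict(params, x) -> model values.
        """
        rng = np.random.default_rng(seed)
        p0 = fit(x, y)
        resid = y - predict(p0, x)
        boots = []
        for _ in range(n_boot):
            y_star = predict(p0, x) + rng.choice(resid, size=resid.size, replace=True)
            boots.append(fit(x, y_star))
        return np.percentile(np.array(boots), [2.5, 97.5], axis=0)  # 95% range per parameter

    # Toy example with a linear model y = a*x + b standing in for a source model
    fit = lambda x, y: np.polyfit(x, y, 1)
    predict = lambda p, x: np.polyval(p, x)
    x = np.linspace(0, 10, 50)
    y = 2.0 * x + 1.0 + np.random.default_rng(1).normal(0, 0.5, 50)
    print(bootstrap_ci(fit, predict, x, y))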
Cretini, Kari F.; Visser, Jenneke M.; Krauss, Ken W.; Steyer, Gregory D.
2011-01-01
This document identifies the main objectives of the Coastwide Reference Monitoring System (CRMS) vegetation analytical team, which are to provide (1) collection and development methods for vegetation response variables and (2) the ways in which these response variables will be used to evaluate restoration project effectiveness. The vegetation parameters (that is, response variables) collected in CRMS and other coastal restoration projects funded under the Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA) are identified, and the field collection methods for these parameters are summarized. Existing knowledge on community and plant responses to changes in environmental drivers (for example, flooding and salinity), from published literature and from the CRMS and CWPPRA monitoring dataset, is used to develop a suite of indices to assess wetland condition in coastal Louisiana. Two indices, the floristic quality index (FQI) and a productivity index, are described for herbaceous and forested vegetation. The FQI for herbaceous vegetation is tested with a long-term dataset from a CWPPRA marsh creation project. Example graphics for this index are provided and discussed. The other indices, an FQI for forest vegetation (that is, trees and shrubs) and productivity indices for herbaceous and forest vegetation, are proposed but not tested. New response variables may be added or current response variables removed as data become available and as our understanding of restoration success indicators develops. Once the indices are fully developed, each will be used by the vegetation analytical team to assess and evaluate CRMS/CWPPRA project and program effectiveness. The team plans to summarize its results in written reports and/or graphics and present them to CRMS Federal and State sponsors, restoration project managers, landowners, and other data users for their input.
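For reference, the floristic quality index mentioned here is commonly computed from species' coefficients of conservatism; this is the generic formulation, and the CRMS variant may differ:

    \mathrm{FQI} = \bar{C}\,\sqrt{S}

where C̄ is the mean coefficient of conservatism of the species recorded in a plot and S is the species richness (number of species scored).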
Analysis of renewable energy projects' implementation in Russia
NASA Astrophysics Data System (ADS)
Ratner, S. V.; Nizhegorodtsev, R. M.
2017-06-01
With the enactment in 2013 of a renewable energy support scheme contracting qualified power generation facilities based on renewable energy sources (RES), the process of constructing such facilities and connecting them to the Federal Grid Company has intensified in Russia. In 2013-2015, 93 solar, wind, and small hydropower projects were selected through competitive bidding for subsequent support. Despite some technical and organizational problems and delays in some RES projects, in 2014-2015 five solar generating facilities with a total capacity of 50 MW were commissioned, including 30 MW in Orenburg oblast. However, the proportion of successful projects is low, amounting to approximately 30% of the total number of announced projects. The purpose of this paper is to analyze the experience of implementing renewable energy projects that passed competitive selection and gained the right to partial compensation of the construction and commissioning costs of RES generating facilities in the electric power wholesale market zone. The informational background for the study comprises corporate reports of project promoters, analytical and information materials of the Association NP Market Council, and legal documents on the development of renewable energy. The methodological base of the study is the theory of learning curves, which assumes that cost savings in the production of high-tech products depend on the production growth rate (economies of scale) and accumulated manufacturing experience (learning by doing). The study identified factors that have a positive and a negative impact on the implementation of RES projects. Improvements to the measures promoting renewable energy development in Russia, corresponding to the current socio-economic situation, are proposed.
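The learning-curve theory invoked here is usually formalized as a power law in cumulative output; below is a minimal fitting sketch with hypothetical numbers, not data from the study:

    import numpy as np

    # Power-law learning curve: cost(x) = c1 * x**(-b); the progress ratio 2**(-b)
    # is the cost multiplier for each doubling of cumulative capacity x.
    x = np.array([10., 20., 40., 80., 160.])      # hypothetical cumulative capacity, MW
    cost = np.array([3.0, 2.6, 2.3, 2.0, 1.75])   # hypothetical unit cost, $/W

    slope, log_c1 = np.polyfit(np.log(x), np.log(cost), 1)
    b = -slope                                    # learning exponent
    print("progress ratio:", 2 ** (-b))           # e.g. 0.88 => ~12% cost drop per doubling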
Analysis and testing of a bridge deck reinforced with GFRP rebars : final report, April 3, 2007.
DOT National Transportation Integrated Search
2007-04-03
The present project had two main objectives: to experimentally and analytically investigate a bridge deck reinforced with glass fiber reinforced polymer rebars, and to perform durability tests on four rebar types. An analytical investigation was ...
A Study of Clinically Related Open Source Software Projects
Hogarth, Michael A.; Turner, Stuart
2005-01-01
Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have been neither well characterized nor formally studied, so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056
Testik, Özlem Müge; Shaygan, Amir; Dasdemir, Erdi; Soydan, Guray
It is often vital to identify, prioritize, and select quality improvement projects in a hospital, yet a methodology that utilizes experts' opinions with different points of view is needed for better decision making. The proposed methodology utilizes the cause-and-effect diagram to identify improvement projects and construct a project hierarchy for a problem. The right improvement projects are then prioritized and selected using a weighting scheme of the analytic hierarchy process that aggregates experts' opinions. An approach for collecting data from experts and a graphical display for summarizing the obtained information are also provided. The methodology is implemented for improving a hospital appointment system. The two top-ranked major project categories for improvement were system- and accessibility-related causes (45%) and capacity-related causes (28%), respectively. For each major project category, subprojects were then ranked to select the improvement needs. The methodology is useful in cases where an aggregate decision based on experts' opinions is expected. Some suggestions for practical implementation are provided.
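One common way to aggregate experts' AHP judgments, sketched here with hypothetical matrices (the paper's exact weighting scheme may differ), is the element-wise geometric mean of the individual pairwise-comparison matrices, which preserves their reciprocal structure:

    import numpy as np

    def aggregate_experts(matrices):
        """Element-wise geometric mean of experts' AHP pairwise-comparison matrices."""
        stack = np.array(matrices, dtype=float)
        return np.exp(np.log(stack).mean(axis=0))

    # Two hypothetical experts comparing 'system/accessibility' vs. 'capacity' causes
    expert1 = [[1, 2], [1/2, 1]]
    expert2 = [[1, 3], [1/3, 1]]
    print(aggregate_experts([expert1, expert2]))  # group matrix, still reciprocal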
Moskovets, Eugene; Misharin, Alexander; Laiko, Viktor; Doroshenko, Vladimir
2016-07-15
A comparative MS study was conducted on the analytical performance of two matrix-assisted laser desorption/ionization (MALDI) sources that operated either at low pressure (~1 Torr) or at atmospheric pressure. In both cases, the MALDI sources were attached to a linear ion trap mass spectrometer equipped with a two-stage ion funnel. The results indicate that the limits of detection, in the analysis of identical peptide samples, were much lower with the source operated slightly below 1 Torr. In the low-pressure (LP) MALDI source, ion signals were observed at a laser fluence considerably lower than the one at which ion signals appeared in the atmospheric-pressure (AP) MALDI source. When near-threshold laser fluences were used to record MALDI MS spectra at 1 Torr and 750 Torr, the level of chemical noise at 1 Torr was much lower than that at AP. The dependence of the analyte ion signals on the accelerating field that drags the ions from the MALDI plate to the MS analyzer is presented for the LP and AP MALDI sources. The study indicates that the laser fluence, background gas pressure, and the field accelerating the ions away from the MALDI plate were the main parameters determining the ion yield, signal-to-noise (S/N) ratios, fragmentation of the analyte ions, and adduct formation in the LP and AP MALDI MS methods. The presented results can be helpful for a deeper insight into the mechanisms responsible for ion formation in MALDI. Copyright © 2016 Elsevier Inc. All rights reserved.
Preliminary tsunami hazard assessment in British Columbia, Canada
NASA Astrophysics Data System (ADS)
Insua, T. L.; Grilli, A. R.; Grilli, S. T.; Shelby, M. R.; Wang, K.; Gao, D.; Cherniawsky, J. Y.; Harris, J. C.; Heesemann, M.; McLean, S.; Moran, K.
2015-12-01
Ocean Networks Canada (ONC), a not-for-profit initiative of the University of Victoria that operates several cabled ocean observatories, is developing a new generation of ocean observing systems (referred to as Smart Ocean Systems™) involving advanced undersea observation technologies, data networks and analytics. The ONC Tsunami project is a Smart Ocean Systems™ project that addresses the need for a near-field tsunami detection system for the coastal areas of British Columbia. Recent studies indicate a 40-80% probability over the next 50 years of a significant tsunami impacting the British Columbia (BC) coast with runups higher than 1.5 m. The NEPTUNE cabled ocean observatory, operated by ONC off the west coast of British Columbia, could be used to detect near-field tsunami events with existing instrumentation, including seismometers and bottom pressure recorders. As part of this project, new tsunami simulations are underway for the BC coast. Tsunami propagation is being simulated with the FUNWAVE-TVD model for a suite of new source models representing Cascadia megathrust rupture scenarios. Simulations are performed by one-way coupling in a series of nested model grids (from the source to the BC coast), whose bathymetry was developed from digital elevation maps (DEMs) of the area, to estimate both tsunami arrival time and coastal runup/inundation at different locations. Besides inundation, maps of additional parameters such as maximum current are being developed, which will aid in tsunami hazard assessment and risk mitigation, as well as in developing evacuation plans. We will present initial results of this work for the Port Alberni inlet, in particular Ucluelet, based on new source models developed using the best available data. We will also present a model validation using measurements of the 2011 transpacific Tohoku-oki tsunami recorded in coastal BC by several instruments from various US and Canadian agencies.
Fleet management performance monitoring.
DOT National Transportation Integrated Search
2013-05-01
The principal goal of this project was to enhance and expand the analytical modeling methodology previously developed as part of the Fleet Management Criteria: Disposal Points and Utilization Rates project completed in 2010. The enhanced and ex...
Tennessee long-range transportation plan : project evaluation system
DOT National Transportation Integrated Search
2005-12-01
The Project Evaluation System (PES) Report is an analytical methodology to aid programming efforts and prioritize multimodal investments. The methodology consists of both quantitative and qualitative evaluation criteria built upon the Guiding Princip...
Reconstruction of sound source signal by analytical passive TR in the environment with airflow
NASA Astrophysics Data System (ADS)
Wei, Long; Li, Min; Yang, Debin; Niu, Feng; Zeng, Wu
2017-03-01
In the acoustic design of air vehicles, the time-domain signals of noise sources on the vehicle surface can serve as data support to reveal the noise source generation mechanism, analyze acoustic fatigue, and take measures for noise insulation and reduction. To rapidly reconstruct time-domain sound source signals in an environment with flow, a method combining the analytical passive time-reversal mirror (AP-TR) with a shear flow correction is proposed. In this method, the negative influence of flow on sound wave propagation is suppressed by the shear flow correction, yielding corrected acoustic propagation time delays and paths. These corrected time delays and paths, together with the microphone array signals, are then submitted to the AP-TR, reconstructing more accurate sound source signals in the environment with airflow. As an analytical method, AP-TR offers a supplementary way to reconstruct sound source signals in 3D space in an environment with airflow, in place of numerical TR. Experiments on the reconstruction of the sound source signals of a pair of loudspeakers were conducted in an anechoic wind tunnel with subsonic airflow to validate the effectiveness and advantages of the proposed method. Theoretical and experimental comparisons between AP-TR and time-domain beamforming in reconstructing the sound source signal are also discussed.
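To give a feel for the kind of correction involved, the sketch below computes the propagation delay in a uniform mean flow, a simpler stand-in for the paper's shear-flow correction: sound convected by a flow U from source to microphone arrives when |d - Ut| = ct.

    import numpy as np

    def convected_delay(d, U, c=343.0):
        """Propagation delay from source to microphone in a uniform mean flow.

        Solves |d - U*t| = c*t, i.e. t**2*(c**2 - |U|**2) + 2*t*(U.d) - |d|**2 = 0.
        d: source-to-mic vector (m); U: flow velocity vector (m/s); c: sound speed (m/s).
        """
        d, U = np.asarray(d, float), np.asarray(U, float)
        a = c**2 - U.dot(U)
        b = U.dot(d)
        return (-b + np.sqrt(b**2 + a * d.dot(d))) / a  # positive root

    # Example: mic 1 m downstream of the source in a 34.3 m/s (Mach 0.1) flow;
    # the delay (~2.65 ms) is shorter than the no-flow value 1/c (~2.92 ms).
    print(convected_delay([1.0, 0.0, 0.0], [34.3, 0.0, 0.0]))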
Designing a Marketing Analytics Course for the Digital Age
ERIC Educational Resources Information Center
Liu, Xia; Burns, Alvin C.
2018-01-01
Marketing analytics is receiving great attention because of evolving technology and the radical changes in the marketing environment. This study aims to assist the design and implementation of a marketing analytics course. We assembled a rich data set from four sources: business executives, 400 employers' job postings, one million tweets about…
Introducing Text Analytics as a Graduate Business School Course
ERIC Educational Resources Information Center
Edgington, Theresa M.
2011-01-01
Text analytics refers to the process of analyzing unstructured data from documented sources, including open-ended surveys, blogs, and other types of web dialog. Text analytics has enveloped the concept of text mining, an analysis approach influenced heavily from data mining. While text mining has been covered extensively in various computer…
Shelley, Jacob T; Hieftje, Gary M
2010-04-01
The recent development of ambient desorption/ionization mass spectrometry (ADI-MS) has enabled fast, simple analysis of many different sample types. The ADI-MS sources have numerous advantages, including little or no required sample pre-treatment, simple mass spectra, and direct analysis of solids and liquids. However, problems of competitive ionization and limited fragmentation require sample-constituent separation, high mass accuracy, and/or tandem mass spectrometry (MS/MS) to detect, identify, and quantify unknown analytes. To maintain the inherent high throughput of ADI-MS, it is essential for the ion source/mass analyzer combination to measure fast transient signals and provide structural information. In the current study, the flowing atmospheric-pressure afterglow (FAPA) ionization source is coupled with a time-of-flight mass spectrometer (TOF-MS) to analyze fast transient signals (<500 ms FWHM). It was found that gas chromatography (GC) coupled with the FAPA source resulted in a reproducible (<5% RSD) and sensitive (detection limits of <6 fmol for a mixture of herbicides) system with analysis times of ca. 5 min. Introducing analytes to the FAPA in a transient was also shown to significantly reduce matrix effects caused by competitive ionization by minimizing the number and amount of constituents introduced into the ionization source. Additionally, MS/MS with FAPA-TOF-MS, enabling analyte identification, was performed via first-stage collision-induced dissociation (CID). Lastly, molecular and structural information was obtained across a fast transient peak by modulating the conditions that caused the first-stage CID.
Fast analytical scatter estimation using graphics processing units.
Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris
2015-01-01
The aim was to develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first-order scatter in cone-beam image reconstruction improves the contrast-to-noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and, with further acceleration and a method to account for multiple scatter, may be useful for practical scatter correction schemes.
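As context for the Compton part of such an estimator, first-order scatter toward a detector element is typically weighted by the Klein-Nishina differential cross section; the following is a minimal sketch of that ingredient, not the authors' CUDA code:

    import numpy as np

    R_E = 2.8179403262e-15  # classical electron radius, m
    MEC2 = 511.0            # electron rest energy, keV

    def klein_nishina(E_keV, theta):
        """Klein-Nishina differential cross section d(sigma)/d(Omega), in m^2/sr."""
        k = E_keV / MEC2
        ratio = 1.0 / (1.0 + k * (1.0 - np.cos(theta)))   # scattered/incident energy E'/E
        return 0.5 * R_E**2 * ratio**2 * (ratio + 1.0 / ratio - np.sin(theta)**2)

    # A first-order estimator then weights each interaction voxel by its electron
    # density, the attenuation to and from the voxel, and klein_nishina(E, theta).
    print(klein_nishina(60.0, np.pi / 4))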
NASA Astrophysics Data System (ADS)
Sakurai, Kazuo; Takahara, Atsushi
2011-01-01
This special issue contains peer-reviewed invited and contributed papers that were presented at the International Symposium on 'Future Trend in Soft Material Research with Advanced Light Source: Interdisciplinary of Bio- & Synthetic-Materials and Industrial Transferring', held at SPring-8, Japan, on September 1-3, 2010. Advanced light sources, including neutron and synchrotron sources, are becoming increasingly critical to the study of soft materials. This cutting-edge analytical tool is expected to lead to the creation of new materials with revolutionary properties and functions. At SPring-8, a new beam line dedicated to soft materials has now been launched as one of the most powerful X-ray sources for scattering and diffraction. Additionally, next-generation light source facilities, XFEL (X-ray Free Electron Laser), are currently being developed in several locations. In the near future, femtosecond and coherent X-ray sources will become available for soft material research and should reveal many new aspects of advanced soft material research and technology. On the occasion of the third fiscal year of the CREST (project leader: Kazuo Sakurai) and ERATO (project leader: Atsushi Takahara) projects, we organized this international symposium in order to accelerate the discussion among global-level researchers working on next-generation synchrotron radiation science, biophysics and supramolecular science, modern surface science in soft materials, and industrial applications of neutron and synchrotron radiation sources. Twenty-one oral presentations, including 8 invited speakers from abroad, and 40 poster presentations from the USA, France, Korea, Taiwan, and Japan were given during the three-day symposium. The symposium chairs reviewed the poster presentations by young scientists, and eight young researchers received the Award for Best Poster Presentation. We sincerely hope that these proceedings will be beneficial in future applications of advanced light sources to soft materials science and technology, not only for our ERATO and CREST projects but also for the research of all the participants, broadening our scientific horizons. Kazuo Sakurai & Atsushi Takahara, Symposium Chairs. Supported by: Japan Science and Technology Agency (JST); Japan Synchrotron Radiation Research Institute (JASRI). Co-sponsored by: Society of Japan Polymer Science; Japanese Society of Synchrotron Radiation Research; Advanced Softmaterial Beamline Consortium. Symposium Chairs: Atsushi Takahara (Kyushu University, JST, ERATO); Kazuo Sakurai (Univ. Kitakyushu, JST, CREST). Organizing Committee: Yoshiyuki Amemiya (The Univ. of Tokyo, JST, CREST); Naoto Yagi (JASRI, JST, CREST); Masaki Takata (JASRI); Isamu Akiba (Univ. Kitakyushu, JST, CREST); Yuya Shinohara (The Univ. of Tokyo, JST, CREST); Taiki Hoshino (Kyushu University, JST, ERATO); Jun-ichi Imuta (Kyushu University, JST, ERATO); Moriya Kikuchi (Kyushu University, JST, ERATO); Motoyasu Kobayashi (Kyushu University, JST, ERATO).
Get to Know Your Neighborhood Pest: An Interdisciplinary Project for Middle School Students.
ERIC Educational Resources Information Center
Zipko, Stephen J.
1982-01-01
Describes an interdisciplinary, month-long minicourse project focusing on the gypsy moth. The project provided students with opportunities to develop analytical and problem-solving skills while studying content from entomology, botany, chemistry, toxicology, ecology, math, art, law, political science, history, English, consumer studies, and…
Intelligent Vehicle Mobility M&S Capability Development (FY13 innovation Project) (Briefing Charts)
2014-05-19
Intelligent Vehicle Mobility M&S Capability Development (FY13 Innovation Project). P. Jayakumar and J. Raymond, Analytics, 19 May 2014.
Concurrence of big data analytics and healthcare: A systematic review.
Mehta, Nishita; Pandit, Anil
2018-06-01
The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of the literature aims to determine the scope of Big Data analytics in healthcare, including its applications and the challenges to its adoption, and to identify strategies to overcome those challenges. A systematic search of articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore and Taylor & Francis. Articles on Big Data analytics in healthcare published in the English-language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; and challenges and strategies to overcome the challenges in healthcare. A total of 58 articles meeting the inclusion criteria were selected and analyzed. The analyses of these articles found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as external sources including government, laboratories, pharma companies, data aggregators, medical journals, etc.; (3) natural language processing (NLP) is the most widely used Big Data analytical technique for healthcare, and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds application in clinical decision support, optimization of clinical operations and reduction of the cost of care; and (5) the major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review reveals a paucity of evidence on real-world use of Big Data analytics in healthcare, because the usability studies have taken only a qualitative approach that describes potential benefits but does not quantify them. Also, the majority of the studies were from developed countries, which highlights the need to promote research on healthcare Big Data analytics in developing countries. Copyright © 2018 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Boparai, A.S.; Bowers, D.L.
This report summarizes the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 2000 (October 1999 through September 2000). This annual progress report, the seventeenth in this series for the ACL, describes effort on continuing projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The ACL operates within the ANL system as a full-cost-recovery service center, but it has a mission that includes a complementary research and development component: the Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients--Argonne National Laboratory, the Department of Energy, and others--and will conduct world-class research and development in analytical chemistry and its applications. The ACL handles a wide range of analytical problems, reflecting the diversity of research and development (R&D) work at ANL. Some routine or standard analyses are done, but the ACL operates more typically in a problem-solving mode, in which development of methods is required or adaptation of techniques is needed to obtain useful analytical data. The ACL works with clients and commercial laboratories if a large number of routine analyses are required. Much of the support work done by the ACL is very similar to applied analytical chemistry research.
Analytical Sociology: A Bungean Appreciation
ERIC Educational Resources Information Center
Wan, Poe Yu-ze
2012-01-01
Analytical sociology, an intellectual project that has garnered considerable attention across a variety of disciplines in recent years, aims to explain complex social processes by dissecting them, accentuating their most important constituent parts, and constructing appropriate models to understand the emergence of what is observed. To achieve…
Merging Old and New: An Instrumentation-Based Introductory Analytical Laboratory
ERIC Educational Resources Information Center
Jensen, Mark B.
2015-01-01
An instrumentation-based laboratory curriculum combining traditional unknown analyses with student-designed projects has been developed for an introductory analytical chemistry course. In the first half of the course, students develop laboratory skills and instrumental proficiency by rotating through six different instruments performing…
NASA Technical Reports Server (NTRS)
Walker, A. B. C., Jr.
1975-01-01
Techniques for the study of the solar corona are reviewed as an introduction to a discussion of modifications required for the study of cosmic sources. Spectroscopic analysis of individual sources and the interstellar medium is considered. The latter was studied via analysis of its effect on the spectra of selected individual sources. The effects of various characteristics of the ISM, including the presence of grains, molecules, and ionization, are first discussed, and the development of ISM models is described. The expected spectral structure of individual cosmic sources is then reviewed with emphasis on supernovae remnants and binary X-ray sources. The observational and analytical requirements imposed by the characteristics of these sources are identified, and prospects for the analysis of abundances and the study of physical parameters within them are assessed. Prospects for the spectroscopic study of other classes of X-ray sources are also discussed.
Review and assessment of the HOST turbine heat transfer program
NASA Technical Reports Server (NTRS)
Gladden, Herbert J.
1988-01-01
The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena occurring in high-performance gas turbine engines and to assess and improve the analytical methods used to predict the fluid dynamics and heat transfer phenomena. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. Therefore, a building-block approach was utilized, with research ranging from the study of fundamental phenomena and analytical modeling to experiments in simulated real-engine environments. Experimental research accounted for 75 percent of the project, and analytical efforts accounted for approximately 25 percent. Extensive experimental datasets were created depicting the three-dimensional flow field, high free-stream turbulence, boundary-layer transition, blade tip region heat transfer, film cooling effects in a simulated engine environment, rough-wall cooling enhancement in a rotating passage, and rotor-stator interaction effects. In addition, analytical modeling of these phenomena was initiated using boundary-layer assumptions as well as Navier-Stokes solutions.
DOT National Transportation Integrated Search
2012-11-30
The objective of this project was to develop technical relationships between reliability improvement strategies and reliability performance metrics. This project defined reliability, explained the importance of travel time distributions for measuring...
Analytic solution of magnetic induction distribution of ideal hollow spherical field sources
NASA Astrophysics Data System (ADS)
Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min
2017-12-01
Halbach-type hollow spherical permanent magnet arrays (HSPMA) are volume-compact, energy-efficient field sources capable of producing multi-Tesla fields in the cavity of the array, and they have attracted intense interest in many practical applications. Here, we present analytical solutions for the magnetic induction of the ideal HSPMA in the entire space: outside the array, within the cavity, and in the interior of the magnet material. We obtain the solutions using the concept of magnetic charge to solve the Poisson and Laplace equations for the HSPMA. Using these analytical field expressions inside the material, a scalar demagnetization function is defined to approximately indicate the regions of magnetization reversal, partial demagnetization, and inverse magnetic saturation. The analytical field solution provides deeper insight into the nature of the HSPMA and offers guidance in designing optimized arrays.
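For orientation, the commonly quoted result for the ideal Halbach sphere (taken from the standard literature, not derived here) is that the cavity field is uniform with magnitude

    B_{cavity} = \frac{4}{3}\, B_r \ln\!\left(\frac{R_o}{R_i}\right)

where B_r is the remanence and R_o, R_i are the outer and inner radii; for example, B_r ≈ 1.4 T and R_o/R_i = 4 give roughly 2.6 T, consistent with the multi-Tesla fields cited above.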
Remote Sensing Data Analytics for Planetary Science with PlanetServer/EarthServer
NASA Astrophysics Data System (ADS)
Rossi, Angelo Pio; Figuera, Ramiro Marco; Flahaut, Jessica; Martinot, Melissa; Misev, Dimitar; Baumann, Peter; Pham Huu, Bang; Besse, Sebastien
2016-04-01
Planetary Science datasets, beyond the change in the last two decades from physical volumes to internet-accessible archives, still face the problem of large-scale processing and analytics (e.g. Rossi et al., 2014; Gaddis and Hare, 2015). PlanetServer, the Planetary Science Data Service of the EC-funded EarthServer-2 project (#654367), tackles the planetary Big Data analytics problem with an array database approach (Baumann et al., 2014). It is developed to serve a large amount of calibrated, map-projected planetary data online, mainly through the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) (e.g. Rossi et al., 2014; Oosthoek et al., 2013; Cantini et al., 2014). The focus of the H2020 evolution of PlanetServer is still on complex multidimensional data, particularly hyperspectral imaging and topographic cubes and imagery. In addition to hyperspectral and topographic data from Mars (Rossi et al., 2014), WCPS is applied to diverse datasets on the Moon, as well as Mercury. Other Solar System bodies will become progressively available. Derived parameters such as summary products and indices can be produced through WCPS queries, as can derived imagery colour-combination products, dynamically generated and accessed also through the OGC Web Coverage Service (WCS). Scientific questions translated into queries can be posed to a large number of individual coverages (data products), locally, regionally or globally. The new PlanetServer system uses the open-source NASA WorldWind (e.g. Hogan, 2011) virtual globe as its visualisation engine and the array database Rasdaman Community Edition as its core server component. Analytical tools and client components of relevance for multiple communities and disciplines are shared across services such as the Earth Observation and Marine Data Services of EarthServer. The Planetary Science Data Service of EarthServer is accessible at http://planetserver.eu, and its code base is available on GitHub at https://github.com/planetserver. References: Baumann, P., et al. (2015) Big Data Analytics for Earth Sciences: the EarthServer approach, International Journal of Digital Earth, doi: 10.1080/17538947.2014.1003106. Cantini, F., et al. (2014) Geophys. Res. Abs., Vol. 16, #EGU2014-3784. Gaddis, L., and T. Hare (2015) Status of tools and data for planetary research, Eos, 96, doi: 10.1029/2015EO041125. Hogan, P. (2011) NASA World Wind: Infrastructure for Spatial Data, Proceedings of the 2nd International Conference on Computing for Geospatial Research & Applications, ACM. Oosthoek, J.H.P., et al. (2013) Advances in Space Research, doi: 10.1016/j.asr.2013.07.002. Rossi, A. P., et al. (2014) PlanetServer/EarthServer: Big Data analytics in Planetary Science, Geophysical Research Abstracts, Vol. 16, #EGU2014-5149.
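To give a flavor of the WCPS access pattern described above, here is a minimal client-side sketch; the endpoint path, coverage name, and band names are hypothetical placeholders, and the ProcessCoverages request follows the usual rasdaman/petascope convention:

    import requests

    # Hypothetical WCPS query: a band-ratio index computed server-side over a
    # hyperspectral coverage and returned as a PNG image.
    query = """
    for $c in (mars_crism_cube)
    return encode(
      (($c.band_233 - $c.band_13) / ($c.band_233 + $c.band_13)),
      "image/png")
    """
    resp = requests.get("http://planetserver.eu/rasdaman/ows",
                        params={"service": "WCS", "version": "2.0.1",
                                "request": "ProcessCoverages", "query": query})
    with open("index.png", "wb") as f:
        f.write(resp.content)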
NASA Astrophysics Data System (ADS)
Rinzema, Kees; ten Bosch, Jaap J.; Ferwerda, Hedzer A.; Hoenders, Bernhard J.
1995-01-01
The diffusion approximation, which is often used to describe the propagation of light in biological tissues, is accurate only at a sufficient distance from sources and boundaries. Light-tissue interaction is, however, most intense in the region close to the source, so it is of interest to study this region more closely. Although scattering in biological tissues is predominantly forward-peaked, explicit solutions to the transport equation have only been obtained for isotropic scattering. In particular, for the case of an isotropic point source in an unbounded, isotropically scattering medium the solution is well known. We show that this problem can also be solved analytically when the scattering is no longer isotropic, while everything else remains the same.
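For comparison, the well-known diffusion-approximation result for an isotropic point source of power P in an infinite medium (the standard Green's function, stated here for reference) is

    \phi(r) = \frac{P}{4\pi D r}\, e^{-\mu_{eff}\, r}, \qquad D = \frac{1}{3(\mu_a + \mu_s')}, \qquad \mu_{eff} = \sqrt{3\mu_a(\mu_a + \mu_s')}

where μ_a and μ_s' are the absorption and reduced scattering coefficients; transport-based solutions such as the one derived here correct this expression in the region near the source where it breaks down.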
Newspaper Reading among College Students in Development of Their Analytical Ability
ERIC Educational Resources Information Center
Kumar, Dinesh
2009-01-01
The study investigated the newspaper reading among college students in development of their analytical ability. Newspapers are one of the few sources of information that are comprehensive, interconnected and offered in one format. The main objective of the study was to find out the development of the analytical ability among college students by…
Early Alert of Academically At-Risk Students: An Open Source Analytics Initiative
ERIC Educational Resources Information Center
Jayaprakash, Sandeep M.; Moody, Erik W.; Lauría, Eitel J. M.; Regan, James R.; Baron, Joshua D.
2014-01-01
The Open Academic Analytics Initiative (OAAI) is a collaborative, multi-year grant program aimed at researching issues related to the scaling up of learning analytics technologies and solutions across all of higher education. The paper describes the goals and objectives of the OAAI, depicts the process and challenges of collecting, organizing and…
Towards an Analytic Foundation for Network Architecture
2010-12-31
In this project, we develop the analytic tools of stochastic optimization for wireless network design and apply them...and Mung Chiang, "DaVinci: Dynamically Adaptive Virtual Networks for a Customized Internet," in Proc. ACM SIGCOMM CoNext Conference, December 2008.
University Macro Analytic Simulation Model.
ERIC Educational Resources Information Center
Baron, Robert; Gulko, Warren
The University Macro Analytic Simulation System (UMASS) has been designed as a forecasting tool to help university administrators make budgeting decisions. Alternative budgeting strategies can be tested on a computer model, and an operational alternative can then be selected on the basis of the most desirable projected outcome. UMASS uses readily…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Covered are: analytical laboratory operations (ALO) sample receipt and control, ALO data report/package preparation review and control, the single-shell tank (SST) project sample tracking system, sample receiving, analytical balances, duties and responsibilities of the sample custodian, sample refrigerator temperature monitoring, security, assignment of staff responsibilities, sample storage, data reporting, and general requirements for glassware.
CEDS Addresses: Rubric Elements
ERIC Educational Resources Information Center
US Department of Education, 2015
2015-01-01
Common Education Data Standards (CEDS) Version 4 introduced a common data vocabulary for defining rubrics in a data system. The CEDS elements support digital representations of both holistic and analytic rubrics. This document shares examples of holistic and analytic project rubrics, available CEDS Connections, and a logical model showing the…
The NIST Quantitative Infrared Database
Chu, P. M.; Guenther, F. R.; Rhoderick, G. C.; Lafferty, W. J.
1999-01-01
With the recent developments in Fourier transform infrared (FTIR) spectrometers it is becoming more feasible to place these instruments in field environments. As a result, there has been an enormous increase in the use of FTIR techniques for a variety of qualitative and quantitative chemical measurements. These methods offer the possibility of fully automated real-time quantitation of many analytes; therefore FTIR has great potential as an analytical tool. Recently, the U.S. Environmental Protection Agency (U.S. EPA) has developed protocol methods for emissions monitoring using both extractive and open-path FTIR measurements. Depending upon the analyte, the experimental conditions and the analyte matrix, approximately 100 of the hazardous air pollutants (HAPs) listed in the 1990 U.S. EPA Clean Air Act amendment (CAAA) can be measured. The National Institute of Standards and Technology (NIST) has initiated a program to provide quality-assured infrared absorption coefficient data based on NIST-prepared primary gas standards. Currently, absorption coefficient data have been acquired for approximately 20 of the HAPs. For each compound, the absorption coefficient spectrum was calculated from nine transmittance spectra at 0.12 cm⁻¹ resolution using the Beer's law relationship. The uncertainties in the absorption coefficient data were estimated from the linear regressions of the transmittance data and considerations of other error sources such as the nonlinear detector response. For absorption coefficient values greater than 1 × 10⁻⁴ (μmol/mol)⁻¹ m⁻¹ the average relative expanded uncertainty is 2.2%. This quantitative infrared database is an ongoing project at NIST, and additional spectra will be added as they are acquired. Current plans include continued data acquisition for the compounds listed in the CAAA, as well as compounds that contribute to global warming and ozone depletion.
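The absorption-coefficient calculation described above rests on the Beer's-law relationship A = -log10(T) = α·c·l; below is a minimal regression sketch with hypothetical numbers, not NIST data:

    import numpy as np

    # Beer's law: absorbance A = -log10(T) = alpha * c * l, so a zero-intercept
    # regression of A against c*l across the standards yields alpha.
    c = np.array([1.0, 2.0, 4.0, 8.0])           # hypothetical mole fractions, umol/mol
    l = 10.0                                     # optical path length, m
    T = np.array([0.955, 0.912, 0.832, 0.692])   # hypothetical transmittances
    A = -np.log10(T)
    x = c * l
    alpha = (A @ x) / (x @ x)                    # least squares through the origin
    print(alpha)                                 # ~2e-3 (umol/mol)^-1 m^-1 for these numbers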
Deployment of Analytics into the Healthcare Safety Net: Lessons Learned.
Hartzband, David; Jacobs, Feygele
2016-01-01
As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation's largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and, in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis deploying the analytic stack was carried out using the common definitions established for the Uniform Data System (UDS) by the Health Resources and Services Administration. In addition, interviews with health center leadership and staff were completed to understand the context for the findings. The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data. At a population level, apparent underreporting of a number of diagnoses, specifically obesity and heart disease, was also evident in the results of the data quality exercise, for both the EHR-derived and stack analytic results. Data awareness, that is, an appreciation of the importance of data integrity, data hygiene and the potential uses of data, needs to be prioritized and developed by health centers and other healthcare organizations if analytics are to be used in an effective manner to support strategic objectives. While this analysis was conducted exclusively with community health center organizations, its conclusions and recommendations may be more broadly applicable.
Near real-time vaccine safety surveillance with partially accrued data.
Greene, Sharon K; Kulldorff, Martin; Yin, Ruihua; Yih, W Katherine; Lieu, Tracy A; Weintraub, Eric S; Lee, Grace M
2011-06-01
The Vaccine Safety Datalink (VSD) Project conducts near real-time vaccine safety surveillance using sequential analytic methods. Timely surveillance is critical in identifying potential safety problems and preventing additional exposure before most vaccines are administered. For vaccines that are administered during a short period, such as influenza vaccines, timeliness can be improved by undertaking analyses while risk windows following vaccination are ongoing and by accommodating predictable and unpredictable data accrual delays. We describe practical solutions to these challenges, which were adopted by the VSD Project during pandemic and seasonal influenza vaccine safety surveillance in 2009/2010. Adjustments were made to two sequential analytic approaches. The Poisson-based approach compared the number of pre-defined adverse events observed following vaccination with the number expected using historical data. The expected number was adjusted for the proportion of the risk window elapsed and the proportion of inpatient data estimated to have accrued. The binomial-based approach used a self-controlled design, comparing the observed numbers of events in risk versus comparison windows. Events were included in analysis only if they occurred during a week that had already passed for both windows. Analyzing data before risk windows fully elapsed improved the timeliness of safety surveillance. Adjustments for data accrual lags were tailored to each data source and avoided biasing analyses away from detecting a potential safety problem, particularly early during surveillance. The timeliness of vaccine and drug safety surveillance can be improved by properly accounting for partially elapsed windows and data accrual delays. Copyright © 2011 John Wiley & Sons, Ltd.
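The core of the partially-accrued-data adjustment lends itself to a one-line illustration: the historical expectation is scaled by the fraction of the risk window that has elapsed and the fraction of data estimated to have accrued. A minimal sketch follows (all numbers are hypothetical; this shows the idea, not the VSD Project's production code).

    def adjusted_expected(expected_full, frac_window_elapsed, frac_data_accrued):
        """Scale the historical expected count so that partially elapsed risk
        windows and data accrual lags do not bias a Poisson-based sequential
        test away from detecting a safety signal."""
        return expected_full * frac_window_elapsed * frac_data_accrued

    # Hypothetical surveillance week: 60% of the 42-day risk window elapsed,
    # ~80% of inpatient data estimated to have accrued for that period.
    expected = adjusted_expected(12.4, frac_window_elapsed=0.60,
                                 frac_data_accrued=0.80)
    observed = 11   # compared against `expected` in the sequential test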
Adaptive learning in complex reproducing kernel Hilbert spaces employing Wirtinger's subgradients.
Bouboulis, Pantelis; Slavakis, Konstantinos; Theodoridis, Sergios
2012-03-01
This paper presents a wide framework for non-linear online supervised learning tasks in the context of complex-valued signal processing. The (complex) input data are mapped into a complex reproducing kernel Hilbert space (RKHS), where the learning phase takes place. Both pure complex kernels and real kernels (via the complexification trick) can be employed. Moreover, any convex, continuous and not necessarily differentiable function can be used to measure the loss between the output of the specific system and the desired response. The only requirement is that the subgradient of the adopted loss function be available in analytic form. In order to derive the subgradients analytically, the principles of the (recently developed) Wirtinger's calculus in complex RKHS are exploited. Furthermore, both linear and widely linear (in RKHS) estimation filters are considered. To cope with the problem of increasing memory requirements, which is present in almost all online schemes in RKHS, a sparsification scheme based on projection onto closed balls has been adopted. We demonstrate the effectiveness of the proposed framework in a non-linear channel identification task, a non-linear channel equalization problem and a quadrature phase shift keying equalization scheme, using both circular and non-circular synthetic signal sources.
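To make the complexification trick concrete, the toy sketch below runs an online kernel LMS on complex data by mapping each complex input vector to the real vector [Re(x), Im(x)] and keeping complex expansion coefficients. It is a deliberately simplified illustration of one corner of the framework (squared loss, no sparsification), not the authors' algorithm.

    import numpy as np

    def gauss(a, b, sigma=1.0):
        # real Gaussian kernel on real feature vectors
        return np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))

    class ComplexifiedKLMS:
        def __init__(self, step=0.5, sigma=1.0):
            self.step, self.sigma = step, sigma
            self.centers, self.coefs = [], []      # complex coefficients

        def predict(self, x):
            z = np.concatenate([x.real, x.imag])   # complexification map
            return sum(c * gauss(z, ctr, self.sigma)
                       for c, ctr in zip(self.coefs, self.centers))

        def update(self, x, d):
            err = d - self.predict(x)              # complex error
            self.centers.append(np.concatenate([x.real, x.imag]))
            self.coefs.append(self.step * err)     # LMS-style update
            return err

Note that without a sparsification rule the dictionary grows with every sample, which is exactly the memory problem the projection-onto-closed-balls scheme in the paper addresses.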
DOE Office of Scientific and Technical Information (OSTI.GOV)
A.A. Bingham; R.M. Ferrer; A.M. Ougouag
2009-09-01
An accurate and computationally efficient two- or three-dimensional neutron diffusion model will be necessary for the development, safety parameters computation, and fuel cycle analysis of a prismatic Very High Temperature Reactor (VHTR) design under the Next Generation Nuclear Plant (NGNP) Project. For this purpose, an analytical nodal Green's function solution for the transverse-integrated neutron diffusion equation is developed in two- and three-dimensional hexagonal geometry. This scheme is incorporated into HEXPEDITE, a code first developed by Fitzpatrick and Ougouag. HEXPEDITE neglects non-physical discontinuity terms that arise in the transverse leakage due to the application of the transverse integration procedure to hexagonal geometry, and it cannot account for the effects of burnable poisons across nodal boundaries. The test code being developed for this document accounts for these terms by maintaining an inventory of neutrons, using the nodal balance equation as a constraint on the neutron flux equation. The method developed in this report is intended to restore neutron conservation and increase the accuracy of the code by adding these terms to the transverse-integrated flux solution and applying the nodal Green's function solution to the resulting equation to derive a semi-analytical solution.
Earth science big data at users' fingertips: the EarthServer Science Gateway Mobile
NASA Astrophysics Data System (ADS)
Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Fargetta, Marco; Pappalardo, Marco; Rundo, Francesco
2014-05-01
The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data -- in short: "Big Earth Data Analytics" - based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving a tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On server side, highly effective optimizations - such as parallel and distributed query processing - ensure scalability to Exabyte volumes. In this contribution we will report on the EarthServer Science Gateway Mobile, an app for both iOS and Android-based devices that allows users to seamlessly access some of the EarthServer applications using SAML-based federated authentication and fine-grained authorisation mechanisms.
Drewes, J E; Anderson, P; Denslow, N; Olivieri, A; Schlenk, D; Snyder, S A; Maruya, K A
2013-01-01
This study discussed a proposed process to prioritize chemicals for reclaimed water monitoring programs, selection of analytical methods required for their quantification, toxicological relevance of chemicals of emerging concern regarding human health, and related issues. Given that thousands of chemicals are potentially present in reclaimed water and that information about those chemicals is rapidly evolving, a transparent, science-based framework was developed to guide prioritization of which compounds of emerging concern (CECs) should be included in reclaimed water monitoring programs. The recommended framework includes four steps: (1) compile environmental concentrations (e.g., measured environmental concentration or MEC) of CECs in the source water for reuse projects; (2) develop a monitoring trigger level (MTL) for each of these compounds (or groups thereof) based on toxicological relevance; (3) compare the environmental concentration (e.g., MEC) to the MTL; CECs with a MEC/MTL ratio greater than 1 should be prioritized for monitoring, and compounds with a ratio less than 1 should only be considered if they represent viable treatment process performance indicators; and (4) screen the priority list to ensure that a commercially available robust analytical method is available for each compound.
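Steps 3 and 4 reduce to a simple screening computation. A minimal sketch follows (compound names and numbers are placeholders, not values from the study):

    def prioritize_cecs(cecs):
        """cecs maps compound -> (MEC, MTL) in the same units (e.g. ng/L).
        Returns compounds to monitor (MEC/MTL > 1), highest ratio first,
        plus the remainder, relevant only as treatment-performance
        indicators."""
        monitor = {k: mec / mtl for k, (mec, mtl) in cecs.items() if mec / mtl > 1}
        rest = [k for k in cecs if k not in monitor]
        return sorted(monitor.items(), key=lambda kv: -kv[1]), rest

    priority, indicators_only = prioritize_cecs({
        "compound A": (120.0, 50.0),   # hypothetical MEC, MTL
        "compound B": (3.0, 400.0),
    })

A final pass would then drop any prioritized compound for which no commercially available, robust analytical method exists (step 4).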
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nashold, B.; Rosenblatt, D.; Hau, J.
1995-08-01
This summary describes a Supplemental Site Inspection (SSI) conducted by Argonne National Laboratory (ANL) at Air Force Plant 59 (AFP 59) in Johnson City, New York. All required data pertaining to this project were entered by ANL into the Air Force-wide Installation Restoration Program Information System (IRPIMS) computer format and submitted to an appropriate authority. The work was sponsored by the United States Air Force as part of its Installation Restoration Program (IRP). Previous studies had revealed the presence of contaminants at the site and identified several potential contaminant sources. Argonne's study was conducted to answer questions raised by earlier investigations. This volume consists of appendices F-Q, which contain the analytical data from the site characterization.
A development of logistics management models for the Space Transportation System
NASA Technical Reports Server (NTRS)
Carrillo, M. J.; Jacobsen, S. E.; Abell, J. B.; Lippiatt, T. F.
1983-01-01
A new analytic queueing approach was described which relates stockage levels, repair level decisions, and the project network schedule of prelaunch operations directly to the probability distribution of the space transportation system launch delay. Finite source population and limited repair capability were additional factors included in this logistics management model developed specifically for STS maintenance requirements. Data presently available to support logistics decisions were based on a comparability study of heavy aircraft components. A two-phase program is recommended by which NASA would implement an integrated data collection system, assemble logistics data from previous STS flights, revise extant logistics planning and resource requirement parameters using Bayes-Lin techniques, and adjust for uncertainty surrounding logistics systems performance parameters. The implementation of these recommendations can be expected to deliver more cost-effective logistics support.
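A textbook building block of such analytic queueing models is the expected-backorder calculation for a repairable item whose pipeline (units in repair) is Poisson. The sketch below is a generic illustration in that spirit, not the report's actual model, which additionally handles finite source populations and limited repair capability.

    from math import exp, factorial

    def expected_backorders(stock, pipeline_mean, tail=200):
        """E[max(X - stock, 0)] for X ~ Poisson(pipeline_mean): the expected
        number of unfilled demands when `stock` spares are on the shelf."""
        return sum((k - stock) * exp(-pipeline_mean) * pipeline_mean ** k / factorial(k)
                   for k in range(stock + 1, stock + tail))

    # Hypothetical line-replaceable unit with a mean of 3.2 units in repair;
    # stockage analysis tabulates EBO(s) to trade spares cost against delay risk.
    ebo = {s: expected_backorders(s, 3.2) for s in range(8)}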
Multimedia Analysis plus Visual Analytics = Multimedia Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chinchor, Nancy; Thomas, James J.; Wong, Pak C.
2010-10-01
Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.
The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change
NASA Astrophysics Data System (ADS)
Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.
2015-12-01
The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in multiple domains (e.g. climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, by providing a common way to run distributed tasks on large sets of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background in high-performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g. OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework in Python-based ecosystems/applications (e.g. IPython) and the straightforward adoption of a strong set of related libraries (e.g. SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented. Such workflows will be distributed across multiple sites - according to the datasets distribution - and will include intercomparison, ensemble, and outlier analysis. The two-level workflow solution envisioned in INDIGO (coarse grain for distributed tasks orchestration, and fine grain, at the level of a single data analytics cluster instance) will be presented and discussed.
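The datacube-oriented, chunk-parallel processing style is easy to caricature in a few lines of NumPy: split a (time, lat, lon) cube into fragments and apply a reduction primitive to each independently, which is what makes server-side parallelism natural. This is a toy sketch of the storage/processing idea, not the Ophidia or PyOphidia API.

    import numpy as np

    def datacube_reduce(cube, chunk, op=np.mean):
        """Apply `op` along the time axis to each (chunk x chunk) spatial
        fragment; fragments are independent, hence trivially parallel."""
        t, ny, nx = cube.shape
        return {(i, j): op(cube[:, i:i + chunk, j:j + chunk], axis=0)
                for i in range(0, ny, chunk)
                for j in range(0, nx, chunk)}

    cube = np.random.rand(120, 64, 64)        # 120 months on a 64x64 grid
    fragments = datacube_reduce(cube, 16)     # 16 independent time-mean tiles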
NASA Astrophysics Data System (ADS)
Sarmah, Ratan; Tiwari, Shubham
2018-03-01
An analytical solution is developed for predicting two-dimensional transient seepage into a ditch drainage network receiving water from a non-uniform steady ponding field at the soil surface, under the influence of a source/sink in the flow domain. The flow domain is assumed to be saturated, homogeneous and anisotropic, with finite extents in the horizontal and vertical directions. The drains are assumed to be vertical and to penetrate to the impervious layer. The water levels in the drains are unequal and invariant with time. The flow field is also assumed to be under the continuous influence of a time- and space-dependent arbitrary source/sink term. The correctness of the proposed model is checked against a numerical code developed for the purpose and against an existing analytical solution for a simplified case. The study highlights the significance of the source/sink influence on subsurface flow. With the imposition of the source and sink term in the flow domain, the pathlines and travel times of water particles deviate from their original values; the side and top discharges to the drains are also observed to be strongly influenced by the source/sink terms. The travel times and pathlines of water particles are further observed to depend on the height of water in the ditches and on the location of the source/sink activation area.
NASA Astrophysics Data System (ADS)
Armigliato, A.
2008-07-01
In present and future CMOS technology, owing to the ever-shrinking geometries of electronic devices, the availability of techniques capable of performing quantitative analyses of the relevant parameters (structural, chemical, mechanical) at the nanoscale is of paramount importance. The influence of these features on the electrical performance of nanodevices is a key issue for the nanoelectronics industry. In recent years, significant progress has been made in this field by a number of techniques, such as X-ray diffraction (in particular with the advent of synchrotron sources), ion-microbeam-based Rutherford backscattering and channeling spectrometry, and micro-Raman spectrometry. In addition, secondary ion mass spectrometry (SIMS) has achieved an important role in the determination of dopant depth profiles in ultra-shallow junctions (USJs) in silicon. However, the technique featuring the ultimate spatial resolution (at the nanometer scale) is scanning transmission electron microscopy (STEM). This presentation reports on the STEM nanoanalysis of two very important physical quantities that need to be controlled in the fabrication of nanodevices: the dopant profile in USJs and the lattice strain generated in the electrically active Si regions of isolation structures by the various technological steps. The former quantity is investigated by the Z-contrast high-angle annular dark field (HAADF-STEM) method, whereas the mechanical strain can be two-dimensionally mapped by the convergent beam electron diffraction (CBED-STEM) method. Spatial resolutions of below one nanometer and of a few nanometers can be achieved in the two cases, respectively. To keep pace with scientific and technological progress, an increasingly wide array of analytical techniques is necessary; their complementary roles in the solution of present and future characterization problems must be exploited. Presently, however, European laboratories with high-level expertise in materials characterization still operate largely independently; this adversely affects the competitiveness of European science and industry at the international level. For this reason the European Commission started an Integrated Infrastructure Initiative (I3) in the Sixth Framework Programme (now continuing in FP7) and funded a project called ANNA (2006-2010). This acronym stands for European Integrated Activity of Excellence and Networking for Nano and Micro-Electronics Analysis. The consortium includes 12 partners from 7 European countries and is coordinated by the Fondazione B. Kessler (FBK) in Trento (Italy); CNR-IMM is one of the 12 partners. The aim of ANNA is to establish strong, long-term collaboration among the partners, so as to form an integrated multi-site analytical facility able to offer the European community a wide variety of top-level analytical expertise and services in the field of micro- and nano-electronics. These include X-ray diffraction and scattering, SIMS, electron microscopy, medium-energy ion scattering, and optical and electrical techniques.
The project will be focused on three main activities: Networking (standardization of samples and methodologies, establishment of accredited reference laboratories), Transnational Access to laboratories located in the partners' premises to perform specific analytical experiments (an example is given by the two STEM methodologies discussed above) and Joint Research activity, which is targeted at the improvement and extension of the methodologies through a continuous instrumental and technical development. It is planned that the European joint analytical laboratory will continue its activity beyond the end of the project in 2010.
Human Capital Analytics to Manage the Army Officer Population
2017-06-09
…employees from spending time and energy on a career path projected to be obsolete. Instead, managers are able to use data to show employees where they… A thesis presented to the Faculty of the U.S. Army Command and General Staff College, August 2016 - June 2017.
An Analytical Method for Measuring Competence in Project Management
ERIC Educational Resources Information Center
González-Marcos, Ana; Alba-Elías, Fernando; Ordieres-Meré, Joaquín
2016-01-01
The goal of this paper is to present a competence assessment method in project management that is based on participants' performance and value creation. It seeks to close an existing gap in competence assessment in higher education. The proposed method relies on information and communication technology (ICT) tools and combines Project Management…
Sex Differences in Objective and Projective Dependency Tests: A Meta-Analytic Review.
ERIC Educational Resources Information Center
Bornstein, Robert F.
1995-01-01
A meta-analysis of 97 studies published since 1950 that assessed sex differences in scores on objective and projective dependency tests indicated that women consistently obtained higher dependency scores on objective tests, and men obtained higher scores on projective tests. Findings are discussed in terms of sex role socialization. (SLD)
Chilled to the bone: embodied countertransference and unspoken traumatic memories.
Zoppi, Luisa
2017-11-01
Starting from a deeply challenging experience of early embodied countertransference in a first encounter with a new patient, the author explores the issues it raised. Such moments highlight projective identification as well as what Stone (2006) has described as 'embodied resonance in the countertransference'. In these powerful experiences linear time and subject boundaries are altered, and this leads to central questions about analytic work. As well as discussing the uncanny experience at the very beginning of an analytic encounter and its challenges for the analytic field, the author considers 'the time horizon of analytic process' (Hogenson), the relationship between 'moments of complexity and analytic boundaries' (Cambray) and the role of mirror neurons in intersubjective experience. © 2017, The Society of Analytical Psychology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vance, J.N.; Holderness, J.H.; James, D.W.
1992-12-01
Waste stream scaling factors based on sampling programs are vulnerable to one or more of the following factors: sample representativeness, analytic accuracy, and measurement sensitivity. As an alternative to sample analyses or as a verification of the sampling results, this project proposes the use of the RADSOURCE code, which accounts for the release of fuel-source radionuclides. Once the release rates of these nuclides from fuel are known, the code develops scaling factors for waste streams based on easily measured Cobalt-60 (Co-60) and Cesium-137 (Cs-137). The project team developed mathematical models to account for the appearance rate of 10CFR61 radionuclides in reactor coolant. They based these models on the chemistry and nuclear physics of the radionuclides involved. Next, they incorporated the models into a computer code that calculates plant waste stream scaling factors based on reactor coolant gamma-isotopic data. Finally, the team performed special sampling at 17 reactors to validate the models in the RADSOURCE code.
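Applying (as opposed to deriving) scaling factors is a one-line computation, sketched below with invented numbers; the substance of the RADSOURCE work is in the physics-based models that produce the factors.

    def scale_waste_stream(measured, scaling_factors):
        """Infer hard-to-measure 10CFR61 nuclide concentrations from the
        gamma-measurable keys. measured: {'Co-60': x, 'Cs-137': y} (e.g. Bq/g);
        scaling_factors: {nuclide: (reference_nuclide, ratio)}."""
        return {nuc: measured[ref] * ratio
                for nuc, (ref, ratio) in scaling_factors.items()}

    # Hypothetical factors: Ni-63 scaled to Co-60, Tc-99 to Cs-137
    inferred = scale_waste_stream(
        {"Co-60": 5.0e3, "Cs-137": 2.0e3},
        {"Ni-63": ("Co-60", 1.2), "Tc-99": ("Cs-137", 3.0e-4)})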
Fingering instabilities in bacterial community phototaxis
NASA Astrophysics Data System (ADS)
Vps, Ritwika; Man Wah Chau, Rosanna; Casey Huang, Kerwyn; Gopinathan, Ajay
Synechocystis sp. PCC 6803 is a phototactic cyanobacterium that moves directionally in response to a light source. During phototaxis, these bacterial communities show emergent spatial organisation resulting in the formation of finger-like projections at the propagating front. In this study, we propose an analytical model that elucidates the underlying physical mechanisms which give rise to these spatial patterns. We describe the migrating front during phototaxis as a one-dimensional curve by considering the effects of phototactic bias, diffusion and surface tension. By considering the propagating front as composed of perturbations to a flat solution and using linear stability analysis, we predict a critical bias above which the finger-like projections appear as instabilities. We also predict the wavelengths of the fastest growing mode and the critical mode above which the instabilities disappear. We validate our predictions through comparisons to experimental data obtained by analysing images of phototaxis in Synechocystis communities. Our model also predicts the observed loss of instabilities in taxd1 mutants (cells with inactive TaxD1, an important photoreceptor in finger formation), by considering diffusion in mutually perpendicular directions and a lower, negative bias.
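The abstract does not reproduce the model's equations, but its three predictions (a critical bias, a fastest-growing mode, and a short-wavelength cutoff) are the signature of a long-wave interfacial instability with a growth rate of the schematic form

    \omega(k) = (b - b_c)\,k^{2} - \gamma\,k^{4},
    \qquad k_{\max} = \sqrt{(b - b_c)/(2\gamma)},
    \qquad k_{c} = \sqrt{(b - b_c)/\gamma},

where b is the phototactic bias and γ collects the stabilizing diffusive and surface-tension effects. This form is our generic illustration of the structure of such dispersion relations, not the authors' actual result: fingers grow only for b > b_c, the pattern is dominated by the wavelength 2π/k_max, and modes with k > k_c decay.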
NASA Astrophysics Data System (ADS)
Strasser, C.; Borda, S.; Cruse, P.; Kunze, J.
2013-12-01
There are many barriers to data management and sharing among earth and environmental scientists; among the most significant are a lack of knowledge about best practices for data management, metadata standards, or appropriate data repositories for archiving and sharing data. Last year we developed an open source web application, DataUp, to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. With funding from the NSF via a supplemental grant to the DataONE project, we are working to improve upon DataUp. Our main goal for DataUp 2.0 is to ensure organizations and repositories are able to adopt and adapt DataUp to meet their unique needs, including connecting to analytical tools, adding new metadata schema, and expanding the list of connected data repositories. DataUp is a collaborative project between the California Digital Library, DataONE, the San Diego Supercomputing Center, and Microsoft Research Connections.
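A rough sketch of steps (1)-(3) in a few lines of Python is given below; the field names and the placeholder identifier scheme are our assumptions, not DataUp's actual metadata schema or identifier service.

    import csv, json, uuid

    def check_and_describe(path, creator, title):
        """Verify the file parses as CSV with a consistent column count,
        then emit minimal metadata with a locally minted placeholder ID
        (a real repository would mint a DOI or ARK)."""
        with open(path, newline="") as f:
            rows = [r for r in csv.reader(f) if r]
        widths = {len(r) for r in rows}
        if len(widths) != 1:
            raise ValueError(f"not CSV-compatible: ragged rows {sorted(widths)}")
        return json.dumps({
            "title": title,
            "creator": creator,
            "headers": rows[0],
            "record_count": len(rows) - 1,
            "identifier": f"local:{uuid.uuid4()}",
        }, indent=2)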
Biomonitoring of air pollution as exemplified by recent IAEA programs.
Smodis, B; Parr, R M
1999-01-01
Biomonitoring is an appropriate tool for assessing levels of atmospheric pollution, having several advantages over direct measurements of contaminants (e.g., in airborne particulate matter, atmospheric deposition, precipitation), related primarily to the permanent and common occurrence of the chosen organisms in the field, the ease of sampling, and trace element accumulation. Furthermore, biomonitors may provide a measure of integrated exposure over an extended period of time, are present in remote areas, and require no expensive technical equipment for their collection. They accumulate contaminants over the exposure time and concentrate them, thus facilitating analytical measurements. Based on large-scale biomonitoring surveys, polluted areas can be identified, and by applying appropriate statistical tools, information can be obtained on the type of pollution sources and on the transboundary transport of atmospheric pollutants. The International Atomic Energy Agency includes research on biomonitors in its projects on health-related environmental studies. Biomonitoring activities from several coordinated research projects on air pollution are presented, and results from an international workshop are discussed. In addition, activities in support of quality improvement in the participating laboratories are outlined.
Weykamp, Cas; Secchiero, Sandra; Plebani, Mario; Thelen, Marc; Cobbaert, Christa; Thomas, Annette; Jassam, Nuthar; Barth, Julian H; Perich, Carmen; Ricós, Carmen; Faria, Ana Paula
2017-02-01
Optimum patient care in relation to laboratory medicine is achieved when the results of laboratory tests are equivalent, irrespective of the analytical platform used or the country where the laboratory is located. Standardization and harmonization minimize differences, and the success of efforts to achieve this can be monitored with international category 1 external quality assessment (EQA) programs. An EQA project with commutable samples, targeted with reference measurement procedures (RMPs), was organized by EQA institutes in Italy, the Netherlands, Portugal, the UK, and Spain. Results of 17 general chemistry analytes were evaluated across countries and across manufacturers according to performance specifications derived from biological variation (BV). For K, uric acid, glucose, cholesterol and high-density lipoprotein (HDL) cholesterol, the minimum performance specification was met in all countries and by all manufacturers. For Na, Cl, and Ca, the minimum performance specifications were met by none of the countries and manufacturers. For enzymes, the situation was complicated, as standardization of enzyme results toward RMPs was still not achieved in 20% of the laboratories and questionable in the remaining 80%. The overall performance of the measurement of 17 general chemistry analytes in European medical laboratories met the minimum performance specifications. In this general picture, there were no significant differences per country and no significant differences per manufacturer. There were major differences between the analytes. There were six analytes for which the minimum quality specifications were not met, and manufacturers should improve their performance for these analytes. Standardization of enzyme results requires ongoing efforts.
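For readers unfamiliar with how such specifications are derived, the sketch below implements the widely used biological-variation (Fraser) formulas: allowable imprecision 0.5·CVI and bias 0.25·√(CVI²+CVG²) at the desirable tier, scaled by 1.5 for the minimum tier. The glucose CVI/CVG values are approximate illustrations; the project's exact specification tables may differ.

    def allowable_total_error(cv_i, cv_g, tier="minimum"):
        """Allowable total error (%) from within- (cv_i) and between-subject
        (cv_g) biological variation, TEa = 1.65*imprecision + bias."""
        scale = {"optimum": 0.5, "desirable": 1.0, "minimum": 1.5}[tier]
        imprecision = 0.5 * cv_i * scale
        bias = 0.25 * (cv_i ** 2 + cv_g ** 2) ** 0.5 * scale
        return 1.65 * imprecision + bias

    tea = allowable_total_error(5.6, 7.5)   # glucose, minimum tier: ~10.4%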
Climate Data Analytics Workflow Management
NASA Astrophysics Data System (ADS)
Zhang, J.; Lee, S.; Pan, L.; Mattmann, C. A.; Lee, T. J.
2016-12-01
In this project we aim to pave a novel path toward a sustainable building block for Earth science big data analytics and knowledge sharing. By closely studying how Earth scientists conduct data analytics research in their daily work, we have developed a provenance model to record their activities, and a technology to automatically generate workflows for scientists from that provenance. On top of this, we have built a prototype data-centric provenance repository and established a PDSW (People, Data, Service, Workflow) knowledge network to support workflow recommendation. To ensure the scalability and performance of the expected recommendation system, we have leveraged Apache OODT technology. The community-approved, metrics-based performance evaluation web service will allow a user to select a metric from a list of several community-approved metrics and to evaluate model performance using that metric and a reference dataset. This service will facilitate the use of reference datasets generated in support of model-data intercomparison projects such as Obs4MIPs and Ana4MIPs. The data-centric repository infrastructure will allow us to capture richer provenance to further facilitate knowledge sharing and scientific collaboration in the Earth science community. This project is part of the Apache incubator CMDA project.
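The provenance model can be pictured as a list of records linking the four PDSW node types; a minimal stand-in follows (the field names are our illustration, not the project's actual schema):

    from dataclasses import dataclass, field

    @dataclass
    class ProvenanceRecord:
        person: str                      # People
        dataset: str                     # Data
        service: str                     # Service
        parameters: dict = field(default_factory=dict)
        derived_from: list = field(default_factory=list)   # Workflow edges

    steps = [
        ProvenanceRecord("scientist-A", "obs4mips/tas", "regrid", {"res": "2x2"}),
        ProvenanceRecord("scientist-A", "regridded/tas", "rmse_score",
                         derived_from=["regrid"]),
    ]
    workflow = [s.service for s in steps]   # a trivially regenerated workflow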
Battaglia, Maurizio; Gottsmann, J.; Carbone, D.; Fernandez, J.
2008-01-01
Time-dependent gravimetric measurements can detect subsurface processes long before magma flow leads to earthquakes or other eruption precursors. The ability of gravity measurements to detect subsurface mass flow is greatly enhanced if gravity measurements are analyzed and modeled with ground-deformation data. Obtaining the maximum information from microgravity studies requires careful evaluation of the layout of network benchmarks, the gravity environmental signal, and the coupling between gravity changes and crustal deformation. When changes in the system under study are fast (hours to weeks), as in hydrothermal systems and restless volcanoes, continuous gravity observations at selected sites can help to capture many details of the dynamics of the intrusive sources. Despite the instrumental effects, mainly caused by atmospheric temperature, results from monitoring at Mt. Etna volcano show that continuous measurements are a powerful tool for monitoring and studying volcanoes. Several analytical and numerical mathematical models can be used to fit gravity and deformation data. Analytical models offer a closed-form description of the volcanic source. In principle, this allows one to readily infer the relative importance of the source parameters. In active volcanic sites such as Long Valley caldera (California, U.S.A.) and Campi Flegrei (Italy), careful use of analytical models and high-quality data sets has produced good results. However, the simplifications that make analytical models tractable might result in misleading volcanological interpretations, particularly when the real crust surrounding the source is far from the homogeneous/isotropic assumption. Using numerical models allows consideration of more realistic descriptions of the sources and of the crust where they are located (e.g., vertical and lateral mechanical discontinuities, complex source geometries, and topography). Applications at Teide volcano (Tenerife) and Campi Flegrei demonstrate the importance of this more realistic description in gravity calculations. © 2008 Society of Exploration Geophysicists. All rights reserved.
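As a concrete instance of the closed-form source models referred to above, the classic Mogi point source gives both the uplift and (with a point-mass term) the gravity change in a homogeneous half-space. These are standard textbook formulas, shown here as an illustration rather than the specific models used at the volcanoes named.

    import numpy as np

    G_NEWTON = 6.674e-11   # m^3 kg^-1 s^-2

    def mogi_uplift(r, depth, dV, nu=0.25):
        """Surface uplift (m) at radial distance r (m) from a point source of
        volume change dV (m^3) at the given depth: (1-nu)/pi * dV * d/R^3."""
        return (1 - nu) / np.pi * dV * depth / (depth ** 2 + r ** 2) ** 1.5

    def gravity_change(r, depth, dM):
        """Vertical gravity change (m/s^2) of a buried point-mass change dM
        (kg); the free-air effect of the uplift itself is added separately."""
        return G_NEWTON * dM * depth / (depth ** 2 + r ** 2) ** 1.5

    r = np.linspace(0.0, 5e3, 6)
    uz = mogi_uplift(r, depth=3e3, dV=1e6)          # hypothetical intrusion
    dg = gravity_change(r, depth=3e3, dM=2.5e9)     # dM = rho*dV, rho = 2500

Comparing the ratio dg/uz along a profile is one simple way such joint gravity/deformation analyses discriminate a mass intrusion from mere pressurization.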
A Field Study Program in Analytical Chemistry for College Seniors.
ERIC Educational Resources Information Center
Langhus, D. L.; Flinchbaugh, D. A.
1986-01-01
Describes an elective field study program at Moravian College (Pennsylvania) in which seniors in analytical chemistry obtain first-hand experience at Bethlehem Steel Corporation. Discusses the program's planning phase, some method development projects done by students, experiences received in laboratory operations, and the evaluation of student…
Reimagining Khan Analytics for Student Coaches
ERIC Educational Resources Information Center
Cunningham, Jim
2015-01-01
In this paper, I describe preliminary work on a new research project in learning analytics at Arizona State University. In conjunction with an innovative remedial mathematics course using Khan Academy and student coaches, this study seeks to measure the effectiveness of visualized data in assisting student coaches as they help remedial math…
Exploratory Analysis in Learning Analytics
ERIC Educational Resources Information Center
Gibson, David; de Freitas, Sara
2016-01-01
This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…
2013-12-10
Lyons, Sean (Major)
…Utah, Central Intelligence Agency funding of the Recorded Future Company, and the Defense Advanced Research Projects Agency XDATA project.
Performance criteria and quality indicators for the post-analytical phase.
Sciacovelli, Laura; Aita, Ada; Padoan, Andrea; Pelloso, Michela; Antonelli, Giorgia; Piva, Elisa; Chiozza, Maria Laura; Plebani, Mario
2016-07-01
Quality indicators (QIs) used as performance measurements are an effective tool in accurately estimating quality, identifying problems that may need to be addressed, and monitoring processes over time. In Laboratory Medicine, QIs should cover all steps of the testing process, as error studies have confirmed that most errors occur in the pre- and post-analytical phases of testing. The aim of the present study is to provide preliminary results on QIs and related performance criteria in the post-analytical phase. This work was conducted according to a previously described study design based on the voluntary participation of clinical laboratories in the project on QIs of the Working Group "Laboratory Errors and Patient Safety" (WG-LEPS) of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC). Overall, the data collected highlighted an improvement or stability in performance over time for all reported indicators, demonstrating that the use of QIs is effective in a quality improvement strategy. Moreover, QI data are an important source for defining the state of the art concerning the error rate in the total testing process. The definition of performance specifications based on the state of the art, as suggested by consensus documents, is a valuable benchmark in evaluating the performance of each laboratory. Laboratory tests play a relevant role in monitoring and evaluating patient outcomes, thus assisting clinicians in decision-making. Laboratory performance evaluation is therefore crucial to providing patients with safe, effective and efficient care.
NASA Technical Reports Server (NTRS)
Sutliff, Daniel L.; Walker, Bruce E.
2014-01-01
An Ultrasonic Configurable Fan Artificial Noise Source (UCFANS) was designed, built, and tested in support of the NASA Langley Research Center's 14x22 wind tunnel test of the Hybrid Wing Body (HWB) full 3-D 5.8% scale model. The UCFANS is a 5.8% rapid prototype scale model of a high-bypass turbofan engine that can generate the tonal signature of proposed engines using artificial sources (no flow). The purpose of the program was to provide an estimate of the acoustic shielding benefits possible from mounting an engine on the upper surface of a wing; a flat plate model was used as the shielding surface. Simple analytical simulations were used to preview the radiation patterns - Fresnel knife-edge diffraction was coupled with a dense phased array of point sources to compute shielded and unshielded sound pressure distributions for potential test geometries and excitation modes. Contour plots of sound pressure levels, and integrated power levels, from nacelle alone and shielded configurations for both the experimental measurements and the analytical predictions are presented in this paper.
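The knife-edge kernel of those simulations can be sketched directly from the Fresnel integrals; each phased-array point source would contribute one such term. A minimal sketch follows (the diffraction-parameter bookkeeping for the actual test geometry is omitted):

    import numpy as np
    from scipy.special import fresnel

    def knife_edge_attenuation(v):
        """Shielding attenuation in dB for Fresnel diffraction parameter v
        (v > 0: receiver in shadow). Classical result: -6 dB at v = 0."""
        S, C = fresnel(v)                     # scipy returns (S, C)
        field = (1 + 1j) / 2 * ((0.5 - C) - 1j * (0.5 - S))
        return 20.0 * np.log10(np.abs(field))

    v = np.array([0.0, 1.0, 2.4])
    att = knife_edge_attenuation(v)           # approx. [-6.0, -13.9, -20.6] dB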
The TEF modeling and analysis approach to advance thermionic space power technology
NASA Astrophysics Data System (ADS)
Marshall, Albert C.
1997-01-01
Thermionic space power systems have been proposed as advanced power sources for future space missions that require electrical power levels significantly above the capabilities of current space power systems. The Defense Special Weapons Agency's (DSWA) Thermionic Evaluation Facility (TEF) is carrying out both experimental and analytical research to advance thermionic space power technology to meet this expected need. A Modeling and Analysis (M&A) project has been created at the TEF to develop analysis tools, evaluate concepts, and guide research. M&A activities are closely linked to the TEF experimental program, providing experiment support and using experimental data to validate models. A planning exercise has been completed for the M&A project, and a strategy for implementation was developed. All M&A activities will build on a framework provided by a system performance model for a baseline Thermionic Fuel Element (TFE) concept. The system model is composed of sub-models for each of the system components and sub-systems. Additional thermionic component options and model improvements will continue to be incorporated in the basic system model during the course of the program. All tasks are organized into four focus areas: 1) system models, 2) thermionic research, 3) alternative concepts, and 4) documentation and integration. The M&A project will provide a solid framework for future thermionic system development.
System and method for laser assisted sample transfer to solution for chemical analysis
Van Berkel, Gary J; Kertesz, Vilmos
2014-01-28
A system and method for laser desorption of an analyte from a specimen and capturing of the analyte in a suspended solvent to form a testing solution are described. The method can include providing a specimen supported by a desorption region of a specimen stage and desorbing an analyte from a target site of the specimen with a laser beam centered at a radiation wavelength (λ). The desorption region is transparent to the radiation wavelength (λ) and the sampling probe and a laser source emitting the laser beam are on opposite sides of a primary surface of the specimen stage. The system can also be arranged where the laser source and the sampling probe are on the same side of a primary surface of the specimen stage. The testing solution can then be analyzed using an analytical instrument or undergo further processing.
Immobilized aptamer paper spray ionization source for ion mobility spectrometry.
Zargar, Tahereh; Khayamian, Taghi; Jafari, Mohammad T
2017-01-05
A selective thin-film microextraction based on an aptamer immobilized on cellulose paper was used as a paper spray ionization source for ion mobility spectrometry (PSI-IMS), for the first time. In this method, the paper is not only used as the ionization source but is also utilized for the selective extraction of the analyte, based on the immobilized aptamer. This combination integrates both sample preparation and analyte ionization on a Whatman paper. To that end, an appropriate sample introduction system with a novel design was constructed for the paper spray ionization source. Using this system, a continuous solvent flow serves simultaneously as elution and spray solvent. The analyte is adsorbed on a triangular paper with immobilized aptamer and is then desorbed by the elution solvent and ionized by the high voltage applied to the paper. The effects of different experimental parameters such as applied voltage, angle of the paper tip, distance between the paper tip and the counter electrode, elution solvent type, and solvent flow rate were optimized. The proposed method was exhaustively validated in terms of sensitivity and reproducibility by analyzing standard solutions of codeine and acetamiprid. The analytical results obtained are promising enough to support the use of immobilized-aptamer paper spray as both the extraction and ionization technique in IMS for the direct analysis of biomedicines. Copyright © 2016 Elsevier B.V. All rights reserved.
Hybrid FSAE Vehicle Realization
DOT National Transportation Integrated Search
2010-12-01
The goal of this multi-year project is to create a fully functional University of Idaho entry in the hybrid FSAE competition. Vehicle integration is underway as part of a variety of 2010-11 senior design projects. This leverages a variety of analytic...
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
BLUES function method in computational physics
NASA Astrophysics Data System (ADS)
Indekeu, Joseph O.; Müller-Nedebock, Kristian K.
2018-04-01
We introduce a computational method in physics that goes ‘beyond linear use of equation superposition’ (BLUES). A BLUES function is defined as a solution of a nonlinear differential equation (DE) with a delta source that is at the same time a Green’s function for a related linear DE. For an arbitrary source, the BLUES function can be used to construct an exact solution to the nonlinear DE with a different, but related source. Alternatively, the BLUES function can be used to construct an approximate piecewise analytical solution to the nonlinear DE with an arbitrary source. For this alternative use the related linear DE need not be known. The method is illustrated in a few examples using analytical calculations and numerical computations. Areas for further applications are suggested.
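In symbols (our rendering, consistent with the abstract): let N_x be the nonlinear operator and L_x the related linear one. The BLUES function B satisfies

    N_x B(x) = \delta(x)  and  L_x B(x) = \delta(x).

For an arbitrary source f, the convolution u = B * f obeys L_x u = f exactly, and therefore

    N_x (B * f)(x) = f(x) + [(N_x - L_x)(B * f)](x) \equiv \hat{f}(x),

so B * f is an exact solution of the nonlinear DE with the related source \hat{f}; treating (N_x - L_x)(B * f) as a correction and iterating yields the approximate piecewise analytical solutions for the original source f.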
Description of small-scale fluctuations in the diffuse X-ray background.
NASA Technical Reports Server (NTRS)
Cavaliere, A.; Friedland, A.; Gursky, H.; Spada, G.
1973-01-01
An analytical study of the fluctuations on a small angular scale expected in the diffuse X-ray background in the presence of unresolved sources is presented. The source population is described by a function N(S), giving the number of sources per unit solid angle and unit apparent flux S. The distribution of observed flux, s, in each angular resolution element of a complete sky survey is represented by a function Q(s). The analytical relation between the successive, higher-order moments of N(S) and Q(s) is described. The goal of reconstructing the source population from the study of the moments of Q(s) of order higher than the second (i.e., the rms fluctuations) is discussed.
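The relation referred to is the classical confusion ("P(D)") analysis result: for Poisson-distributed sources observed through a beam profile f(θ, φ), the k-th cumulant of Q(s) is

    \kappa_k[Q] = \int f(\theta, \phi)^k \, d\Omega \; \int_0^{S_{max}} S^k N(S) \, dS,

so each higher moment of the observed fluctuations constrains a correspondingly higher moment of the source counts, with k = 2 giving the familiar rms-fluctuation estimate. The notation here is ours; the paper's own development should be consulted for its exact conventions.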
NASA Astrophysics Data System (ADS)
Voelker, D.; De Martini, P. M.; Lastras, G.; Patera, A.; Hunt, J.; Terrinha, P.; Noiva, J.; Gutscher, M. A.; Migeon, S.
2015-12-01
EU project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe, Project number: 603839) aims at reaching a higher level of tsunami resilience in the North East Atlantic and Mediterranean (NEAM) region by a combination of field work, experimental work, numerical modeling and technical development. The project is a cooperative work of 26 institutes from 16 countries and links together the description of past tsunamigenic events, the characterization of tsunami sources, the calculation of the impact of such events, and the development of adequate resilience strategies (www.astarte.eu). Within ASTARTE a web-based database on Mass Transport Deposits (MTDs) in the NEAM area is being created that is intended to become the reference source for this kind of research in Europe. The aim is to integrate every existing scientific reference on the topic and to add new entries every 3 months, hosting information and detailed data that are crucial, e.g., for tsunami modeling. A relational database managed by ArcGIS for Desktop 10.3 software has been implemented to allow all partners to collaborate through a common platform for archiving and exchanging data and interpretations, such as MTD typology (slide, slump, debris flow, turbidite, etc.), geometric characteristics (location, depth, thickness, volume, slope, etc.), as well as age, dating method and, where assessed, tsunamigenic potential. One of the final goals of the project is the sharing of the archived datasets through a web-based map service that will allow users to visualize, query, analyze, and interpret all datasets. The interactive map service will be hosted by ArcGIS Online and will deploy the cloud capabilities of the portal. Any interested user will be able to access the online GIS resources through any Internet browser or ad hoc applications that run on desktop machines, smartphones, or tablets, and will be able to use the analytical tools, key tasks, and workflows of the service.
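One record of such a relational schema might look like the following sketch (field names, types and sample values are our guesses for illustration, not the project's actual schema):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class MTDRecord:
        name: str
        typology: str                 # slide, slump, debris flow, turbidite...
        lat: float
        lon: float
        depth_m: float
        thickness_m: Optional[float] = None
        volume_km3: Optional[float] = None
        age_ka: Optional[float] = None
        dating_method: Optional[str] = None
        tsunamigenic_potential: Optional[str] = None

    record = MTDRecord("example deposit", "debris flow", 38.5, 0.9, 1200.0,
                       volume_km3=26.0, age_ka=11.5, dating_method="14C")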
Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W
2017-02-01
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds were measured but did not consistently meet predetermined quality standards. Methodologies that proved unsuitable for these analytes are reviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.
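The backbone of the spiked-recovery comparison is a simple percentage; a sketch with invented numbers and an illustrative acceptance window follows (the study's actual criteria were method-specific):

    def percent_recovery(spiked_result, ambient_result, spike_added):
        """Recovery (%) of a known spike over the ambient concentration,
        all three arguments in the same units."""
        return 100.0 * (spiked_result - ambient_result) / spike_added

    rec = percent_recovery(spiked_result=95.0, ambient_result=12.0,
                           spike_added=100.0)          # 83%
    acceptable = 70.0 <= rec <= 130.0                  # hypothetical window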
Analytical Chemistry Developmental Work Using a 243Am Solution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Khalil J.; Stanley, Floyd E.; Porterfield, Donivan R.
2015-02-24
This project seeks to reestablish our analytical capability to characterize Am bulk material and to develop a reference material suitable for characterizing the purity and assay of 241Am oxide for industrial use. The tasks associated with this phase of the project included conducting initial separations experiments, developing thermal ionization mass spectrometry capability using the 243Am isotope as an isotope dilution spike, optimizing the spike for the determination of 241Pu-241Am radiochemistry, and, additionally, developing and testing a methodology that can detect trace to ultra-trace levels of Pu (both assay and isotopics) in bulk Am samples.
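The core of the isotope-dilution measurement is one ratio: spike a sample with (nominally pure) 243Am, measure the blended 241Am/243Am ratio by TIMS, and scale the known spike amount. A deliberately bare sketch follows; real work corrects for spike impurity, instrumental mass bias, and decay.

    def id_amount_241am(n243_spike_mol, ratio_241_243_blend):
        """Moles of 241Am in the sample, from the known moles of a pure 243Am
        spike and the measured 241/243 atom ratio of the blend."""
        return n243_spike_mol * ratio_241_243_blend

    n241 = id_amount_241am(1.00e-7, 4.72)   # hypothetical spike and ratio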
Performance of a Fuel-Cell-Powered, Small Electric Airplane Assessed
NASA Technical Reports Server (NTRS)
Berton, Jeffrey J.
2004-01-01
Rapidly emerging fuel-cell-power technologies may be used to launch a new revolution of electric propulsion systems for light aircraft. Future small electric airplanes using fuel cell technologies hold the promise of high reliability, low maintenance, low noise, and - with the exception of water vapor - zero emissions. NASA Glenn Research Center's Airbreathing Systems Analysis Office conducted an analytical feasibility and performance assessment of a fuel-cell-powered, propeller-driven small electric airplane based on a model of the MCR-01 two-place kitplane (Dyn'Aero, Darois, France). This assessment was conducted in parallel with an ongoing effort by the Advanced Technology Products Corporation and the Foundation for Advancing Science and Technology Education. Their project - partially funded by a NASA grant - is to design, build, and fly the first manned, continuously propelled, nongliding electric airplane. In our study, an analytical performance model of a proton exchange membrane (PEM) fuel cell propulsion system was developed and applied to a notional, two-place light airplane modeled after the MCR-01 kitplane. The PEM fuel cell stack was fed pure hydrogen fuel and humidified ambient air via a small automotive centrifugal supercharger. The fuel cell performance models were based on chemical reaction analyses calibrated with published data from the fledgling U.S. automotive fuel cell industry. Electric propeller motors, rated at two shaft power levels in separate assessments, were used to directly drive a two-bladed, variable-pitch propeller. Fuel sources considered were compressed hydrogen gas and cryogenic liquid hydrogen. Both of these fuel sources provided pure, contaminant-free hydrogen for the PEM cells.
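A common empirical form for this kind of PEM stack performance model is the Larminie-Dicks polarization curve, sketched below in Python; the coefficient values are illustrative placeholders, not the calibration used in the NASA study.

```python
import math

def cell_voltage(i, E_oc=1.0, A=0.03, i0=0.04, r=2.45e-4, m=2.1e-5, n=8e-3):
    """Cell voltage [V] at current density i [mA/cm^2] (i > i0):
    open-circuit voltage minus activation, ohmic, and concentration
    losses (Larminie-Dicks empirical form; coefficients illustrative)."""
    return (E_oc
            - A * math.log(i / i0)   # activation (Tafel) loss
            - r * i                  # ohmic loss
            - m * math.exp(n * i))   # concentration (mass-transport) loss

# One operating point; stack power is then voltage * current * cell count
print(cell_voltage(600.0))
```

Stack voltage times current and cell count is the level at which such a model feeds a propulsion sizing and aircraft performance assessment.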
NASA Astrophysics Data System (ADS)
Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.
2012-12-01
The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well-suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes, including a forward model (e.g., MODFLOW, HYDRUS) and geostatistical analysis (e.g., R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with the core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open source nature of the project will engender the development of additional model drivers by third-party scientists.
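The driver concept described above - one uniform interface that every external forward model implements - can be sketched as follows. The actual MAD-GIS implementation uses .NET/MEF; the Python class and method names here are invented purely for illustration.

```python
from abc import ABC, abstractmethod

class ForwardModelDriver(ABC):
    """Hypothetical analog of a MAD-GIS driver: each external forward
    model (MODFLOW, HYDRUS, ...) is hidden behind one interface."""

    @abstractmethod
    def write_inputs(self, parameter_field, workspace):
        """Translate a candidate parameter field into model input files."""

    @abstractmethod
    def run(self, workspace):
        """Invoke the external model executable."""

    @abstractmethod
    def read_outputs(self, workspace):
        """Return simulated values at observation/anchor points."""

class ModflowDriver(ForwardModelDriver):
    def write_inputs(self, parameter_field, workspace): ...
    def run(self, workspace): ...
    def read_outputs(self, workspace): ...
```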
ERIC Educational Resources Information Center
Kapor, Mitchell
2005-01-01
Open source software projects involve the production of goods, but in software projects, the "goods" consist of information. The open source model is an alternative to the conventional centralized, command-and-control way in which things are usually made. In contrast, open source projects are genuinely decentralized and transparent. Transparent…
MRMPlus: an open source quality control and assessment tool for SRM/MRM assay development.
Aiyetan, Paul; Thomas, Stefani N; Zhang, Zhen; Zhang, Hui
2015-12-12
Selected and multiple reaction monitoring involves monitoring a multiplexed assay of proteotypic peptides and associated transitions in mass spectrometry runs. To describe peptides and associated transitions as stable, quantifiable, and reproducible representatives of proteins of interest, experimental and analytical validation is required. However, inadequate and disparate analytical tools and validation methods predispose assay performance measures to errors and inconsistencies. Implemented as a freely available, open-source tool in the platform-independent Java programming language, MRMPlus computes analytical measures as recommended recently by the Clinical Proteomics Tumor Analysis Consortium Assay Development Working Group for "Tier 2" assays - that is, non-clinical assays sufficient to measure changes due to both biological and experimental perturbations. Computed measures include: limit of detection, lower limit of quantification, linearity, carry-over, partial validation of specificity, and upper limit of quantification. MRMPlus streamlines the assay development analytical workflow and therefore minimizes error predisposition. MRMPlus may also be used for performance estimation for targeted assays not described by the Assay Development Working Group. MRMPlus source code and compiled binaries can be freely downloaded from https://bitbucket.org/paiyetan/mrmplusgui and https://bitbucket.org/paiyetan/mrmplusgui/downloads, respectively.
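Two of the measures listed can be illustrated with the common calibration-based conventions (a generic 3.3σ/10σ sketch, not necessarily MRMPlus's exact algorithm):

```python
import statistics

def lod_loq(blank_responses, calibration_slope):
    """Limit of detection and lower limit of quantification from the
    standard deviation of blank responses and the calibration slope
    (the widely used 3.3*sigma and 10*sigma conventions)."""
    sigma = statistics.stdev(blank_responses)
    return 3.3 * sigma / calibration_slope, 10.0 * sigma / calibration_slope

lod, lloq = lod_loq([0.8, 1.1, 0.9, 1.0, 1.2], calibration_slope=0.05)
```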
An Open-source Community Web Site To Support Ground-Water Model Testing
NASA Astrophysics Data System (ADS)
Kraemer, S. R.; Bakker, M.; Craig, J. R.
2007-12-01
A community wiki web site has been created as a resource to support ground-water model development and testing. The Groundwater Gourmet wiki is a repository for user-supplied analytical and numerical recipes, how-tos, and examples. Members are encouraged to submit analytical solutions, including source code and documentation. A diversity of code snippets is sought in a variety of languages, including Fortran, C, C++, Matlab, and Python. In the spirit of a wiki, all contributions may be edited and altered by other users, and open source licensing is promoted. Community-accepted contributions are graduated into the library of analytic solutions and organized into either a Strack (Groundwater Mechanics, 1989) or Bruggeman (Analytical Solutions of Geohydrological Problems, 1999) classification. The examples section of the wiki is meant to include laboratory experiments (e.g., Hele-Shaw), classical benchmark problems (e.g., the Henry problem), and controlled field experiments (e.g., the Borden landfill and Cape Cod tracer tests). Although this work was reviewed by EPA and approved for publication, it may not necessarily reflect official Agency policy. Mention of trade names or commercial products does not constitute endorsement or recommendation for use.
Temperature distribution of a simplified rotor due to a uniform heat source
NASA Astrophysics Data System (ADS)
Welzenbach, Sarah; Fischer, Tim; Meier, Felix; Werner, Ewald; kyzy, Sonun Ulan; Munz, Oliver
2018-03-01
In gas turbines, high combustion efficiency as well as operational safety are required. Thus, labyrinth seal systems with honeycomb liners are commonly used. In the case of rubbing events in the seal system, the components can be damaged due to cyclic thermal and mechanical loads. Temperature differences occurring at labyrinth seal fins during rubbing events can be determined by considering a single heat source acting periodically on the surface of a rotating cylinder. Existing literature analysing the temperature distribution on rotating cylindrical bodies due to a stationary heat source is reviewed. The temperature distribution on the circumference of a simplified labyrinth seal fin is calculated using an available and easy-to-implement analytical approach. A finite element model of the simplified labyrinth seal fin is created and the numerical results are compared to the analytical results. The temperature distributions calculated by the analytical and the numerical approaches coincide for low sliding velocities, while there are discrepancies in the calculated maximum temperatures for higher sliding velocities. The use of the analytical approach allows the conservative estimation of the maximum temperatures arising in labyrinth seal fins during rubbing events. At the same time, high calculation costs can be avoided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Consonni, Stefano, E-mail: stefano.consonni@polimi.it; Giugliano, Michele; Massarutto, Antonio
Highlights: > The source separation level (SSL) of a waste management system does not adequately qualify the system. > Separately collecting organic waste gives fewer advantages than separately collecting packaging materials. > Recycling packaging materials (metals, glass, plastics, paper) is always attractive. > Composting and anaerobic digestion of organic waste give questionable outcomes. > The critical threshold of optimal recycling seems to be an SSL of 50%. - Abstract: This paper describes the context, the basic assumptions and the main findings of a joint research project aimed at identifying the optimal breakdown between material recovery and energy recovery from municipal solid waste (MSW) in the framework of integrated waste management systems (IWMS). The project was carried out from 2007 to 2009 by five research groups at Politecnico di Milano, the Universities of Bologna and Trento, and Bocconi University (Milan), with funding from the Italian Ministry of Education, University and Research (MIUR). Since the optimization of IWMSs by analytical methods is practically impossible, the search for the most attractive strategy was carried out by comparing a number of relevant recovery paths from the point of view of mass and energy flows, technological features, environmental impact and economics. The main focus has been on mature processes applicable to MSW in Italy and Europe. Results show that, contrary to a rather widespread opinion, increasing the source separation level (SSL) has a very marginal effect on energy efficiency. What does generate very significant variations in energy efficiency is scale, i.e. the size of the waste-to-energy (WTE) plant. The mere value of SSL is inadequate to qualify the recovery system. The energy and environmental outcome of recovery depends not only on 'how much' source separation is carried out, but rather on 'how' a given SSL is reached.
NASA Astrophysics Data System (ADS)
Priya, Anjali; Mishra, Ram Awadh
2016-04-01
In this paper, an analytical model of the surface potential is proposed for a new Triple Metal Gate (TMG) fully depleted Recessed-Source/Drain Silicon On Insulator (SOI) Metal Oxide Semiconductor Field Effect Transistor (MOSFET). The metal with the highest work function is arranged near the source region and the one with the lowest work function near the drain. A Recessed-Source/Drain SOI MOSFET has a higher drain current than a conventional SOI MOSFET because of its larger source and drain regions. The surface potential model, developed from the 2D Poisson equation, is verified by comparison with simulation results from the 2-dimensional ATLAS simulator. The model is compared with Dual Metal Gate (DMG) and Single Metal Gate (SMG) devices and analysed for different device parameters. The ratio of the metal gate lengths is varied to optimize the result.
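Surface-potential derivations of this kind conventionally start from the 2D Poisson equation in the fully depleted silicon film; the standard form (boundary conditions and a parabolic potential approximation complete the derivation) is:

```latex
\frac{\partial^2 \phi(x,y)}{\partial x^2}
+ \frac{\partial^2 \phi(x,y)}{\partial y^2}
= \frac{q\,N_A}{\varepsilon_{\mathrm{Si}}},
\qquad 0 \le x \le L,\quad 0 \le y \le t_{\mathrm{Si}},
```

where N_A is the film doping, L the channel length, and t_Si the silicon film thickness.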
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuchman, Nancy
The U.S. Department of Energy awarded Loyola University Chicago and the Institute of Environmental Sustainability (IES) $486,000.00 for the proposal entitled “Chicago clean air, clean water project: Environmental monitoring for a healthy, sustainable urban future.” The project supported the purchase of analytical instruments for the development of an environmental analytical laboratory. The analytical laboratory is designed to support the testing of field water and soil samples for nutrients, industrial pollutants, heavy metals, and agricultural toxins, with special emphasis on testing Chicago regional soils and water affected by coal-based industry. Since the award was made in 2010, the IES has been launched (fall 2013), and the IES acquired a new state-of-the-art research and education facility on Loyola University Chicago’s Lakeshore campus. Two labs were included in the research and education facility. The second floor lab is the Ecology Laboratory, where lab experiments and analyses are conducted on soil, plant, and water samples. The third floor lab is the Environmental Toxicology Lab, where lab experiments on environmental toxins are conducted, as well as analytical tests on water, soil, and plants. On the south end of the Environmental Toxicology Lab is the analytical instrumentation collection purchased with the present DOE grant, which is overseen by a full-time Analytical Chemist (hired January 2016), who maintains the instruments, conducts analyses on samples, and helps to train faculty and undergraduate and graduate student researchers.
Wang, Jun-Wen; Liu, Yang; Tong, Yuan-Yuan; Yang, Ce; Li, Hai-Yan
2016-05-01
This study collected the molecular pharmacognosy projects funded by the National Natural Science Foundation of China (NSFC) from 1995 to 2014, 595 items in total. TDA and Excel software were used to analyze the general profile and research hot spots of these projects with rank-analysis and correlation-analysis methods. The number of NSFC-funded molecular pharmacognosy projects and the amount of funding increased gradually, while the proportion of funding for pharmaceutical research tended to be stable. The funded projects mainly applied molecular biology methods to the study of genuine medicinal materials, secondary metabolism, and germplasm resources. Hot drugs included Radix Salviae Miltiorrhizae, Radix Rehmanniae, and Cordyceps sinensis; hot topics included tanshinone biosynthesis and the continuous cropping obstacle of Rehmannia glutinosa. Copyright© by the Chinese Pharmaceutical Association.
Second derivatives for approximate spin projection methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, Lee M.; Hratchian, Hrant P., E-mail: hhratchian@ucmerced.edu
2015-02-07
The use of broken-symmetry electronic structure methods is required in order to obtain correct behavior of electronically strained open-shell systems, such as transition states, biradicals, and transition metals. This approach often has issues with spin contamination, which can lead to significant errors in predicted energies, geometries, and properties. Approximate projection schemes are able to correct for spin contamination and can often yield improved results. To fully make use of these methods and to carry out exploration of the potential energy surface, it is desirable to develop an efficient second energy derivative theory. In this paper, we formulate the analytical second derivatives for the Yamaguchi approximate projection scheme, building on recent work that has yielded an efficient implementation of the analytical first derivatives.
The Multi-SAG project: filling the MultiDark simulations with semi-analytic galaxies
NASA Astrophysics Data System (ADS)
Vega-Martínez, C. A.; Cora, S. A.; Padilla, N. D.; Muñoz Arancibia, A. M.; Orsi, A. A.; Ruiz, A. N.
2016-08-01
The semi-analytic model SAG is a galaxy formation and evolution code that is applied to halo catalogs and merger trees extracted from cosmological N-body simulations of dark matter. This contribution describes the project of constructing a catalog of simulated galaxies by adapting and applying the SAG model to two publicly available dark matter simulations of the Spanish MultiDark Project. Those simulations use Planck cosmological parameters in boxes with sizes of 1000 Mpc and 400 Mpc, respectively. They cover a large range of masses, and each simulation is able to produce more than 150 million simulated galaxies. A detailed description of the method is given, and the first statistical results are shown.
NASA Astrophysics Data System (ADS)
Barrett, Steven R. H.; Britter, Rex E.
Predicting long-term mean pollutant concentrations in the vicinity of airports, roads and other industrial sources is frequently of concern in regulatory and public health contexts. Many emissions are represented geometrically as ground-level line or area sources. Well-developed modelling tools such as AERMOD and ADMS are able to model dispersion from finite (i.e. non-point) sources with considerable accuracy, drawing upon an up-to-date understanding of boundary layer behaviour. Due to mathematical difficulties associated with line and area sources, computationally expensive numerical integration schemes have been developed. For example, some models decompose area sources into a large number of line sources orthogonal to the mean wind direction, for which an analytical (Gaussian) solution exists. Models also employ a time-series approach, which involves computing mean pollutant concentrations for every hour over one or more years of meteorological data. This can give rise to computer runtimes of several days for assessment of a site. While this may be acceptable for assessment of a single industrial complex, airport, etc., this level of computational cost precludes national or international policy assessments at the level of detail available with dispersion modelling. In this paper, we extend previous work [S.R.H. Barrett, R.E. Britter, 2008. Development of algorithms and approximations for rapid operational air quality modelling. Atmospheric Environment 42 (2008) 8105-8111] to line and area sources. We introduce approximations which allow for the development of new analytical solutions for long-term mean dispersion from line and area sources, based on hypergeometric functions. We describe how these solutions can be parameterized from a single point source run from an existing advanced dispersion model, thereby accounting for all processes modelled in the more costly algorithms. The parameterization method combined with the analytical solutions for long-term mean dispersion is shown to produce results several orders of magnitude more efficiently, with a loss of accuracy that is small compared to the absolute accuracy of advanced dispersion models near sources. The method can be readily incorporated into existing dispersion models, and may allow for additional computation time to be expended on modelling dispersion processes more accurately in future, rather than on accounting for source geometry.
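The cost the authors set out to remove comes from integrals of Gaussian point-source kernels over the source geometry. A minimal sketch of the brute-force decomposition for a crosswind line source (an idealized ground-level Gaussian plume with fixed dispersion parameters, not the AERMOD/ADMS formulation); for a long line it recovers the analytic limit sqrt(2/pi)*q/(u*sigma_z):

```python
import math

def point_conc(Q, u, y, sigma_y, sigma_z):
    """Ground-level concentration from a ground-level point source at
    crosswind offset y (Gaussian plume with full ground reflection)."""
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2 * sigma_y**2)))

def line_conc(q, u, half_len, sigma_y, sigma_z, n=2000):
    """Integrate point-source kernels along a crosswind line of strength
    q per unit length -- the costly decomposition the text describes."""
    dy = 2 * half_len / n
    return sum(point_conc(q * dy, u, -half_len + (k + 0.5) * dy,
                          sigma_y, sigma_z)
               for k in range(n))

# A long line approaches the analytic limit sqrt(2/pi) * q / (u * sigma_z)
q, u, sy, sz = 1.0, 5.0, 30.0, 20.0
print(line_conc(q, u, 5000.0, sy, sz), math.sqrt(2 / math.pi) * q / (u * sz))
```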
Hobbie, Kevin A; Peterson, Elena S; Barton, Michael L; Waters, Katrina M; Anderson, Kim A
2012-08-01
Large collaborative centers are a common model for accomplishing integrated environmental health research. These centers often include various types of scientific domains (e.g., chemistry, biology, bioinformatics) that are integrated to solve some of the nation's key economic or public health concerns. The Superfund Research Center (SRP) at Oregon State University (OSU) is one such center established in 2008 to study the emerging health risks of polycyclic aromatic hydrocarbons while using new technologies both in the field and laboratory. With outside collaboration at remote institutions, success for the center as a whole depends on the ability to effectively integrate data across all research projects and support cores. Therefore, the OSU SRP center developed a system that integrates environmental monitoring data with analytical chemistry data and downstream bioinformatics and statistics to enable complete "source-to-outcome" data modeling and information management. This article describes the development of this integrated information management system that includes commercial software for operational laboratory management and sample management in addition to open-source custom-built software for bioinformatics and experimental data management.
DOT National Transportation Integrated Search
2010-10-01
The Volvo-Ford-UMTRI project: Safety Impact Methodology (SIM) for Lane Departure Warning is part of the U.S. Department of Transportation's Advanced Crash Avoidance Technologies (ACAT) program. The project developed a basic analytical framework for e...
An Introduction to Project PRIME and CAMPUS MINNESOTA. Project PRIME Report, Number 2.
ERIC Educational Resources Information Center
Cordes, David C.
PRIME is an acronym for Planning Resources in Minnesota Education. The project's primary objective is to test the implementation of CAMPUS (Comprehensive Analytical Methods for Planning University Systems) in one State College, one Junior College, and in one school at the University of Minnesota. The CAMPUS model was developed by the Institute for…
The Challenge of Separating Effects of Simultaneous Education Projects on Student Achievement
ERIC Educational Resources Information Center
Ma, Xin; Ma, Lingling
2009-01-01
When multiple education projects operate in an overlapping or rear-ended manner, it is always a challenge to separate unique project effects on schooling outcomes. Our analysis represents a first attempt to address this challenge. A three-level hierarchical linear model (HLM) was presented as a general analytical framework to separate program…
Alcohol safety action projects evaluation of operations : data, table of results, and formulation
DOT National Transportation Integrated Search
1979-06-01
This volume contains the data used in the evaluation of 35 Alcohol Safety Action Projects implemented throughout the country. Historical background, discussion of analytic results and factors affecting impact detection are contained in the document ti...
Validation of urban freeway models. [supporting datasets
DOT National Transportation Integrated Search
2015-01-01
The goal of the SHRP 2 Project L33 Validation of Urban Freeway Models was to assess and enhance the predictive travel time reliability models developed in the SHRP 2 Project L03, Analytic Procedures for Determining the Impacts of Reliability Mitigati...
Ludtke, Amy S.; Woodworth, Mark T.; Marsh, Philip S.
2000-01-01
The U.S. Geological Survey operates a quality-assurance program based on the analyses of reference samples for two laboratories: the National Water Quality Laboratory and the Quality of Water Service Unit. Reference samples that contain selected inorganic, nutrient, and low-level constituents are prepared and submitted to the laboratory as disguised routine samples. The program goal is to estimate precision and bias for as many analytical methods offered by the participating laboratories as possible. Blind reference samples typically are submitted at a rate of 2 to 5 percent of the annual environmental-sample load for each constituent. The samples are distributed to the laboratories throughout the year. The reference samples are subject to the identical laboratory handling, processing, and analytical procedures as those applied to environmental samples and, therefore, have been used as an independent source to verify bias and precision of laboratory analytical methods and ambient water-quality measurements. The results are stored permanently in the National Water Information System and the Blind Sample Project's database. During water year 1998, 95 analytical procedures were evaluated at the National Water Quality Laboratory and 63 analytical procedures were evaluated at the Quality of Water Service Unit. An overall evaluation of the inorganic and low-level constituent data for water year 1998 indicated 77 of 78 analytical procedures at the National Water Quality Laboratory met the criteria for precision. Silver (dissolved, inductively coupled plasma-mass spectrometry) was determined to be imprecise. Five of 78 analytical procedures showed bias throughout the range of reference samples: chromium (dissolved, inductively coupled plasma-atomic emission spectrometry), dissolved solids (dissolved, gravimetric), lithium (dissolved, inductively coupled plasma-atomic emission spectrometry), silver (dissolved, inductively coupled plasma-mass spectrometry), and zinc (dissolved, inductively coupled plasma-mass spectrometry). At the National Water Quality Laboratory during water year 1998, lack of precision was indicated for 2 of 17 nutrient procedures: ammonia as nitrogen (dissolved, colorimetric) and orthophosphate as phosphorus (dissolved, colorimetric). Bias was indicated throughout the reference sample range for ammonia as nitrogen (dissolved, colorimetric, low level) and nitrate plus nitrite as nitrogen (dissolved, colorimetric, low level). All analytical procedures tested at the Quality of Water Service Unit during water year 1998 met the criteria for precision. One of the 63 analytical procedures indicated a bias throughout the range of reference samples: aluminum (whole-water recoverable, inductively coupled plasma-atomic emission spectrometry, trace).
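The precision and bias screening described here amounts to simple statistics over the disguised reference-sample results; a schematic sketch with illustrative acceptance limits (not the program's actual criteria):

```python
import statistics

def evaluate_method(reported, true_value, rsd_limit=10.0):
    """Relative bias (%) and relative standard deviation (%) of blind
    reference-sample results; the RSD limit here is illustrative only."""
    mean = statistics.mean(reported)
    bias = 100.0 * (mean - true_value) / true_value
    rsd = 100.0 * statistics.stdev(reported) / mean
    return bias, rsd, rsd <= rsd_limit

bias, rsd, is_precise = evaluate_method([9.8, 10.4, 10.1, 9.7], true_value=10.0)
```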
Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.
Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs
2018-01-01
While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
NASA Technical Reports Server (NTRS)
Mcnulty, J. F.
1974-01-01
An analysis of the history and background of the Mars Project Viking is presented. The organization and functions of the engineering group responsible for the project are defined. The design and configuration of the proposed space vehicle are examined. Illustrations and tables of data are provided to complete the coverage of the project.
Cumulative biological impacts framework for solar energy projects in the California Desert
Davis, Frank W.; Kreitler, Jason R.; Soong, Oliver; Stoms, David M.; Dashiell, Stephanie; Hannah, Lee; Wilkinson, Whitney; Dingman, John
2013-01-01
This project developed analytical approaches, tools and geospatial data to support conservation planning for renewable energy development in the California deserts. Research focused on geographical analysis to avoid, minimize and mitigate the cumulative biological effects of utility-scale solar energy development. A hierarchical logic model was created to map the compatibility of new solar energy projects with current biological conservation values. The research indicated that the extent of compatible areas is much greater than the estimated land area required to achieve 2040 greenhouse gas reduction goals. Species distribution models were produced for 65 animal and plant species that were of potential conservation significance to the Desert Renewable Energy Conservation Plan process. These models mapped historical and projected future habitat suitability using 270 meter resolution climate grids. The results were integrated into analytical frameworks to locate potential sites for offsetting project impacts and evaluating the cumulative effects of multiple solar energy projects. Examples applying these frameworks in the Western Mojave Desert ecoregion show the potential of these publicly-available tools to assist regional planning efforts. Results also highlight the necessity to explicitly consider projected land use change and climate change when prioritizing areas for conservation and mitigation offsets. Project data, software and model results are all available online.
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, A.G.
The Pacific Northwest Laboratory (PNL)/Analytical Chemistry Laboratory (ACL) and the Westinghouse Hanford Company (WHC)/Process Analytical Laboratory (PAL) provide analytical support services to various environmental restoration and waste management projects/programs at Hanford. In response to a US Department of Energy -- Richland Field Office (DOE-RL) audit, which questioned the comparability of analytical methods employed at each laboratory, the Sample Exchange/Exchange (SEE) program was initiated. The SEE Program is a self-assessment program designed to compare analytical methods of the PAL and ACL laboratories using site-specific waste material. The SEE program is managed by a collaborative, the Quality Assurance Triad (Triad). Triad membership is made up of representatives from the WHC/PAL, PNL/ACL, and WHC Hanford Analytical Services Management (HASM) organizations. The Triad works together to design/evaluate/implement each phase of the SEE Program.
Comparison of Three Plasma Sources for Ambient Desorption/Ionization Mass Spectrometry
NASA Astrophysics Data System (ADS)
McKay, Kirsty; Salter, Tara L.; Bowfield, Andrew; Walsh, James L.; Gilmore, Ian S.; Bradley, James W.
2014-09-01
Plasma-based desorption/ionization sources are an important ionization technique for ambient surface analysis mass spectrometry. In this paper, we compare and contrast three competing plasma based desorption/ionization sources: a radio-frequency (rf) plasma needle, a dielectric barrier plasma jet, and a low-temperature plasma probe. The ambient composition of the three sources and their effectiveness at analyzing a range of pharmaceuticals and polymers were assessed. Results show that the background mass spectrum of each source was dominated by air species, with the rf needle producing a richer ion spectrum consisting mainly of ionized water clusters. It was also seen that each source produced different ion fragments of the analytes under investigation: this is thought to be due to different substrate heating, different ion transport mechanisms, and different electric field orientations. The rf needle was found to fragment the analytes least and as a result it was able to detect larger polymer ions than the other sources.
THE IMPACT OF POINT-SOURCE SUBTRACTION RESIDUALS ON 21 cm EPOCH OF REIONIZATION ESTIMATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trott, Cathryn M.; Wayth, Randall B.; Tingay, Steven J., E-mail: cathryn.trott@curtin.edu.au
Precise subtraction of foreground sources is crucial for detecting and estimating 21 cm H I signals from the Epoch of Reionization (EoR). We quantify how imperfect point-source subtraction due to limitations of the measurement data set yields structured residual signal in the data. We use the Cramér-Rao lower bound, as a metric for quantifying the precision with which a parameter may be measured, to estimate the residual signal in a visibility data set due to imperfect point-source subtraction. We then propagate these residuals into two metrics of interest for 21 cm EoR experiments - the angular power spectrum and two-dimensional power spectrum - using a combination of full analytic covariant derivation, analytic variant derivation, and covariant Monte Carlo simulations. This methodology differs from previous work in two ways: (1) it uses information theory to set the point-source position error, rather than assuming a global rms error, and (2) it describes a method for propagating the errors analytically, thereby obtaining the full correlation structure of the power spectra. The methods are applied to two upcoming low-frequency instruments that are proposing to perform statistical EoR experiments: the Murchison Widefield Array and the Precision Array for Probing the Epoch of Reionization. In addition to the actual antenna configurations, we apply the methods to minimally redundant and maximally redundant configurations. We find that for peeling sources above 1 Jy, the amplitude of the residual signal, and its variance, will be smaller than the contribution from thermal noise for the observing parameters proposed for upcoming EoR experiments, and that optimal subtraction of bright point sources will not be a limiting factor for EoR parameter estimation. We then use the formalism to provide an ab initio analytic derivation motivating the 'wedge' feature in the two-dimensional power spectrum, complementing previous discussion in the literature.
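The metric invoked above is the standard Cramér-Rao bound: for an unbiased estimator of source parameters θ from data with likelihood L, the attainable variance is limited by the inverse Fisher information (general form only; the paper's specific Fisher matrix for visibility data is not reproduced here):

```latex
\operatorname{var}\!\left(\hat{\theta}_i\right) \ge \left[\mathbf{F}^{-1}\right]_{ii},
\qquad
F_{ij} = \mathrm{E}\!\left[
  \frac{\partial \ln L}{\partial \theta_i}\,
  \frac{\partial \ln L}{\partial \theta_j}
\right]
```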
ERIC Educational Resources Information Center
Gao, Ruomei
2015-01-01
In a typical chemistry instrumentation laboratory, students learn analytical techniques through a well-developed procedure. Such an approach, however, does not engage students in a creative endeavor. To foster the intrinsic motivation of students' desire to learn, improve their confidence in self-directed learning activities and enhance their…
DOT National Transportation Integrated Search
2009-12-01
The goals of integration should be: supporting domain-oriented data analysis through the use of a knowledge-augmented visual analytics system. In this project, we focus on: providing interactive data exploration for bridge management; ...
ERIC Educational Resources Information Center
Tomasik, Janice Hall; LeCaptain, Dale; Murphy, Sarah; Martin, Mary; Knight, Rachel M.; Harke, Maureen A.; Burke, Ryan; Beck, Kara; Acevedo-Polakovich, I. David
2014-01-01
Motivating students in analytical chemistry can be challenging, in part because of the complexity and breadth of topics involved. Some methods that help encourage students and convey real-world relevancy of the material include incorporating environmental issues, research-based lab experiments, and service learning projects. In this paper, we…
The HiWATE (Health Impacts of long-term exposure to disinfection by-products in drinking WATEr) project is the first systematic analysis that combines the epidemiology on adverse pregnancy outcomes with analytical chemistry and analytical biology in the European Union. This study...
DOT National Transportation Integrated Search
2012-03-01
This report introduces the design and implementation of a Web-based bridge information visual analytics system. This project integrates Internet, multiple databases, remote sensing, and other visualization technologies. The result combines a GIS ...
NASA Astrophysics Data System (ADS)
Yuan, Li-Yun; Xiang, Yu; Lu, Jing; Jiang, Hong-Hua
2015-12-01
Based on the transfer matrix method for a circular cylindrical shell treated with active constrained layer damping (ACLD), combined with the analytical solution of the Helmholtz equation for a point source, a multi-point multipole virtual source simulation method is proposed for the first time for solving the acoustic radiation problem of a submerged ACLD shell. This approach, wherein virtual point sources are assumed to be evenly distributed on the axial line of the cylindrical shell and the sound pressure is written as the sum of a wave function series with undetermined coefficients, is demonstrated to accurately reproduce the radiated acoustic pressure of pulsating and oscillating spheres. The approach is also shown to be accurate for a stiffened cylindrical shell. The number of virtual distributed point sources and the truncation of the wave function series required to approximate the radiated acoustic pressure of an ACLD cylindrical shell are then discussed. Applying this method, the radiated acoustic pressures of a submerged ACLD cylindrical shell with different boundary conditions, thicknesses of the viscoelastic and piezoelectric layers, feedback gains for the piezoelectric layer, and extents of ACLD coverage are discussed in detail. Results show that a thicker piezoelectric layer, a larger velocity gain for the piezoelectric layer, and larger ACLD coverage generally yield a better damping effect for the whole structure, whereas a thicker viscoelastic layer is not always a better treatment for achieving a better acoustic characteristic. Project supported by the National Natural Science Foundation of China (Grant Nos. 11162001, 11502056, and 51105083), the Natural Science Foundation of Guangxi Zhuang Autonomous Region, China (Grant No. 2012GXNSFAA053207), the Doctor Foundation of Guangxi University of Science and Technology, China (Grant No. 12Z09), and the Development Project of the Key Laboratory of Guangxi Zhuang Autonomous Region, China (Grant No. 1404544).
An open source GIS-based tool to integrate the fragmentation mechanism in rockfall propagation
NASA Astrophysics Data System (ADS)
Matas, Gerard; Lantada, Nieves; Gili, Josep A.; Corominas, Jordi
2015-04-01
Rockfalls are frequent instability processes in road cuts, open pit mines and quarries, steep slopes and cliffs. Even though the stability of rock slopes can be determined using analytical approaches, the assessment of large rock cliffs requires simplifying assumptions due to the difficulty of working with a large number of joints and the scatter in both their orientations and strength parameters. The attitude and persistence of joints within the rock mass define the size of kinematically unstable rock volumes. Furthermore, the rock block will eventually split into several fragments during its propagation downhill due to its impact with the ground surface. Knowledge of the size, energy, trajectory… of each block resulting from fragmentation is critical in determining the vulnerability of buildings and protection structures. The objective of this contribution is to present a simple and open source tool to simulate the fragmentation mechanism in rockfall propagation models and in the calculation of impact energies. This tool includes common modes of motion for falling boulders based on the previous literature. The final tool is being implemented in a GIS (Geographic Information System) using open source Python programming. The tool under development is intended to be simple, modular, compatible with any GIS environment, open source, and able to model rockfall phenomena correctly. It could be used in any area susceptible to rockfalls after a preliminary adjustment of the parameters. Once the model parameters are adjusted to a given area, a simulation can be performed to obtain maps of kinetic energy, frequency, stopping density and passing heights. This GIS-based tool and the analysis of the fragmentation laws using data collected from recent rockfalls have been developed within the RockRisk Project (2014-2016). This project is funded by the Spanish Ministerio de Economía y Competitividad and entitled "Rockfalls in cliffs: risk quantification and its prevention" (BIA2013-42582-P).
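As a flavor of how a fragmentation step can plug into such a propagation loop, here is a deliberately generic Python sketch that splits a parent block into fragments drawn from a power-law size distribution on impact; the exponent, fragment count, and trigger are placeholders, not the fragmentation laws calibrated in RockRisk.

```python
import random

def fragment_block(volume, n_frags=4, exponent=2.5):
    """Split a parent block of given volume into n_frags pieces whose
    relative sizes follow a heavy-tailed power law; purely illustrative."""
    weights = [random.paretovariate(exponent - 1) for _ in range(n_frags)]
    total = sum(weights)
    return [volume * w / total for w in weights]

# On a simulated impact, replace the parent block by its fragments and
# continue propagating each fragment downslope with its own energy share.
fragments = fragment_block(volume=12.0)  # parent volume in m^3
```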
Building CHAOS: An Operating System for Livermore Linux Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garlick, J E; Dunlap, C M
2003-02-21
The Livermore Computing (LC) Linux Integration and Development Project (the Linux Project) produces and supports the Clustered High Availability Operating System (CHAOS), a cluster operating environment based on Red Hat Linux. Each CHAOS release begins with a set of requirements and ends with a formally tested, packaged, and documented release suitable for use on LC's production Linux clusters. One characteristic of CHAOS is that component software packages come from different sources under varying degrees of project control. Some are developed by the Linux Project, some are developed by other LC projects, some are external open source projects, and some are commercial software packages. A challenge to the Linux Project is to adhere to release schedules and testing disciplines in a diverse, highly decentralized development environment. Communication channels are maintained for externally developed packages in order to obtain support, influence development decisions, and coordinate/understand release schedules. The Linux Project embraces open source by releasing locally developed packages under open source license, by collaborating with open source projects where mutually beneficial, and by preferring open source over proprietary software. Project members generally use open source development tools. The Linux Project requires system administrators and developers to work together to resolve problems that arise in production. This tight coupling of production and development is a key strategy for making a product that directly addresses LC's production requirements. It is another challenge to balance support and development activities in such a way that one does not overwhelm the other.
The GeoDataPortal: A Standards-based Environmental Modeling Data Access and Manipulation Toolkit
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Kunicki, T.; Booth, N.; Suftin, I.; Zoerb, R.; Walker, J.
2010-12-01
Environmental modelers from fields of study such as climatology, hydrology, geology, and ecology rely on many data sources and processing methods that are common across these disciplines. Interest in inter-disciplinary, loosely coupled modeling and data sharing is increasing among scientists from the USGS, other agencies, and academia. For example, hydrologic modelers need downscaled climate change scenarios and land cover data summarized for the watersheds they are modeling. Subsequently, ecological modelers are interested in soil moisture information for a particular habitat type as predicted by the hydrologic modeler. The USGS Center for Integrated Data Analytics Geo Data Portal (GDP) project seeks to facilitate this loose model coupling and data sharing through broadly applicable open-source web processing services. These services simplify and streamline the time-consuming and resource-intensive tasks that are barriers to inter-disciplinary collaboration. The GDP framework includes a catalog describing projects, models, data, processes, and how they relate. Using newly introduced data, or sources already known to the catalog, the GDP facilitates access to subsets and common derivatives of data in numerous formats on disparate web servers. The GDP performs many of the critical functions needed to summarize data sources into modeling units regardless of scale or volume. A user can specify their analysis zones or modeling units as an Open Geospatial Consortium (OGC) standard Web Feature Service (WFS). Utilities to cache Shapefiles and other common GIS input formats have been developed to aid in making the geometry available for processing via WFS. Dataset access in the GDP relies primarily on the Unidata NetCDF-Java library’s common data model. Data transfer relies on methods provided by Unidata’s THREDDS (Thematic Real-time Environmental Distributed Data Services) Data Server (TDS). TDS services of interest include the Open-source Project for a Network Data Access Protocol (OPeNDAP) standard for gridded time series, the OGC’s Web Coverage Service for high-density static gridded data, and Unidata’s CDM-remote for point time series. OGC WFS and Sensor Observation Service (SOS) are being explored as mechanisms to serve and access static or time series data attributed to vector geometry. A set of standardized XML-based output formats allows easy transformation into a wide variety of “model-ready” formats. Interested users will have the option of submitting custom transformations to the GDP or transforming the XML output as a post-process. The GDP project aims to support simple, rapid development of thin user interfaces (like web portals) to commonly needed environmental modeling-related data access and manipulation tools. Standalone, service-oriented components of the GDP framework provide the metadata cataloging, data subset access, and spatial-statistics calculations needed to support interdisciplinary environmental modeling.
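From a client's point of view, the OPeNDAP access pattern mentioned above looks like the following sketch, using the netCDF4 Python library; the endpoint URL and variable name are invented for illustration.

```python
from netCDF4 import Dataset  # netCDF4-python can open OPeNDAP URLs directly

# Hypothetical THREDDS/OPeNDAP endpoint and variable name
url = "https://example.gov/thredds/dodsC/gridded/precip.nc"
ds = Dataset(url)

# Only the requested index ranges travel over the network, not the file:
subset = ds.variables["precipitation"][0:365, 100:120, 200:240]
ds.close()
```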
A multiscale filter for noise reduction of low-dose cone beam projections.
Yao, Weiguang; Farr, Jonathan B
2015-08-21
The Poisson or compound Poisson process governs the randomness of photon fluence in cone beam computed tomography (CBCT) imaging systems. The probability density function depends on the mean (noiseless) fluence at a given detector. This dependence indicates the natural requirement of multiscale filters to smooth noise while preserving structures of the imaged object on the low-dose cone beam projection. In this work, we used a Gaussian filter, exp(-x^2/(2σ_f^2)), as the multiscale filter to de-noise the low-dose cone beam projections. We analytically obtained the expression for σ_f, which represents the scale of the filter, by minimizing the local noise-to-signal ratio. We analytically derived the variance of the residual noise from the Poisson or compound Poisson processes after Gaussian filtering. From the derived analytical form of the variance of the residual noise, the optimal σ_f^2 is proved to be proportional to the noiseless fluence and modulated by local structure strength, expressed as the linear fitting error of the structure. A strategy was used to obtain a reliable linear fitting error: smoothing the projection along the longitudinal direction to calculate the linear fitting error along the lateral direction, and vice versa. The performance of our multiscale filter was examined on low-dose cone beam projections of a Catphan phantom and a head-and-neck patient. After applying the filter to the Catphan phantom projections scanned with a pulse time of 4 ms, the number of visible line pairs was similar to that scanned with 16 ms, and the contrast-to-noise ratio of the inserts was on average about 64% higher than that scanned with 16 ms. For the simulated head-and-neck patient projections with a pulse time of 4 ms, the visibility of soft tissue structures in the patient was comparable to that scanned with 20 ms. The image processing took less than 0.5 s per projection with 1024 × 768 pixels.
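A spatially varying filter scale of this kind can be approximated by blending two fixed-scale Gaussian filters according to local structure strength; the sketch below is a simplification of that idea (not the authors' exact σ_f optimization) using NumPy/SciPy.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def adaptive_smooth(proj, sigma_weak=0.8, sigma_strong=3.0):
    """Blend weak and strong Gaussian smoothing of a projection, giving
    less smoothing where local structure (gradient) is strong.
    A crude stand-in for a spatially varying sigma_f."""
    weak = gaussian_filter(proj, sigma_weak)
    strong = gaussian_filter(proj, sigma_strong)
    gx, gy = np.gradient(gaussian_filter(proj, 1.0))
    edge = np.hypot(gx, gy)
    w = edge / (edge.max() + 1e-12)      # ~1 near edges, ~0 in flat regions
    return w * weak + (1.0 - w) * strong
```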
The Exercise: An Exercise Generator Tool for the SOURCe Project
ERIC Educational Resources Information Center
Kakoyianni-Doa, Fryni; Tziafa, Eleni; Naskos, Athanasios
2016-01-01
The Exercise, an Exercise generator in the SOURCe project, is a tool that complements the properties and functionalities of the SOURCe project, which includes the search engine for the Searchable Online French-Greek parallel corpus for the UniveRsity of Cyprus (SOURCe) (Kakoyianni-Doa & Tziafa, 2013), the PENCIL (an alignment tool)…
SolTrace | Concentrating Solar Power | NREL
SolTrace is available as an NREL packaged distribution or from source code at the SolTrace open source project website. The code uses a Monte Carlo ray-tracing methodology. With the release of the SolTrace open source project, the software has adopted ...
The Independent Technical Analysis Process Final Report 2006-2007.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duberstein, Corey; Ham, Kenneth; Dauble, Dennis
2007-03-01
The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities. The Independent Technical Analysis Process (ITAP) was created to provide non-routine analysis for fish and wildlife agencies and tribes in particular and the public in general on matters related to juvenile and adult salmon and steelhead passage through the mainstem hydrosystem. The process was designed to maintain the independence of analysts and reviewers from parties requesting analyses, to avoid potential bias in technical products. The objectives identified for this project were to administer a rigorous, transparent process to deliver unbiased technical assistance necessary to coordinate recommendations for storage reservoir and river operations that avoid potential conflicts between anadromous and resident fish. Seven work elements, designated by numbered categories in the Pisces project tracking system, were created to define and accomplish project goals as follows: (1) 118 Coordination - Coordinate technical analysis and review process: (a) Retain expertise for analyst/reviewer roles. (b) Draft research directives. (c) Send directive to the analyst. (d) Coordinate two independent reviews of the draft report. (e) Ensure reviewer comments are addressed within the final report. (2) 162 Analyze/Interpret Data - Implement the independent aspects of the project. (3) 122 Provide Technical Review - Implement the review process for the analysts. (4) 132 Produce Annual Report - FY06 annual progress report with Pisces. (5) 161 Disseminate Raw/Summary Data and Results - Post technical products on the ITAP web site. (6) 185 Produce Pisces Status Report - Provide periodic status reports to BPA. (7) 119 Manage and Administer Projects - project/contract administration.
Distribution factors for construction loads and girder capacity equations [project summary].
DOT National Transportation Integrated Search
2017-03-01
This project focused on the use of Florida I-beams (FIBs) in bridge construction. University of Florida researchers used analytical models and finite element analysis to update equations used in the design of bridges using FIBs. They were particularl...
Sampling and Analysis Plan - Guidance and Template v.4 - General Projects - 04/2014
This Sampling and Analysis Plan (SAP) guidance and template is intended to assist organizations in documenting the procedural and analytical requirements for one-time, or time-limited, projects involving the collection of water, soil, sediment, or other
NASA Astrophysics Data System (ADS)
Ortega, R.; Gutierrez, E.; Carciumaru, D. D.; Huesca-Perez, E.
2017-12-01
We present a method to compute the conditional and unconditional probability density function (PDF) of the finite fault distance distribution (FFDD). Two cases are described: lines and areas. The case of lines has a simple analytical solution, while in the case of areas, the geometrical probability of a fault based on the strike, dip, and fault segment vertices is obtained using the projection of spheres onto a piecewise rectangular surface. The cumulative distribution is computed by measuring the projection of a sphere of radius r in an effective area, using an algorithm that estimates the area of a circle within a rectangle. In addition, we introduce the finite fault distance metric: the distance at which the maximum stress release occurs within the fault plane, generating the peak ground motion. The appropriate ground motion prediction equations (GMPEs) can then be applied for probabilistic seismic hazard analysis (PSHA). The conditional probability of distance given magnitude is also presented using different scaling laws. A simple model with the centroid fixed at the geometrical mean is discussed; in this model, hazard is reduced at the edges because the effective size is reduced. There is currently a trend of using extended-source distances in PSHA; however, it is not possible to separate the fault geometry from the GMPE. With this new approach, it is possible to add fault rupture models, separating geometrical and propagation effects.
On precisely modelling surface deformation due to interacting magma chambers and dykes
NASA Astrophysics Data System (ADS)
Pascal, Karen; Neuberg, Jurgen; Rivalta, Eleonora
2014-01-01
Combined data sets of InSAR and GPS allow us to observe surface deformation in volcanic settings. However, at the vast majority of volcanoes, a detailed 3-D structure that could guide the modelling of deformation sources is not available, due to the lack of tomography studies, for example. Therefore, volcano ground deformation due to magma movement in the subsurface is commonly modelled using simple point (Mogi) or dislocation (Okada) sources, embedded in a homogeneous, isotropic and elastic half-space. When data sets are too complex to be explained by a single deformation source, the magmatic system is often represented by a combination of these sources and their displacements fields are simply summed. By doing so, the assumption of homogeneity in the half-space is violated and the resulting interaction between sources is neglected. We have quantified the errors of such a simplification and investigated the limits in which the combination of analytical sources is justified. We have calculated the vertical and horizontal displacements for analytical models with adjacent deformation sources and have tested them against the solutions of corresponding 3-D finite element models, which account for the interaction between sources. We have tested various double-source configurations with either two spherical sources representing magma chambers, or a magma chamber and an adjacent dyke, modelled by a rectangular tensile dislocation or pressurized crack. For a tensile Okada source (representing an opening dyke) aligned or superposed to a Mogi source (magma chamber), we find the discrepancies with the numerical models to be insignificant (<5 per cent) independently of the source separation. However, if a Mogi source is placed side by side to an Okada source (in the strike-perpendicular direction), we find the discrepancies to become significant for a source separation less than four times the radius of the magma chamber. For horizontally or vertically aligned pressurized sources, the discrepancies are up to 20 per cent, which translates into surprisingly large errors when inverting deformation data for source parameters such as depth and volume change. Beyond 8 radii however, we demonstrate that the summation of analytical sources represents adjacent magma chambers correctly.
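For reference, the point (Mogi) source that anchors these comparisons has closed-form surface displacements in a homogeneous elastic half-space (standard result; ν is the Poisson ratio, d the source depth, ΔV the volume change, r the radial distance from the source axis):

```latex
u_z(r) = \frac{(1-\nu)\,\Delta V}{\pi}\,\frac{d}{\left(r^{2}+d^{2}\right)^{3/2}},
\qquad
u_r(r) = \frac{(1-\nu)\,\Delta V}{\pi}\,\frac{r}{\left(r^{2}+d^{2}\right)^{3/2}}
```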
AKM in Open Source Communities
NASA Astrophysics Data System (ADS)
Stamelos, Ioannis; Kakarontzas, George
Previous chapters in this book have dealt with Architecture Knowledge Management in traditional Closed Source Software (CSS) projects. This chapter will attempt to examine the ways that knowledge is shared among participants in Free/Libre Open Source Software (FLOSS) projects and how architectural knowledge is managed compared with CSS. FLOSS projects are organized and developed in a fundamentally different way than CSS projects; they simply do not develop code as CSS projects do. As a consequence, their knowledge management mechanisms are also based on different concepts and tools.
ERIC Educational Resources Information Center
Vlas, Radu Eduard
2012-01-01
Open source projects do have requirements; they are, however, mostly informal text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…
Orejas, Jaime; Pfeuffer, Kevin P; Ray, Steven J; Pisonero, Jorge; Sanz-Medel, Alfredo; Hieftje, Gary M
2014-11-01
Ambient desorption/ionization (ADI) sources coupled to mass spectrometry (MS) offer outstanding analytical features: direct analysis of real samples without sample pretreatment, combined with the selectivity and sensitivity of MS. Since ADI sources typically work in the open atmosphere, ambient conditions can affect the desorption and ionization processes. Here, the effects of internal source parameters and ambient humidity on the ionization processes of the flowing atmospheric pressure afterglow (FAPA) source are investigated. The interaction of reagent ions with a range of analytes is studied in terms of sensitivity and of the processes that occur in the ionization reactions. The results show that internal parameters that lead to higher gas temperatures afford higher sensitivities, although fragmentation is also affected. In the case of humidity, only extremely dry conditions led to higher sensitivities, while fragmentation remained unaffected.
100-F Target Analyte List Development for Soil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ovink, R.
2012-09-18
This report documents the process used to identify source area target analytes in support of the 100-F Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).
100-K Target Analyte List Development for Soil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ovink, R.
2012-09-18
This report documents the process used to identify source area target analytes in support of the 100-K Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).
Analytical technique characterizes all trace contaminants in water
NASA Technical Reports Server (NTRS)
Foster, J. N.; Lysyj, I.; Nelson, K. H.
1967-01-01
A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to investigating the sources of water pollution.
An object oriented fully 3D tomography visual toolkit.
Agostinelli, S; Paoli, G
2001-04-01
In this paper we present a modern object-oriented Component Object Model (COM) C++ toolkit dedicated to fully 3D cone-beam tomography. The toolkit allows the display and visual manipulation of analytical phantoms, projection sets and volumetric data through a standard Windows graphical user interface. Data input/output is performed using proprietary file formats, but import/export of industry-standard file formats, including raw binary, Windows bitmap and AVI, ACR/NEMA DICOM 3 and NCSA HDF, is available. At the time of writing, built-in data manipulators include a basic phantom ray-tracer and a Matrox Genesis frame-grabbing facility. A COM plug-in interface is provided for user-defined custom backprojector algorithms: a simple Feldkamp ActiveX control, including source code, is provided as an example; our fast Feldkamp plug-in is also available.
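A rough Python analogue of the plug-in idea, hedged: the real toolkit exposes a COM interface in C++, and a full Feldkamp implementation would also weight and filter the cone-beam projections. Here an abstract base class plus a naive unfiltered parallel-beam backprojector stand in for a user-supplied plug-in; all names are hypothetical.

```python
from abc import ABC, abstractmethod
import numpy as np

class Backprojector(ABC):
    """Illustrative analogue of the toolkit's plug-in interface."""
    @abstractmethod
    def backproject(self, projections: np.ndarray,
                    angles: np.ndarray) -> np.ndarray: ...

class SimpleParallelBackprojector(Backprojector):
    """Unfiltered parallel-beam backprojection onto a 2-D slice,
    standing in for a user-supplied Feldkamp-style plug-in."""
    def backproject(self, projections, angles):
        n = projections.shape[1]
        c = (n - 1) / 2.0
        yy, xx = np.mgrid[0:n, 0:n] - c
        recon = np.zeros((n, n))
        for proj, theta in zip(projections, angles):
            # Detector coordinate of each reconstruction pixel.
            t = xx * np.cos(theta) + yy * np.sin(theta) + c
            idx = np.clip(np.round(t).astype(int), 0, n - 1)
            recon += proj[idx]
        return recon * np.pi / len(angles)
```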
Evaluating the causes of photovoltaics cost reduction: Why is PV different?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trancik, Jessika; McNerney, James; Kavlak, Goksin
The goals of this project were to quantify sources of cost reduction in photovoltaics (PV), improve theories of technological evolution, develop new analytical methods, and formulate guidelines for continued cost reduction in photovoltaics. A number of explanations have been suggested for why photovoltaics have come down in cost rapidly over time, including increased production rates, significant R&D expenditures, heavy patenting activity, decreasing material and input costs, scale economies, reduced plant construction costs, and higher conversion efficiencies. We classified these proposed causes into low-level factors and high-level drivers. Low-level factors include technical characteristics, such as module efficiency or wafer area, which are easily posed in terms of variables of a cost equation. High-level drivers include scale economies, research and development (R&D), and learning-by-doing.
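To make "posed in terms of variables of a cost equation" concrete, a toy example (not the project's model): module cost per watt falls directly out of area cost and efficiency, which is what makes efficiency a low-level factor.

```python
def module_cost_per_watt(cost_per_m2, efficiency):
    """Toy cost equation: $/W = ($/m^2) / (W/m^2). Under standard test
    conditions a module delivers 1000 W/m^2 * efficiency, so higher
    efficiency (a low-level factor) directly lowers $/W.
    Illustrative only, not the paper's cost model."""
    return cost_per_m2 / (1000.0 * efficiency)

print(module_cost_per_watt(cost_per_m2=100.0, efficiency=0.20))  # 0.5 $/W
```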
Intersubjectivity and the creation of meaning in the analytic process.
Maier, Christian
2014-11-01
By means of a clinical illustration, the author describes how the intersubjective exchanges involved in an analytic process facilitate the representation of affects and memories which have been buried in the unconscious or indeed have never been available to consciousness. As a result of projective identificatory processes in the analytic relationship, in this example the analyst falls into a situation of helplessness which connects with his own traumatic experiences. He then enters a formal regression of the ego and responds with a quasi-hallucinatory reaction: an internal image which enables him to keep the analytic process on track and, later on, to construct an early traumatic experience of the analysand. © 2014, The Society of Analytical Psychology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, R.S.; Kong, E.J.; Bahner, M.A.
The paper discusses several projects to measure hydrocarbon emissions associated with the manufacture of fiberglass-reinforced plastics. The main purpose of the projects was to evaluate pollution prevention techniques to reduce emissions by altering raw materials, application equipment, and operator technique. Analytical techniques were developed to reduce the cost of these emission measurements. Emissions from a small test mold in a temporary total enclosure (TTE) correlated with emissions from full-size production molds in a separate TTE. Gravimetric mass balance measurements inside the TTE generally agreed to within ±30% with total hydrocarbon (THC) measurements in the TTE exhaust duct.
NASA Astrophysics Data System (ADS)
Zhong, Xian-Qiong; Zhang, Xiao-Xia; Du, Xian-Tong; Liu, Yong; Cheng, Ke
2015-10-01
The approximate analytical frequency chirps and the critical distances for cross-phase-modulation-induced optical wave breaking (OWB) of initial hyperbolic-secant optical pulses propagating in optical fibers with quintic nonlinearity (QN) are presented. The pulse evolutions in terms of the frequency chirps, shapes and spectra are numerically calculated in the normal dispersion regime. The results reveal that, depending on the QN parameters, traditional OWB, solitons, or soliton pulse trains may occur. The approximate analytical critical distances are found to be in good agreement with the numerical ones only for traditional OWB, whereas the approximate analytical frequency chirps accord well with the numerical ones at the initial evolution stages of the pulses. Supported by the Postdoctoral Fund of China under Grant No. 2011M501402, the Key Project of the Chinese Ministry of Education under Grant No. 210186, the Major Project of Natural Science Supported by the Educational Department of Sichuan Province under Grant No. 13ZA0081, the Key Project of the National Natural Science Foundation of China under Grant No. 61435010, and the National Natural Science Foundation of China under Grant No. 61275039.
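A compact way to reproduce such pulse evolutions numerically is split-step Fourier propagation of the nonlinear Schrödinger equation with an added quintic term. The sketch below is illustrative only: it propagates a single hyperbolic-secant pulse (self-phase modulation, whereas the paper's OWB is induced by a cross-phase-modulating copropagating pulse), and the signs, normalization and parameter values are assumptions.

```python
import numpy as np

def propagate_nlse(A0, dt, dz, nz, beta2=1.0, gamma=1.0, gamma5=0.1):
    """Split-step Fourier propagation of an NLSE with a quintic term,
        dA/dz = -i*beta2/2 * d^2A/dt^2 + i*(gamma*|A|^2 + gamma5*|A|^4)*A,
    with beta2 > 0 for the normal-dispersion regime studied."""
    n = A0.size
    w = 2 * np.pi * np.fft.fftfreq(n, d=dt)
    lin = np.exp(-0.5j * beta2 * w**2 * dz)   # linear (dispersion) step
    A = A0.astype(complex)
    for _ in range(nz):
        A = np.fft.ifft(lin * np.fft.fft(A))
        A *= np.exp(1j * (gamma * np.abs(A)**2
                          + gamma5 * np.abs(A)**4) * dz)  # nonlinear step
    return A

t = np.linspace(-20, 20, 2048)
A = propagate_nlse(1.0 / np.cosh(t), dt=t[1] - t[0], dz=1e-3, nz=2000)
# Instantaneous frequency offset (chirp), up to sign convention:
chirp = -np.gradient(np.unwrap(np.angle(A)), t[1] - t[0])
```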
Advanced Design Features of APR1400 and Realization in Shin Kori Construction Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
OH, S.J.; Park, K.C.; Kim, H.G.
2006-07-01
APR1400 adopted several advanced design features. To ensure their proper operation as part of the Shin Kori 3 and 4 project, both experimental and analytical work is continuing. In this paper, work on the advanced design features related to enhanced safety is examined. The APR1400 safety injection system consists of four independent trains, which include four safety injection pumps and tanks. A passive flow-regulating device called a fluidic device is installed in the safety injection tanks. Separate-effects tests, including full-scale fluidic device tests, have been conducted. Integral system tests are in progress. Combining this work with the analytical work using RELAP5/Mod3 should ensure the proper operation of the new safety injection systems. To mitigate severe accidents, a hydrogen mitigation system using PARs and igniters is adopted. Also, an active injection system and a streamlined insulation design are adopted to enhance the in-vessel retention capability under the external-cooling-of-RPV strategy. Analytic work with supporting experiments is performed. We are confident that this preparatory work will help the successful adoption of these advanced design features in the Shin Kori project. (authors)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillespie, B.M.; Stromatt, R.W.; Ross, G.A.
This data package contains the results obtained by Pacific Northwest Laboratory (PNL) staff in the characterization of samples for the 101-SY Hydrogen Safety Project. The samples were submitted for analysis by Westinghouse Hanford Company (WHC) under the Technical Project Plan (TPP) 17667 and the Quality Assurance Plan MCS-027. They came from a core taken during Window "C" after the May 1991 gas release event. The analytical procedures required for analysis were defined in the Test Instructions (TI) prepared by the PNL 101-SY Analytical Chemistry Laboratory (ACL) Project Management Office in accordance with the TPP and the QA Plan. The requested analysis for these samples was volatile organic analysis. The quality control (QC) requirements for each sample are defined in the Test Instructions for each sample. The QC requirements outlined in the procedures and requested in the WHC statement of work were followed.
The Land Remediation and Pollution Control Division (LRPCD) QA Manager strives to assist LRPCD researchers in developing functional planning documents for their research projects. As part of the planning process, several pieces of information are needed, including information re...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kennedy, Robert T.
Sparked by the Human Genome Project, biological and biomedical research has become an information science. Information tools are now being generated for proteins, cell modeling, and genomics. The opportunity for analytical chemistry in this new environment is profound. New analytical techniques that can provide the information on genes, SNPs, proteins, protein modifications, cells, and cell chemistry are required. In this symposium, we brought together both informatics experts and leading analytical chemists to discuss this interface. Over 200 people attended this highly successful symposium.
Narrow band noise response of a Belleville spring resonator.
Lyon, Richard H
2013-09-01
This study of nonlinear dynamics includes (i) an identification of quasi-steady states of response using equivalent linearization, (ii) the temporal simulation of the system using Heun's time-step procedure on time-domain analytic signals, and (iii) a laboratory experiment. An attempt has been made to select material and measurement parameters so that nearly the same systems are used and analyzed for all three parts of the study. This study illustrates important features of nonlinear response to narrow-band excitation: (a) states of response that the system can acquire, with transitions of the system between those states; (b) the interaction between the noise source and the vibrating load, in which the source transmits energy to or draws energy from the load as transitions occur; (c) the lag or lead of the system response relative to the source as transitions occur, which causes the average frequencies of source and response to differ; and (d) the determination of the state of response (mass or stiffness controlled) by observation of the instantaneous phase of the influence function. These analyses take advantage of the use of time-domain analytic signals, which have a complementary role to functions that are analytic in the frequency domain.
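Part (ii) is easy to sketch. Heun's method is a two-stage predictor-corrector; below it drives a cubic (Duffing-type) oscillator, standing in for the Belleville spring's nonlinear stiffness, with a crude narrow-band drive, and the instantaneous phase is recovered from the time-domain analytic signal via a Hilbert transform. All parameters are illustrative, not the paper's.

```python
import numpy as np
from scipy.signal import hilbert

def heun_duffing(T=100.0, dt=1e-2, wn=1.0, zeta=0.05, eps=0.5, seed=0):
    """Heun (predictor-corrector) integration of
    x'' + 2*zeta*wn*x' + wn^2*x + eps*x^3 = f(t),
    with f a crude narrow-band drive (smoothed noise on a carrier)."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    t = np.arange(n) * dt
    env = np.convolve(rng.standard_normal(n), np.ones(500) / 500.0, "same")
    f = env * np.cos(wn * t)

    def rhs(s, fi):
        x, v = s
        return np.array([v, fi - 2*zeta*wn*v - wn**2*x - eps*x**3])

    s = np.zeros(2)
    x = np.empty(n)
    x[0] = 0.0
    for i in range(n - 1):
        k1 = rhs(s, f[i])               # predictor slope at t
        k2 = rhs(s + dt * k1, f[i + 1]) # corrector slope at t + dt
        s = s + 0.5 * dt * (k1 + k2)
        x[i + 1] = s[0]
    phase = np.unwrap(np.angle(hilbert(x)))  # time-domain analytic signal
    return t, x, phase
```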
Line-source excitation of realistic conformal metasurface cloaks
NASA Astrophysics Data System (ADS)
Padooru, Yashwanth R.; Yakovlev, Alexander B.; Chen, Pai-Yen; Alù, Andrea
2012-11-01
Following our recently introduced analytical tools to model and design conformal mantle cloaks based on metasurfaces [Padooru et al., J. Appl. Phys. 112, 034907 (2012)], we investigate their performance and physical properties when excited by an electric line source placed in their close proximity. We consider metasurfaces formed by 2-D arrays of slotted (meshes and Jerusalem cross slots) and printed (patches and Jerusalem crosses) sub-wavelength elements. The electromagnetic scattering analysis is carried out using a rigorous analytical model, which utilizes the two-sided impedance boundary conditions at the interface of the sub-wavelength elements. It is shown that the homogenized grid-impedance expressions, originally derived for planar arrays of sub-wavelength elements and plane-wave excitation, may be successfully used to model and tailor the surface reactance of cylindrical conformal mantle cloaks illuminated by near-field sources. Our closed-form analytical results are in good agreement with full-wave numerical simulations, up to sub-wavelength distances from the metasurface, confirming that mantle cloaks may be very effective to suppress the scattering of moderately sized objects, independent of the type of excitation and point of observation. We also discuss the dual functionality of these metasurfaces to boost radiation efficiency and directivity from confined near-field sources.
The importance of quality control in validating concentrations ...
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method-comparison analytes that were determined by two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emer
ERIC Educational Resources Information Center
Scott, Patrick B., Ed.
1991-01-01
REDUC is a cooperative network of some 23 associated centers in 17 Latin American and Caribbean countries. The REDUC coordinating center is located in Santiago, Chile. REDUC produces a bibliographic database containing analytical summaries (approximately 800 items annually) of the most important research studies and project descriptions in the…
The overall goal of this task is to help reduce the uncertainties in the assessment of environmental health and human exposure by better characterizing hazardous wastes through cost-effective analytical methods. Research projects are directed towards the applied development and ...
ERIC Educational Resources Information Center
Rouchouse, Marine; Faysse, Nicolas; De Romemont, Aurelle; Moumouni, Ismail; Faure, Guy
2015-01-01
Purpose: Approaches to build farmers' analytical capacities are said to trigger wide-ranging changes. This article reports on the communication process between participants and non-participants in one such approach, related to the technical and management skills learned by participants and the changes these participants subsequently made, and the…
Method and apparatus for simultaneous spectroelectrochemical analysis
Chatterjee, Sayandev; Bryan, Samuel A; Schroll, Cynthia A; Heineman, William R
2013-11-19
An apparatus and method of simultaneous spectroelectrochemical analysis is disclosed. A transparent surface is provided. An analyte solution on the transparent surface is contacted with a working electrode and at least one other electrode. Light from a light source is focused on either a surface of the working electrode or the analyte solution. The light reflected from either the surface of the working electrode or the analyte solution is detected. The potential of the working electrode is adjusted, and spectroscopic changes of the analyte solution that occur with changes in thermodynamic potentials are monitored.
Incomplete Intelligence: Is the Information Sharing Environment an Effective Platform?
2012-09-01
…Open Source Intelligence (OSINT), from public websites, media sources, and other unclassified events and reports. Although some of these sources do not have a direct…
WELLHEAD ANALYTIC ELEMENT MODEL FOR WINDOWS
WhAEM2000 (wellhead analytic element model for Win 98/00/NT/XP) is a public domain, ground-water flow model designed to facilitate capture zone delineation and protection area mapping in support of the State's and Tribe's Wellhead Protection Programs (WHPP) and Source Water Asses...
Werber, D; Bernard, H
2014-02-27
Thousands of infectious food-borne disease outbreaks (FBDO) are reported annually to the European Food Safety Authority within the framework of the zoonoses Directive (2003/99/EC). Most recognised FBDO occur locally following point-source exposure, but only a few are investigated using analytical epidemiological studies. In Germany, and probably also in other countries of the European Union, this seems to be particularly true for those investigated by local health authorities. Analytical studies, usually cohort studies or case–control studies, are a powerful tool to identify suspect food vehicles. Therefore, from a public health and food safety perspective, their more frequent use is highly desirable. We have developed a small toolbox consisting of a strategic concept and a simple software tool for data entry and analysis, with the objective of increasing the use of analytical studies in the investigation of local point-source FBDO in Germany.
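The core analysis such a tool must support is small. A sketch of the cohort-study arithmetic (relative risk for one exposure, with a Katz log-normal confidence interval); this is generic epidemiology, not the toolbox's actual code, and the picnic numbers are made up.

```python
import numpy as np

def cohort_2x2(exposed_ill, exposed_well, unexposed_ill, unexposed_well):
    """Relative risk with 95% CI from a 2x2 table, the core computation
    of a point-source outbreak cohort study (illustrative helper)."""
    risk_e = exposed_ill / (exposed_ill + exposed_well)
    risk_u = unexposed_ill / (unexposed_ill + unexposed_well)
    rr = risk_e / risk_u
    # Katz log-normal approximation for the CI of the risk ratio.
    se = np.sqrt(1/exposed_ill - 1/(exposed_ill + exposed_well)
                 + 1/unexposed_ill - 1/(unexposed_ill + unexposed_well))
    return rr, (rr * np.exp(-1.96 * se), rr * np.exp(1.96 * se))

# Did the potato salad cause illness at a (hypothetical) picnic?
print(cohort_2x2(exposed_ill=30, exposed_well=20,
                 unexposed_ill=5, unexposed_well=45))
```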
NASA Astrophysics Data System (ADS)
Ferguson, Henry
With the end of the Herschel mission and no immediate successor at far-infrared wavelengths, it is imperative to extract as much information as possible from the existing data. The difference between the theoretical noise limit and the confusion limit suggests that significant improvements can be made with a more sophisticated treatment of source confusion. This is possible because we have a lot of information about the Herschel deep fields from other wavelengths. The project will use existing, already-reduced data from Herschel's deepest observations, which targeted the CANDELS fields. These data have a wealth of observations from Hubble, Spitzer, Chandra and many other telescopes. The main work will be to develop and employ a new Bayesian technique that incorporates spectral-energy-distribution priors to constrain the range of likely far-infrared fluxes for each source that is detected by Hubble. The far-IR images are then segmented, and the regions that are likely to suffer the most confusion are simultaneously fit, using the (broad) constraints on the likely far-IR fluxes as a Bayesian prior. The first pass of photometry will yield reliable photometry for sources at least a factor of two fainter than existing catalogs. Subsequent passes can yield full probability distributions for the ensemble far-IR SEDs of much fainter sources (overcoming some of the limitations of stacking in image space). We will use the improved and deeper FIR photometry to address two "crises" in reconciling galaxy evolution models with high-z galaxy observations: (1) the surprisingly young ages of most bright Lyman-break galaxies at redshift z=3, and (2) the surprisingly high star-formation rates and dust masses of high-redshift sub-mm and FIR-selected galaxies. The former could potentially be explained if many of the descendants of UV-bright galaxies at z=4 have too much dust by z=3 to be included in Lyman-break samples. The latter problem could be resolved if the fluxes of many FIR and sub-mm selected galaxies are affected by blending. The project will employ state-of-the-art semi-analytical models for galaxy evolution, both for guidance in developing flexible Bayesian priors and for guidance on the interpretation of the results. As part of the work we plan to further test and improve the treatment of dust in these models.
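The heart of such a technique can be sketched in a few lines when the noise and priors are taken as Gaussian: each Hubble-detected source contributes one column of PSF values, the SED fit supplies a prior flux and width, and the MAP fluxes of a confused region follow from one linear solve. A minimal illustration under those assumptions, not the project's pipeline; all data below are synthetic.

```python
import numpy as np

def map_fluxes(P, d, sigma, prior_mu, prior_sd):
    """MAP fluxes for confused sources: data d = P @ f + noise, with a
    Gaussian prior on each source flux from its SED fit.
    P[i, j] = PSF of source j evaluated at pixel i."""
    Cinv = np.diag(1.0 / np.asarray(prior_sd) ** 2)
    A = P.T @ P / sigma**2 + Cinv
    b = P.T @ d / sigma**2 + Cinv @ prior_mu
    f_map = np.linalg.solve(A, b)
    f_cov = np.linalg.inv(A)   # posterior covariance (Gaussian case)
    return f_map, f_cov

rng = np.random.default_rng(0)
P = np.abs(rng.normal(size=(400, 12)))        # 400 pixels, 12 sources
f_true = rng.lognormal(0.0, 0.5, 12)
d = P @ f_true + rng.normal(0.0, 0.5, 400)
f_map, f_cov = map_fluxes(P, d, 0.5, prior_mu=1.1 * f_true,
                          prior_sd=np.full(12, 0.5))
```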
Curating and Integrating Data from Multiple Sources to Support Healthcare Analytics.
Ng, Kenney; Kakkanatt, Chris; Benigno, Michael; Thompson, Clay; Jackson, Margaret; Cahan, Amos; Zhu, Xinxin; Zhang, Ping; Huang, Paul
2015-01-01
As the volume and variety of healthcare related data continues to grow, the analysis and use of this data will increasingly depend on the ability to appropriately collect, curate and integrate disparate data from many different sources. We describe our approach to and highlight our experiences with the development of a robust data collection, curation and integration infrastructure that supports healthcare analytics. This system has been successfully applied to the processing of a variety of data types including clinical data from electronic health records and observational studies, genomic data, microbiomic data, self-reported data from surveys and self-tracked data from wearable devices from over 600 subjects. The curated data is currently being used to support healthcare analytic applications such as data visualization, patient stratification and predictive modeling.
NASA Astrophysics Data System (ADS)
Barbera, Roberto; Bruno, Riccardo; Calanducci, Antonio; Messina, Antonio; Pappalardo, Marco; Passaro, Gianluca
2013-04-01
The EarthServer project (www.earthserver.eu), funded by the European Commission under its Seventh Framework Program, aims at establishing open access and ad-hoc analytics on extreme-size Earth Science data, based on and extending leading-edge Array Database technology. The core idea is to use database query languages as the client/server interface to achieve barrier-free "mix & match" access to multi-source, any-size, multi-dimensional space-time data, in short: "Big Earth Data Analytics", based on the open standards of the Open Geospatial Consortium Web Coverage Processing Service (OGC WCPS) and the W3C XQuery. EarthServer combines both, thereby achieving tight data/metadata integration. Further, the rasdaman Array Database System (www.rasdaman.com) is extended with further space-time coverage data types. On the server side, highly effective optimizations, such as parallel and distributed query processing, ensure scalability to Exabyte volumes. Six Lighthouse Applications are being established in EarthServer, each of which poses distinct challenges for Earth Data Analytics: Cryospheric Science, Airborne Science, Atmospheric Science, Geology, Oceanography, and Planetary Science. Altogether, they cover all Earth Science domains; the Planetary Science use case has been added to challenge concepts and standards in non-standard environments. In addition, EarthLook (maintained by Jacobs University) showcases use of OGC standards in 1D through 5D use cases. In this contribution we will report on the first applications integrated in the EarthServer Science Gateway and on the clients for mobile appliances developed to access them. We will also show how federated and social identity services can allow Big Earth Data Providers to expose their data in a distributed environment while keeping strict and fine-grained control over user authentication and authorisation. The degree to which the EarthServer implementation fulfils the recommendations made in the recent TERENA Study on AAA Platforms For Scientific Resources in Europe (https://confluence.terena.org/display/aaastudy/AAA+Study+Home+Page) will also be assessed.
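For readers new to WCPS, the flavor of the idea: the query ships to the server, which evaluates it inside the array database, and only the small result returns to the client. In the sketch below the endpoint, coverage name, and time axis label are hypothetical placeholders, and the request parameters follow our reading of the OGC WCS Processing extension as implemented by rasdaman.

```python
import requests

# Server-side "Big Earth Data Analytics": the yearly mean of a coverage
# is computed in the database; only one number travels over the wire.
endpoint = "https://example.org/rasdaman/ows"   # hypothetical endpoint
wcps = """
for $c in (MeanTemperature)
return encode(avg($c[ansi("2012-01":"2012-12")]), "json")
"""
r = requests.post(endpoint, data={
    "service": "WCS", "version": "2.0.1",
    "request": "ProcessCoverages", "query": wcps})
print(r.text)
```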
Laycock, Alison; Bailie, Jodie; Matthews, Veronica; Cunningham, Frances; Harvey, Gillian; Percival, Nikki; Bailie, Ross
2017-01-01
Introduction: Bringing together continuous quality improvement (CQI) data from multiple health services offers opportunities to identify common improvement priorities and to develop interventions at various system levels to achieve large-scale improvement in care. An important principle of CQI is practitioner participation in interpreting data and planning evidence-based change. This study will contribute knowledge about engaging diverse stakeholders in collaborative and theoretically informed processes to identify and address priority evidence-practice gaps in care delivery. This paper describes a developmental evaluation to support and refine a novel interactive dissemination project using aggregated CQI data from Aboriginal and Torres Strait Islander primary healthcare centres in Australia. The project aims to effect multilevel system improvement in Aboriginal and Torres Strait Islander primary healthcare. Methods and analysis: Data will be gathered using document analysis, online surveys, interviews with participants and iterative analytical processes with the research team. These methods will enable real-time feedback to guide refinements to the design, reports, tools and processes as the interactive dissemination project is implemented. Qualitative data from interviews and surveys will be analysed and interpreted to provide in-depth understanding of factors that influence engagement and stakeholder perspectives about use of the aggregated data and generated improvement strategies. Sources of data will be triangulated to build up a comprehensive, contextualised perspective and an integrated understanding of the project's development, implementation and findings. Ethics and dissemination: The Human Research Ethics Committee (HREC) of the Northern Territory Department of Health and Menzies School of Health Research (Project 2015-2329), the Central Australian HREC (Project 15-288) and the Charles Darwin University HREC (Project H15030) approved the study. Dissemination will include articles in peer-reviewed journals, policy and research briefs. Results will be presented at conferences and quality improvement network meetings. Researchers, clinicians, policymakers and managers developing evidence-based system and policy interventions should benefit from this research. PMID:28710222
Instructional Implications of Inquiry in Reading Comprehension.
ERIC Educational Resources Information Center
Snow, David
A contract deliverable on the NIE Communication Skills Project, this report consists of three separate documents describing the instructional implications of the analytic and empirical work carried out for the "Classroom Instruction in Reading Comprehension" part of the project: (1) Guidelines for Phrasal Segmentation; (2) Parsing Tasks…
Philosophy Pursued through Empirical Research: Introduction to the Special Issue
ERIC Educational Resources Information Center
Wilson, Terri S.; Santoro, Doris A.
2015-01-01
Many scholars have pursued philosophical inquiry through empirical research. These empirical projects have been shaped--to varying degrees and in different ways--by philosophical questions, traditions, frameworks and analytic approaches. This issue explores the methodological challenges and opportunities involved in these kinds of projects. In…
SELECTION AND TREATMENT OF STRIPPER GAS WELLS FOR PRODUCTION ENHANCEMENT IN THE MID-CONTINENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott Reeves
2003-03-01
Stripper gas wells are an important source of domestic energy supply and under constant threat of permanent loss (shut-in) due to marginal economics. In 1998, 192 thousand stripper gas wells produced over a Tcf of gas, at an average rate of less than 16 Mcfd. This represents about 57% of all producing gas wells in the onshore lower-48 states, yet only 8% of production. Reserves of stripper gas wells are estimated to be only 1.6 Tcf, or slightly over 1% of the onshore lower-48 total (end of year 1996 data). Obviously, stripper gas wells are at the very margin of economic sustenance. As the demand for natural gas in the U.S. grows to the forecasted estimate of over 30 Tcf annually by the year 2010, supply from current conventional sources is expected to decline. Therefore, an important need exists to fully exploit known domestic resources of natural gas, including those represented by stripper gas wells. The overall objectives of this project are to develop an efficient and low-cost methodology to broadly categorize the well performance characteristics for a stripper gas field, identify the high-potential candidate wells for remediation, and diagnose the specific causes of well underperformance. With this capability, stripper gas well operators can more efficiently and economically produce these resources and maximize these gas reserves. A further objective is to identify or develop, evaluate, and test new and novel, economically viable remediation options. Finally, it is the objective of this project that all the methods and technologies developed in this project, while being tested in the Mid-Continent, be widely applicable to stripper gas wells of all types across the country. The project activities during the reporting period were: (1) Type-curve matching continued. (2) A second data collection trip to Tulsa was performed to gather information on the additional reservoirs to be included in the analysis; an updated database was created, and information for both type-curve and artificial neural network analysis was delivered to the analytic team. (3) Presentations on the project were made at the Stripper Well Consortium meetings in Oklahoma City (October 24) and Dallas (October 25). (4) Presentations on the project were made at the PTTC Marginal Well workshops in Jackson (October 30) and Wichita (November 29).
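Type-curve matching in miniature, as a hedged sketch: an Arps hyperbolic decline fitted to synthetic monthly rates, with wells producing far below their fitted curve flagged as remediation candidates. The data and the screening rule are illustrative, not the project's methodology.

```python
import numpy as np
from scipy.optimize import curve_fit

def arps(t, qi, di, b):
    """Arps hyperbolic decline: q(t) = qi / (1 + b*di*t)**(1/b)."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

t = np.arange(0, 120)                        # months on production
rng = np.random.default_rng(1)
q_obs = arps(t, 300.0, 0.05, 0.8) * rng.lognormal(0.0, 0.1, t.size)

(qi, di, b), _ = curve_fit(arps, t, q_obs, p0=[200.0, 0.1, 0.5],
                           bounds=([1.0, 1e-4, 0.01], [1e4, 2.0, 2.0]))
# Ratio below 1 means the well underperforms its own type curve.
underperformance = q_obs / arps(t, qi, di, b)
```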
Theoretical studies of tone noise from a fan rotor
NASA Technical Reports Server (NTRS)
Rao, G. V. R.; Chu, W. T.; Digumarthi, R. V.
1973-01-01
An analytical study was made of some possible rotor-alone noise sources of dipole, quadrupole and monopole character which generate discrete tone noise. Particular emphasis is given to the tone noise caused by fan inlet flow distortion and turbulence. Analytical models are developed to allow prediction of absolute levels. Experimental data measured on a small-scale fan are presented which indicate that inlet turbulence interacting with a fan rotor can be a source of tone noise. Predicted and measured tone noise for the small-scale rotor are shown to be in reasonable agreement.
Analytic model for low-frequency noise in nanorod devices.
Lee, Jungil; Yu, Byung Yong; Han, Ilki; Choi, Kyoung Jin; Ghibaudo, Gerard
2008-10-01
In this work, analytic models for the generation of excess low-frequency noise in nanorod devices such as field-effect transistors are developed. In back-gate field-effect transistors, where most of the surface area of the nanorod is exposed to the ambient, surface states could be the major source of low-frequency, or 1/f, noise via the random walk of electrons. In dual-gate transistors, the interface states and oxide traps can compete with each other as the main noise source via random walk and tunneling, respectively.
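The trap-based picture rests on a classic result: a distribution of trap time constants turns individual Lorentzian (generation-recombination) spectra into a 1/f spectrum. A short illustration of that superposition, not the authors' specific model; all parameter values are arbitrary.

```python
import numpy as np

def lorentzian_sum_psd(f, tau_min=1e-6, tau_max=1e0, n_traps=200):
    """Superpose generation-recombination Lorentzians whose time
    constants are uniform in log(tau): the textbook route by which
    distributed surface/oxide traps yield a 1/f spectrum."""
    taus = np.logspace(np.log10(tau_min), np.log10(tau_max), n_traps)
    psd = np.zeros_like(f)
    for tau in taus:
        psd += 4.0 * tau / (1.0 + (2 * np.pi * f * tau) ** 2)
    return psd / n_traps

f = np.logspace(1, 5, 100)
S = lorentzian_sum_psd(f)   # slope approaches f^-1 across the band
```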
ERIC Educational Resources Information Center
Cooper, Bruce S.
The purpose of this study is threefold: to recount the history of the Anacostia Community School Project (later renamed the Response to Educational Needs Project) in Washington, D.C. between 1967 and 1978; to analyze the events of the period in light of theories of historiographic and social scientific developments; and to provide lessons from the…
Roger D. Fight; R. James Barbour; Glenn Christensen; Guy L. Pinjuv; Rao V. Nagubadi
2004-01-01
This work was undertaken under a joint fire science project, "Assessing the need, costs, and potential benefits of prescribed fire and mechanical treatments to reduce fire hazard." This paper compares the future mix of timber products under two treatment scenarios for New Mexico. We developed and demonstrated an analytical method that uses readily available...
Teaching Discrete Mathematics Entirely from Primary Historical Sources
ERIC Educational Resources Information Center
Barnett, Janet Heine; Bezhanishvili, Guram; Lodder, Jerry; Pengelley, David
2016-01-01
We describe teaching an introductory discrete mathematics course entirely from student projects based on primary historical sources. We present case studies of four projects that cover the content of a one-semester course, and mention various other courses that we have taught with primary source projects.
Galactic wind X-ray heating of the intergalactic medium during the Epoch of Reionization
NASA Astrophysics Data System (ADS)
Meiksin, Avery; Khochfar, Sadegh; Paardekooper, Jan-Pieter; Dalla Vecchia, Claudio; Kohn, Saul
2017-11-01
The diffuse soft X-ray emissivity from galactic winds is computed during the Epoch of Reionization (EoR). We consider two analytic models, a pressure-driven wind and a superbubble model, and a 3D cosmological simulation including gas dynamics from the First Billion Years (FiBY) project. The analytic models are normalized to match the diffuse X-ray emissivity of star-forming galaxies in the nearby Universe. The cosmological simulation uses physically motivated star formation and wind prescriptions, and includes radiative transfer corrections. The models and the simulation all are found to produce sufficient heating of the intergalactic medium to be detectable by current and planned radio facilities through 21 cm measurements during the EoR. While the analytic models predict that a 21 cm emission signal relative to the cosmic microwave background sets in by z_trans ≃ 8-10, the predicted signal in the FiBY simulation remains in absorption until reionization completes. The 21 cm absorption differential brightness temperature reaches a minimum of ΔT ≃ -130 to -200 mK, depending on the model. Allowing for additional heat from high-mass X-ray binaries pushes the transition to emission to z_trans ≃ 10-12, with shallower absorption signatures having a minimum of ΔT ≃ -110 to -140 mK. The 21 cm signal may be a means of distinguishing between the wind models, with the superbubble model favouring earlier reheating. While an early transition to emission may indicate that X-ray binaries dominate the reheating, a transition to emission as early as z_trans > 12 would suggest the presence of additional heat sources.
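For scale, absorption depths of this order follow from the standard sky-averaged brightness-temperature scaling. A worked check with illustrative parameter values; the coefficient and fiducial cosmology follow the common convention (e.g. Furlanetto et al. 2006), not this paper's computation.

```python
import numpy as np

def delta_Tb_mK(z, x_HI=1.0, Ts_K=4.0, omega_b_h2=0.022, omega_m_h2=0.14):
    """Sky-averaged 21 cm differential brightness temperature (mK),
    standard approximate scaling; Ts_K is the spin temperature."""
    T_cmb = 2.725 * (1.0 + z)
    return (27.0 * x_HI * (omega_b_h2 / 0.023)
            * np.sqrt(0.15 / omega_m_h2 * (1.0 + z) / 10.0)
            * (1.0 - T_cmb / Ts_K))

# A cold (Ts ~ 4 K) fully neutral IGM at z = 9 sits deep in absorption:
print(delta_Tb_mK(9.0))   # ~ -155 mK, within the quoted -130 to -200 mK
```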
Multivariate optical element platform for compressed detection of fluorescence markers
NASA Astrophysics Data System (ADS)
Priore, Ryan J.; Swanstrom, Joseph A.
2014-05-01
The success of a commercial fluorescent diagnostic assay depends on the selection of a fluorescent biomarker; due to the broad nature of fluorescence emission profiles, only a small number of fluorescent biomarkers may be discriminated from each other for a given excitation source. Multivariate Optical Elements (MOEs) are thin-film devices that encode a broadband spectroscopic pattern, allowing a simple broadband detector to perform highly sensitive and specific detection of a target analyte. MOEs have historically been matched 1:1 to a discrete analyte or class prediction; however, MOE filter sets are capable of sensing projections of the original sparse spectroscopic space, enabling a small set of MOEs to discriminate a multitude of target analytes. This optical regression can offer real-time measurements with relatively high signal-to-noise ratios that realize the advantages of multiplexed detection and pattern recognition in a simple optical instrument. The specificity advantage of MOE-based sensors allows fluorescent biomarkers that were once incapable of discrimination from one another via optical band-pass filters to be employed in a common assay panel. A simplified MOE-based sensor may ultimately reduce the requirement for highly trained operators as well as move certain life science applications, like disease prognostication, from the laboratory to the point of care. This presentation summarizes the design and fabrication of compressed-detection MOE filter sets for detecting multiple fluorescent biomarkers simultaneously in the presence of strong spectroscopic interference, and compares the detection performance of the MOE sensor with traditional optical band-pass filter methodologies.
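Numerically, an MOE is a fixed projection of the measured spectrum: the optics compute a dot product between the emission spectrum and the element's transmission profile. A toy sketch with two overlapping Gaussian bands shows how a least-squares regression vector, standing in here for the thin-film design process, recovers the target concentration despite interference; all spectra are synthetic.

```python
import numpy as np

def moe_scores(spectra, regression_vector):
    """The optical dot product an MOE performs, done numerically."""
    return spectra @ regression_vector

rng = np.random.default_rng(0)
wl = np.linspace(400, 700, 151)                  # wavelength grid, nm
pure = np.exp(-0.5 * ((wl - 520) / 25) ** 2)     # target emission band
interf = np.exp(-0.5 * ((wl - 560) / 40) ** 2)   # overlapping interferent
y = rng.uniform(0, 1, 100)                       # target concentrations
X = np.outer(y, pure) + np.outer(rng.uniform(0, 1, 100), interf)
X += 0.01 * rng.standard_normal(X.shape)         # detector noise
b, *_ = np.linalg.lstsq(X, y, rcond=None)        # regression vector
pred = moe_scores(X, b)                          # tracks y despite overlap
```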
Nationwide Multicenter Reference Interval Study for 28 Common Biochemical Analytes in China.
Xia, Liangyu; Chen, Ming; Liu, Min; Tao, Zhihua; Li, Shijun; Wang, Liang; Cheng, Xinqi; Qin, Xuzhen; Han, Jianhua; Li, Pengchang; Hou, Li'an; Yu, Songlin; Ichihara, Kiyoshi; Qiu, Ling
2016-03-01
A nationwide multicenter study was conducted in China to explore sources of variation of reference values and establish reference intervals for 28 common biochemical analytes, as part of the International Federation of Clinical Chemistry and Laboratory Medicine, Committee on Reference Intervals and Decision Limits (IFCC/C-RIDL) global study on reference values. A total of 3148 apparently healthy volunteers were recruited in 6 cities covering a wide area of China. Blood samples were tested in 2 central laboratories using Beckman Coulter AU5800 chemistry analyzers. Certified reference materials and a value-assigned serum panel were used for standardization of test results. Multiple regression analysis was performed to explore sources of variation. The need for partition of reference intervals was evaluated based on 3-level nested ANOVA. After secondary exclusion using the latent abnormal values exclusion method, reference intervals were derived by a parametric method using the modified Box-Cox formula. Test results of 20 analytes were made traceable to reference measurement procedures. By the ANOVA, significant sex-related and age-related differences were observed in 12 and 12 analytes, respectively. A small regional difference was observed in the results for albumin, glucose, and sodium. Multiple regression analysis revealed BMI-related changes in the results of 9 analytes for men and 6 for women. Reference intervals for the 28 analytes were computed, with 17 analytes partitioned by sex and/or age. In conclusion, reference intervals for 28 common chemistry analytes applicable to the Chinese Han population were established using the latest methodology. Reference intervals of the 20 analytes traceable to reference measurement procedures can be used as common reference intervals, whereas the others can be used as assay system-specific reference intervals in China.
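The parametric step can be sketched with standard tools: Box-Cox transform toward normality, take parametric 2.5th and 97.5th percentiles, and back-transform. This is the plain-vanilla version on synthetic data; the study's modified Box-Cox formula and latent abnormal values exclusion are not reproduced.

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

def parametric_reference_interval(values, coverage=0.95):
    """Parametric central reference interval via a Box-Cox transform
    to near-normality (minimal sketch of the general approach)."""
    x = np.asarray(values, dtype=float)
    z, lam = stats.boxcox(x)            # requires strictly positive data
    alpha = (1.0 - coverage) / 2.0
    lo, hi = stats.norm.ppf([alpha, 1 - alpha], z.mean(), z.std(ddof=1))
    return inv_boxcox(lo, lam), inv_boxcox(hi, lam)

rng = np.random.default_rng(0)
print(parametric_reference_interval(rng.lognormal(3.0, 0.25, 500)))
```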
Analytical measurements of fission products during a severe nuclear accident
NASA Astrophysics Data System (ADS)
Doizi, D.; Reymond la Ruinaz, S.; Haykal, I.; Manceron, L.; Perrin, A.; Boudon, V.; Vander Auwera, J.; tchana, F. Kwabia; Faye, M.
2018-01-01
The Fukushima accident emphasized the fact that ways to monitor in real time the evolution of a nuclear reactor during a severe accident remain to be developed. For twelve days, no fission products were monitored; only dose rates were measured, which is not sufficient to carry out an online diagnosis of the event. The first measurements of low-volatility fission products were announced with little reliability. In order to improve the safety of nuclear plants and minimize the industrial, ecological and health consequences of a severe accident, it is necessary to develop new reliable measurement systems, operating as early and as close to the emission source of fission products as possible. Through the French program ANR « Projet d'Investissement d'Avenir », the aim of the DECA-PF project (diagnosis of core degradation from fission product measurements) is to monitor in real time the release of the major fission products (krypton, xenon, and gaseous forms of iodine and ruthenium) outside the nuclear reactor containment. These products are released at different times during a nuclear accident and at different states of core degradation. Thus, monitoring these fission products gives information on the situation inside the containment and helps in applying the Severe Accident Management procedures. Analytical techniques have been proposed and evaluated. The results are discussed here.
Simulation supported POD for RT test case-concept and modeling
NASA Astrophysics Data System (ADS)
Gollwitzer, C.; Bellon, C.; Deresch, A.; Ewert, U.; Jaenisch, G.-R.; Zscherpel, U.; Mistral, Q.
2012-05-01
Within the framework of the European project PICASSO, the radiographic simulator aRTist (analytical Radiographic Testing inspection simulation tool), developed by BAM, has been extended for reliability assessment of film and digital radiography. NDT of safety-relevant components in the aerospace industry requires proof of the probability of detection (POD) of the inspection. Modeling tools can reduce the expense of such extended, time-consuming NDT trials, if the result of simulation fits the experiment. Our analytic simulation tool consists of three modules for the description of the radiation source, the interaction of radiation with test pieces and flaws, and the detection process, with special focus on film and digital industrial radiography. It features high processing speed with near-interactive frame rates and a high level of realism. A concept has been developed, as well as a software extension for reliability investigations, completed by a user interface for planning automatic simulations with varying parameters and defects. Furthermore, an automatic image analysis procedure is included to evaluate defect visibility. Radiographic models built from the 3D CAD of aero-engine components and quality test samples are compared as a precondition for real trials. This enables the evaluation and optimization of film replacement for the application of modern digital equipment for economical NDT and a defined POD.
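The reliability end of such a study reduces simulated (or experimental) hit/miss trials to a POD curve. A generic sketch, not aRTist's implementation: maximum-likelihood logistic POD versus log defect size, plus the a90 size at 90% detection probability; the trial data below are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

def fit_pod(a, hits):
    """Hit/miss POD curve POD(a) = logistic(b0 + b1*ln a), fitted by
    maximum likelihood; returns the coefficients and the a90 size."""
    la = np.log(a)
    def nll(p):
        eta = p[0] + p[1] * la
        pr = np.clip(1.0 / (1.0 + np.exp(-eta)), 1e-9, 1 - 1e-9)
        return -np.sum(hits * np.log(pr) + (1 - hits) * np.log(1 - pr))
    b0, b1 = minimize(nll, x0=[0.0, 1.0]).x
    a90 = np.exp((np.log(9.0) - b0) / b1)   # logistic = 0.9 at eta = ln 9
    return (b0, b1), a90

rng = np.random.default_rng(2)
a = rng.uniform(0.1, 3.0, 200)              # defect sizes, mm (synthetic)
p_true = 1.0 / (1.0 + np.exp(-(-2.0 + 3.0 * np.log(a))))
hits = (rng.random(200) < p_true).astype(float)
params, a90 = fit_pod(a, hits)
```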
NHEXAS PHASE I ARIZONA STUDY--METALS IN WATER ANALYTICAL RESULTS
The Metals in Water data set contains analytical results for measurements of up to 11 metals in 314 water samples over 211 households. Sample collection was undertaken at the tap and any additional drinking water source used extensively within each residence. The primary metals...
Denitrification is a significant process for the removal of nitrate transported in groundwater drainage from agricultural watersheds. In this paper analytical solutions are developed for advective-reactive and nonpoint-source contaminant transport in a two-layer unconfined aquife...
100-N Area Decision Unit Target Analyte List Development for Soil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ovink, R.
2012-09-18
This report documents the process used to identify source area target analytes in support of the 100-N Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).
Hens, Koen; Berth, Mario; Armbruster, Dave; Westgard, Sten
2014-07-01
Six Sigma metrics were used to assess the analytical quality of automated clinical chemistry and immunoassay tests in a large Belgian clinical laboratory and to explore the importance of the source used for estimation of the allowable total error (TEa). Clinical laboratories are continually challenged to maintain analytical quality, but it is difficult to measure assay quality objectively and quantitatively. The Sigma metric is a single number that estimates quality based on the traditional parameters used in the clinical laboratory: allowable total error, precision, and bias. In this study, Sigma metrics were calculated for 41 clinical chemistry assays for serum and urine on five ARCHITECT c16000 chemistry analyzers. Controls at two analyte concentrations were tested, and Sigma metrics were calculated using three different TEa targets (Ricos biological variability, CLIA, and RiliBÄK). Sigma metrics varied with analyte concentration, with the TEa target, and between analyzers. Sigma values identified those assays that are analytically robust and require minimal quality control rules, and those that exhibit more variability and require more complex rules. The analyzer-to-analyzer variability was assessed on the basis of Sigma metrics. Six Sigma is a more efficient way to control quality, but the lack of TEa targets for many analytes and the sometimes inconsistent TEa targets from different sources are important variables for the interpretation and application of Sigma metrics in a routine clinical laboratory. Sigma metrics are a valuable means of comparing the analytical quality of two or more analyzers to ensure the comparability of patient test results.
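The metric itself is one line; what the study varies is the TEa source and the concentration level. A minimal sketch with hypothetical QC values, not the study's data:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (TEa - |bias|) / CV, all in percent at a given analyte
    concentration; the target changes with the TEa source
    (Ricos biological variability, CLIA, RiliBAEK)."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical glucose QC level: TEa 10% (CLIA-style), bias 1.5%, CV 1.2%
print(sigma_metric(10.0, 1.5, 1.2))   # ~7.1 sigma: analytically robust
```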