Pilot testing of SHRP 2 reliability data and analytical products: Florida [supporting datasets]
DOT National Transportation Integrated Search
2014-01-01
SHRP 2 initiated the L38 project to pilot test products from five of the program's completed projects. The products support reliability estimation and use based on data analyses, analytical techniques, and a decision-making framework. The L38 project...
Earthdata Cloud Analytics Project
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Lynnes, Chris
2018-01-01
This presentation describes a nascent project in NASA to develop a framework to support end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observing System Data and Information System) data to the cloud is to position the data next to enormous computing capacity, allowing end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and integration with non-NASA analytics or other complementary functionality at other agencies and in other nations.
NASA Technical Reports Server (NTRS)
Tavana, Madjid
1995-01-01
The evaluation and prioritization of Engineering Support Requests (ESRs) is a particularly difficult task at the Kennedy Space Center (KSC) -- Shuttle Project Engineering Office. This difficulty is due to the complexities inherent in the evaluation process and the lack of structured information. The evaluation process must consider a multitude of relevant pieces of information concerning Safety, Supportability, O&M Cost Savings, Process Enhancement, Reliability, and Implementation. Various analytical and normative models developed over the years have helped decision makers at KSC utilize large volumes of information in the evaluation of ESRs. The purpose of this project is to build on the existing methodologies and develop a multiple criteria decision support system that captures the decision maker's beliefs through a series of sequential, rational, and analytical processes. The model utilizes the Analytic Hierarchy Process (AHP), subjective probabilities, the entropy concept, and the Maximize Agreement Heuristic (MAH) to enhance the decision maker's intuition in evaluating a set of ESRs.
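The abstract names the Analytic Hierarchy Process (AHP) as the core weighting technique. As a rough illustration of the idea only, here is a minimal sketch of deriving priority weights from a pairwise comparison matrix using the geometric-mean method; the matrix values and the three criteria are hypothetical and are not taken from the KSC model.

```python
import math

def ahp_weights(matrix):
    """Priority weights from an AHP pairwise comparison matrix,
    computed with the geometric-mean (log least squares) method:
    take the geometric mean of each row, then normalize to sum 1."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical 3-criterion comparison (Safety, Cost Savings, Reliability).
# Entry [i][j] says how strongly criterion i is preferred over j (1-9 scale);
# the matrix is reciprocal: matrix[j][i] == 1 / matrix[i][j].
M = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
w = ahp_weights(M)  # weights sum to 1, ordered Safety > Cost > Reliability
```

In a full AHP application one would also check the consistency ratio of the matrix before trusting the weights; that step is omitted here for brevity.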
Hanford analytical sample projections FY 1998--FY 2002
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joyce, S.M.
1998-02-12
Analytical Services projections are compiled for the Hanford site based on inputs from the major programs for the years 1998 through 2002. Projections are categorized by radiation level, protocol, sample matrix, and program. Analysis requirements are also presented. This document summarizes the Hanford sample projections for fiscal years 1998 to 2002. Sample projections are based on inputs submitted to Analytical Services covering Environmental Restoration, Tank Waste Remediation Systems (TWRS), Solid Waste, Liquid Effluents, Spent Nuclear Fuels, Transition Projects, Site Monitoring, Industrial Hygiene, Analytical Services, and miscellaneous Hanford support activities. In addition, details on laboratory-scale technology (development) work, Sample Management, and Data Management activities are included. This information will be used by Hanford Analytical Services (HAS) and the Sample Management Working Group (SMWG) to assure that laboratories and resources are available and effectively utilized to meet these documented needs.
The Human is the Loop: New Directions for Visual Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander; Hossain, Shahriar H.; Ramakrishnan, Naren
2014-01-28
Visual analytics is the science of marrying interactive visualizations and analytic algorithms to support exploratory knowledge discovery in large datasets. We argue for a shift from a ‘human in the loop’ philosophy for visual analytics to a ‘human is the loop’ viewpoint, where the focus is on recognizing analysts’ work processes, and seamlessly fitting analytics into that existing interactive process. We survey a range of projects that provide visual analytic support contextually in the sensemaking loop, and outline a research agenda along with future challenges.
DOT National Transportation Integrated Search
2009-12-01
The goals of integration should be: supporting domain-oriented data analysis through the use of a knowledge-augmented visual analytics system. In this project, we focus on providing interactive data exploration for bridge management. ...
Wang, Jun-Wen; Liu, Yang; Tong, Yuan-Yuan; Yang, Ce; Li, Hai-Yan
2016-05-01
This study collected molecular pharmacognosy projects funded by the Natural Science Foundation of China (NSFC) from 1995 to 2014, a total of 595 items. TDA and Excel software were used to analyze the projects' general situation and research hot spots with rank-analysis and correlation-analysis methods. The number of NSFC-funded molecular pharmacognosy projects and the amount of funding increased gradually, while the proportion of funding for pharmaceutical research tended to be stable. The projects mainly supported molecular-biology studies of genuine medicinal materials, secondary metabolism, and germplasm resources. Hot drugs included Radix Salviae Miltiorrhizae, Radix Rehmanniae, and Cordyceps sinensis; hot topics included tanshinone biosynthesis and the Rehmannia glutinosa continuous cropping obstacle. Copyright© by the Chinese Pharmaceutical Association.
2010-04-01
analytical community. 5.1 Towards a Common Understanding of CD&E and CD&E Project Management. Recent developments within NATO have contributed to the... project management purposes it is useful to distinguish four phases [P 21]: a) Preparation, Initiation and Structuring; b) Concept Development Planning... examined in more detail below. While the NATO CD&E policy provides a benchmark for a comprehensive, disciplined management of CD&E projects, it may
Measuring research progress in photovoltaics
NASA Technical Reports Server (NTRS)
Jackson, B.; Mcguire, P.
1986-01-01
The role and some results of the project analysis and integration function in the Flat-plate Solar Array (FSA) Project are presented. Activities included supporting the decision-making process, preparation of plans for project direction, setting goals for project activities, measuring progress within the project, and the development and maintenance of analytical models.
Pilot testing of SHRP 2 reliability data and analytical products: Washington [supporting datasets]
DOT National Transportation Integrated Search
2014-01-01
The Washington site used the reliability guide from Project L02, analysis tools for forecasting reliability and estimating impacts from Project L07, Project L08, and Project C11 as well as the guide on reliability performance measures from the Projec...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1993 (October 1992 through September 1993). This annual report is the tenth for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has research programs in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require development or modification of methods and adaptation of techniques to obtain useful analytical data. The ACL is administratively within the Chemical Technology Division (CMT), its principal ANL client, but provides technical support for many of the technical divisions and programs at ANL. The ACL has four technical groups--Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis--which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL.
Analytical Chemistry Laboratory
NASA Technical Reports Server (NTRS)
Anderson, Mark
2013-01-01
The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis, and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development, and various technical tasks, as well as Caltech.
DOT National Transportation Integrated Search
2012-11-30
The objective of this project was to develop technical relationships between reliability improvement strategies and reliability performance metrics. This project defined reliability, explained the importance of travel time distributions for measuring...
CEDS Addresses: Rubric Elements
ERIC Educational Resources Information Center
US Department of Education, 2015
2015-01-01
Common Education Data Standards (CEDS) Version 4 introduced a common data vocabulary for defining rubrics in a data system. The CEDS elements support digital representations of both holistic and analytic rubrics. This document shares examples of holistic and analytic project rubrics, available CEDS Connections, and a logical model showing the…
Validation of urban freeway models [supporting datasets]
DOT National Transportation Integrated Search
2015-01-01
The goal of the SHRP 2 Project L33 Validation of Urban Freeway Models was to assess and enhance the predictive travel time reliability models developed in the SHRP 2 Project L03, Analytic Procedures for Determining the Impacts of Reliability Mitigati...
ERIC Educational Resources Information Center
Bobronnikov, Ellen; Rhodes, Hilary; Bradley, Cay
2010-01-01
This final report culminates the evaluation and technical assistance provided for the U.S. Department of Education's Mathematics and Science Partnership (MSP) Program and its projects since 2005. As part of this support, Abt Associates looked across the portfolio of projects funded by the MSP program to draw lessons on best practices. This…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Boparai, A.S.; Bowers, D.L.
This report summarizes the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 2000 (October 1999 through September 2000). This annual progress report, which is the seventeenth in this series for the ACL, describes effort on continuing projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The ACL operates within the ANL system as a full-cost-recovery service center, but it has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients--Argonne National Laboratory, the Department of Energy, and others--and will conduct world-class research and development in analytical chemistry and its applications. The ACL handles a wide range of analytical problems that reflects the diversity of research and development (R&D) work at ANL. Some routine or standard analyses are done, but the ACL operates more typically in a problem-solving mode in which development of methods is required or adaptation of techniques is needed to obtain useful analytical data. The ACL works with clients and commercial laboratories if a large number of routine analyses are required. Much of the support work done by the ACL is very similar to applied analytical chemistry research work.
NASA Technical Reports Server (NTRS)
Tavana, Madjid; Lee, Seunghee
1996-01-01
Objective evaluation and prioritization of engineering support requests (ESRs) is a difficult task at the Kennedy Space Center (KSC) Shuttle Project Engineering Office. The difficulty arises from the complexities inherent in the evaluation process and the lack of structured information. The purpose of this project is to implement the consensus ranking organizational support system (CROSS), a multiple criteria decision support system (DSS) developed at KSC that captures the decision maker's beliefs through a series of sequential, rational, and analytical processes. CROSS utilizes the analytic hierarchy process (AHP), subjective probabilities, the entropy concept, and the maximize agreement heuristic (MAH) to enhance the decision maker's intuition in evaluating ESRs. Some of the preliminary goals of the project are to: (1) revisit the structure of the Ground Systems Working Team (GSWT) steering committee, (2) develop a template for ESR originators to provide more complete and consistent information to the GSWT steering committee members to eliminate the need for a facilitator, (3) develop an objective and structured process for the initial screening of ESRs, (4) provide extensive training of the stakeholders and the GSWT steering committee to eliminate the need for a facilitator, (5) automate the process as much as possible, (6) create an environment to compile project success factor data on ESRs and move towards a disciplined system that could be used to address supportability threshold issues at the KSC, and (7) investigate the possibility of an organization-wide implementation of CROSS.
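Besides AHP, this abstract cites the entropy concept for weighting evaluation criteria. The sketch below illustrates the standard entropy weighting idea, in which criteria that discriminate more between alternatives receive more weight; the scores, the three ESRs, and the two criteria are invented for illustration and are not CROSS's actual data or implementation.

```python
import math

def entropy_weights(scores):
    """Objective criterion weights via the entropy method.
    scores[i][j] is the (positive) score of alternative i on criterion j.
    A criterion whose column varies more across alternatives has lower
    entropy, hence higher divergence (1 - entropy) and higher weight."""
    m = len(scores)       # number of alternatives (rows)
    n = len(scores[0])    # number of criteria (columns)
    divergences = []
    for j in range(n):
        col = [scores[i][j] for i in range(m)]
        s = sum(col)
        p = [x / s for x in col]                     # column as proportions
        e = -sum(x * math.log(x) for x in p if x > 0) / math.log(m)
        divergences.append(1.0 - e)                  # divergence of criterion j
    total = sum(divergences)
    return [d / total for d in divergences]

# Hypothetical scores for 3 ESRs on 2 criteria.
# Criterion 2 rates every ESR identically, so it carries no information
# and should receive (near-)zero weight.
scores = [
    [0.9, 0.5],
    [0.1, 0.5],
    [0.5, 0.5],
]
w = entropy_weights(scores)
```

Because the second column is uniform, its entropy is maximal and its divergence is zero, so the first criterion absorbs all of the weight in this toy example.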
Climate Data Analytics Workflow Management
NASA Astrophysics Data System (ADS)
Zhang, J.; Lee, S.; Pan, L.; Mattmann, C. A.; Lee, T. J.
2016-12-01
In this project we aim to pave a novel path toward a sustainable building block for Earth science big data analytics and knowledge sharing. By closely studying how Earth scientists conduct data analytics research in their daily work, we have developed a provenance model to record their activities and a technology to automatically generate workflows for scientists from the provenance. On top of this, we have built a prototype of a data-centric provenance repository and established a PDSW (People, Data, Service, Workflow) knowledge network to support workflow recommendation. To ensure the scalability and performance of the expected recommendation system, we have leveraged Apache OODT technology. The metrics-based performance evaluation web service will allow a user to select one of several community-approved metrics and to evaluate model performance using that metric and a reference dataset. This service will facilitate the use of reference datasets generated in support of model-data intercomparison projects such as Obs4MIPs and Ana4MIPs. The data-centric repository infrastructure will allow us to capture richer provenance to further facilitate knowledge sharing and scientific collaboration in the Earth science community. This project is part of the Apache incubator CMDA project.
The overall goal of this task is to help reduce the uncertainties in the assessment of environmental health and human exposure by better characterizing hazardous wastes through cost-effective analytical methods. Research projects are directed towards the applied development and ...
Assisting Instructional Assessment of Undergraduate Collaborative Wiki and SVN Activities
ERIC Educational Resources Information Center
Kim, Jihie; Shaw, Erin; Xu, Hao; Adarsh, G. V.
2012-01-01
In this paper we examine the collaborative performance of undergraduate engineering students who used shared project documents (Wikis, Google documents) and a software version control system (SVN) to support project collaboration. We present an initial implementation of TeamAnalytics, an instructional tool that facilitates the analyses of the…
NASA Astrophysics Data System (ADS)
Zhong, Xian-Qiong; Zhang, Xiao-Xia; Du, Xian-Tong; Liu, Yong; Cheng, Ke
2015-10-01
The approximate analytical frequency chirps and the critical distances for cross-phase modulation induced optical wave breaking (OWB) of initial hyperbolic-secant optical pulses propagating in optical fibers with quintic nonlinearity (QN) are presented. The pulse evolutions in terms of the frequency chirps, shapes, and spectra are numerically calculated in the normal dispersion regime. The results reveal that, depending on different QN parameters, traditional OWB or solitons or soliton pulse trains may occur. The approximate analytical critical distances are found to be in good agreement with the numerical ones only for traditional OWB, whereas the approximate analytical frequency chirps accord well with the numerical ones at the initial evolution stages of the pulses. Supported by the Postdoctoral Fund of China under Grant No. 2011M501402, the Key Project of Chinese Ministry of Education under Grant No. 210186, the Major Project of Natural Science Supported by the Educational Department of Sichuan Province under Grant No. 13ZA0081, the Key Project of National Natural Science Foundation of China under Grant No. 61435010, and the National Natural Science Foundation of China under Grant No. 61275039
DOE Office of Scientific and Technical Information (OSTI.GOV)
none, none; Tuchman, Nancy
The U.S. Department of Energy awarded Loyola University Chicago and the Institute of Environmental Sustainability (IES) $486,000.00 for the proposal entitled "Chicago clean air, clean water project: Environmental monitoring for a healthy, sustainable urban future." The project supported the purchase of analytical instruments for the development of an environmental analytical laboratory. The analytical laboratory is designed to support the testing of field water and soil samples for nutrients, industrial pollutants, heavy metals, and agricultural toxins, with special emphasis on testing Chicago regional soils and water affected by coal-based industry. Since the award was made in 2010, the IES has been launched (fall 2013), and the IES acquired a new state-of-the-art research and education facility on Loyola University Chicago's Lakeshore campus. Two labs were included in the research and education facility. The second floor lab is the Ecology Laboratory, where lab experiments and analyses are conducted on soil, plant, and water samples. The third floor lab is the Environmental Toxicology Lab, where lab experiments on environmental toxins are conducted, as well as analytical tests on water, soil, and plants. On the south end of the Environmental Toxicology Lab is the analytical instrumentation collection purchased from the present DOE grant, which is overseen by a full-time Analytical Chemist (hired January 2016), who maintains the instruments, conducts analyses on samples, and helps to train faculty and undergraduate and graduate student researchers.
The evaluator as technical assistant: A model for systemic reform support
NASA Astrophysics Data System (ADS)
Century, Jeanne Rose
This study explored evaluation of systemic reform. Specifically, it focused on the evaluation of a systemic effort to improve K-8 science, mathematics, and technology education. The evaluation was of particular interest because it used both technical assistance and evaluation strategies. Through studying the combination of these roles, this investigation set out to increase understanding of potentially new evaluator roles, distinguish important characteristics of the evaluator/project participant relationship, and identify how these roles and characteristics contribute to effective evaluation of systemic science education reform. This qualitative study used interviews, document analysis, and participant observation as methods of data collection. Interviews were conducted with project leaders, project participants, and evaluators and focused on the evaluation strategies and process, the use of the evaluation, and technical assistance. Documents analyzed included transcripts of evaluation team meetings and reports, memoranda, and other print materials generated by the project leaders and the evaluators. Data analysis consisted of analytic and interpretive procedures consistent with the qualitative data collected and entailed a combined process of coding transcripts of interviews and meetings, field notes, and other documents; analyzing and organizing findings; writing of reflective and analytic memos; and designing and diagramming conceptual relationships. The data analysis resulted in the development of the Multi-Function Model for Systemic Reform Support. This model organizes systemic reform support into three functions: evaluation, technical assistance, and a third, named here as "systemic perspective." These functions work together to support the project's educational goals as well as a larger goal--building capacity in project participants. This model can now serve as an informed starting point or "blueprint" for strategically supporting systemic reform.
Analytical model of tilted driver–pickup coils for eddy current nondestructive evaluation
NASA Astrophysics Data System (ADS)
Cao, Bing-Hua; Li, Chao; Fan, Meng-Bao; Ye, Bo; Tian, Gui-Yun
2018-03-01
A driver-pickup probe possesses better sensitivity and flexibility due to individual optimization of a coil. It is frequently observed in an eddy current (EC) array probe. In this work, a tilted non-coaxial driver-pickup probe above a multilayered conducting plate is analytically modeled with spatial transformation for eddy current nondestructive evaluation. Basically, the core of the formulation is to obtain the projection of magnetic vector potential (MVP) from the driver coil onto the vector along the tilted pickup coil, which is divided into two key steps. The first step is to make a projection of MVP along the pickup coil onto a horizontal plane, and the second one is to build the relationship between the projected MVP and the MVP along the driver coil. Afterwards, an analytical model for the case of a layered plate is established with the reflection and transmission theory of electromagnetic fields. The calculated values from the resulting model indicate good agreement with those from the finite element model (FEM) and experiments, which validates the developed analytical model. Project supported by the National Natural Science Foundation of China (Grant Nos. 61701500, 51677187, and 51465024).
NASA Technical Reports Server (NTRS)
Ambur, Manjula Y.; Yagle, Jeremy J.; Reith, William; McLarney, Edward
2016-01-01
In 2014, a team of researchers, engineers, and information technology specialists at NASA Langley Research Center developed a Big Data Analytics and Machine Intelligence Strategy and Roadmap as part of Langley's Comprehensive Digital Transformation Initiative, with the goal of identifying the goals, objectives, initiatives, and recommendations needed to develop near-, mid-, and long-term capabilities for data analytics and machine intelligence in aerospace domains. Since that time, significant progress has been made in developing pilots and projects in several research, engineering, and scientific domains by following the original strategy of collaboration between mission support organizations, mission organizations, and external partners from universities and industry. This report summarizes the work to date in Data Intensive Scientific Discovery, Deep Content Analytics, and Deep Q&A projects, as well as the progress made in collaboration, outreach, and education. Recommendations for continuing this success into future phases of the initiative are also made.
The Independent Technical Analysis Process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duberstein, Corey A.; Ham, Kenneth D.; Dauble, Dennis D.
2007-04-13
The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. In the past, regional parties have interacted with a single entity, the Fish Passage Center, to access the data, analyses, and coordination related to fish passage. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities.
A Tool Supporting Collaborative Data Analytics Workflow Design and Management
NASA Astrophysics Data System (ADS)
Zhang, J.; Bao, Q.; Lee, T. J.
2016-12-01
Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable to support collaborative design of such a workflow: for example, they do not support real-time co-design; track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; or capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool supporting an elastic number of groups of Earth scientists in collaboratively designing and composing data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.
Analytic innovations for air quality modeling
The presentation provides an overview of ongoing research activities at the U.S. EPA, focusing on improving long-term emission projections and the development of decision support systems for coordinated environmental, climate and energy planning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barber, F.H.; Borek, T.T.; Christopher, J.Z.
1997-12-01
Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, equipment, and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).
This is the first phase of a potentially multi-phase project aimed at identifying scientific methodologies that will lead to the development of innovative analytical tools supporting the analysis of control strategy effectiveness, namely, accountability. Significant reductions i...
CAA Annual Report Fiscal Year 1998.
1998-12-01
Studies, 3-1; Quick Reaction Analyses & Projects, 3-1; 4 TECHNOLOGY RESEARCH AND ANALYSIS SUPPORT, 4-1; Technology Research, 4-1; Methodology Research, 4-2... Publications, Graphics, and Reproduction, 5-2; 6 ANALYTICAL EFFORTS COMPLETED BETWEEN FY90 AND FY98, 6-1; Appendix A, Annual Study, Work Evaluation... future. Chapter 2 highlights major studies and analysis activities which occurred in FY98. Chapter 3 is the total package of analytical summaries
Cost and schedule analytical techniques development
NASA Technical Reports Server (NTRS)
1994-01-01
This contract provided technical services and products to the Marshall Space Flight Center's Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) for the period of 3 Aug. 1991 - 30 Nov. 1994. Accomplishments summarized here cover the REDSTAR data base, NASCOM hard copy data base, NASCOM automated data base, NASCOM cost model, complexity generators, program planning, schedules, NASA computer connectivity, other analytical techniques, and special project support.
Flat-plate solar array project. Volume 8: Project analysis and integration
NASA Technical Reports Server (NTRS)
Mcguire, P.; Henry, P.
1986-01-01
Project Analysis and Integration (PA&I) performed planning and integration activities to support management of the various Flat-Plate Solar Array (FSA) Project R&D activities. Technical and economic goals were established by PA&I for each R&D task within the project to coordinate the thrust toward the National Photovoltaic Program goals. A sophisticated computer modeling capability was developed to assess technical progress toward meeting the economic goals. These models included a manufacturing facility simulation, a photovoltaic power station simulation, and a decision aid model incorporating uncertainty. This family of analysis tools was used to track the progress of the technology and to explore the effects of alternative technical paths. Numerous studies conducted by PA&I signaled the achievement of milestones or were the foundation of major FSA project and national program decisions. The most important PA&I activities during the project history are summarized. The PA&I planning function and its relation to project direction are discussed, and the important analytical models developed by PA&I for its analysis and assessment activities are reviewed.
Analytical Chemistry Laboratory Progress Report for FY 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Boparai, A.S.; Bowers, D.L.
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, A.G.
The Pacific Northwest Laboratory (PNL)/Analytical Chemistry Laboratory (ACL) and the Westinghouse Hanford Company (WHC)/Process Analytical Laboratory (PAL) provide analytical support services to various environmental restoration and waste management projects/programs at Hanford. In response to a US Department of Energy -- Richland Field Office (DOE-RL) audit, which questioned the comparability of analytical methods employed at each laboratory, the Sample Exchange/Exchange (SEE) program was initiated. The SEE Program is a self-assessment program designed to compare analytical methods of the PAL and ACL laboratories using site-specific waste material. The SEE program is managed by a collaborative, the Quality Assurance Triad (Triad). Triad membership is made up of representatives from the WHC/PAL, PNL/ACL, and WHC Hanford Analytical Services Management (HASM) organizations. The Triad works together to design, evaluate, and implement each phase of the SEE Program.
Quest for Missing Proteins: Update 2015 on Chromosome-Centric Human Proteome Project.
Horvatovich, Péter; Lundberg, Emma K; Chen, Yu-Ju; Sung, Ting-Yi; He, Fuchu; Nice, Edouard C; Goode, Robert J; Yu, Simon; Ranganathan, Shoba; Baker, Mark S; Domont, Gilberto B; Velasquez, Erika; Li, Dong; Liu, Siqi; Wang, Quanhui; He, Qing-Yu; Menon, Rajasree; Guan, Yuanfang; Corrales, Fernando J; Segura, Victor; Casal, J Ignacio; Pascual-Montano, Alberto; Albar, Juan P; Fuentes, Manuel; Gonzalez-Gonzalez, Maria; Diez, Paula; Ibarrola, Nieves; Degano, Rosa M; Mohammed, Yassene; Borchers, Christoph H; Urbani, Andrea; Soggiu, Alessio; Yamamoto, Tadashi; Salekdeh, Ghasem Hosseini; Archakov, Alexander; Ponomarenko, Elena; Lisitsa, Andrey; Lichti, Cheryl F; Mostovenko, Ekaterina; Kroes, Roger A; Rezeli, Melinda; Végvári, Ákos; Fehniger, Thomas E; Bischoff, Rainer; Vizcaíno, Juan Antonio; Deutsch, Eric W; Lane, Lydie; Nilsson, Carol L; Marko-Varga, György; Omenn, Gilbert S; Jeong, Seul-Ki; Lim, Jong-Sun; Paik, Young-Ki; Hancock, William S
2015-09-04
This paper summarizes the recent activities of the Chromosome-Centric Human Proteome Project (C-HPP) consortium, which develops new technologies to identify yet-to-be-annotated proteins (termed "missing proteins") in biological samples that lack sufficient experimental evidence at the protein level for confident protein identification. The C-HPP also aims to identify new protein forms that may be caused by genetic variability, post-translational modifications, and alternative splicing. Proteogenomic data integration forms the basis of the C-HPP's activities; therefore, we have summarized some of the key approaches and their roles in the project. We present new analytical technologies that broaden chemical-space coverage and lower detection limits, coupled with bioinformatics tools, as well as some publicly available resources that can be used to improve data analysis or support the development of analytical assays. Most of this paper's content has been compiled from posters, slides, and discussions presented in the series of C-HPP workshops held during 2014. All data (posters, presentations) used are available at the C-HPP Wiki (http://c-hpp.webhosting.rug.nl/) and in the Supporting Information.
Evaluating child welfare policies with decision-analytic simulation models.
Goldhaber-Fiebert, Jeremy D; Bailey, Stephanie L; Hurlburt, Michael S; Zhang, Jinjin; Snowden, Lonnie R; Wulczyn, Fred; Landsverk, John; Horwitz, Sarah M
2012-11-01
The objective was to demonstrate decision-analytic modeling in support of Child Welfare policymakers considering implementing evidence-based interventions. Outcomes included permanency (e.g., adoptions) and stability (e.g., foster placement changes). Analyses of a randomized trial of KEEP (a foster-parenting intervention) and of NSCAW-1 estimated placement change rates and KEEP's effects. A microsimulation model generalized these findings to other Child Welfare systems. The model projected that KEEP could increase permanency and stability, identifying strategies targeting higher-risk children and geographical regions that achieve benefits efficiently. Decision-analytic models enable planners to gauge the value of potential implementations.
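A microsimulation of this kind can be sketched in a few lines. The cohort size, baseline placement-change rate, and intervention effect below are hypothetical placeholders, not estimates from the KEEP trial or NSCAW-1.

```python
import random

def simulate_placements(n_children, annual_change_rate, years, effect=1.0, seed=0):
    """Simulate foster-placement changes for a cohort.

    annual_change_rate: baseline probability of a placement change per year.
    effect: multiplier on that rate (<1 means the intervention reduces
            placement disruptions). All rates here are hypothetical.
    """
    rng = random.Random(seed)
    total_changes = 0
    for _ in range(n_children):
        for _ in range(years):
            if rng.random() < annual_change_rate * effect:
                total_changes += 1
    return total_changes / n_children  # mean placement changes per child

baseline = simulate_placements(5000, 0.30, 3)
with_keep = simulate_placements(5000, 0.30, 3, effect=0.7)
```

Comparing `baseline` with `with_keep` gives the projected stability gain; a real model would additionally stratify by risk group and region, as the study describes.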
Hydrogen Safety Project: Chemical analysis support task. Window "E" analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, T E; Campbell, J A; Hoppe, E W
1992-09-01
Core samples taken from tank 101-SY at Hanford during "window E" were analyzed for organic and radiochemical constituents by staff of the Analytical Chemistry Laboratory at Pacific Northwest Laboratory. Westinghouse Hanford Company submitted these samples to the laboratory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Heinrich, R.R.; Jensen, K.J.
Technical and administrative activities of the Analytical Chemistry Laboratory (ACL) are reported for fiscal year 1984. The ACL is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL is administratively within the Chemical Technology Division, the principal user, but provides technical support for all of the technical divisions and programs at ANL. The ACL has three technical groups - Chemical Analysis, Instrumental Analysis, and Organic Analysis. Under technical activities, 26 projects are briefly described. Under professional activities, lists are presented of publications and reports, oral presentations, awards, and meetings attended. 6 figs., 2 tabs.
The Ophidia framework: toward cloud-based data analytics for climate change
NASA Astrophysics Data System (ADS)
Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni
2015-04-01
The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model, and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format, compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases regarding the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects will be presented during the talk. In the former case (EUBrazilCC), the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter case (OFIDIA), the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among others, parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster.
Moreover, a cloud-based release tested with OpenNebula is also available and running in the private cloud infrastructure of the CMCC Supercomputing Centre.
Analytical Chemistry Laboratory. Progress report for FY 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Boparai, A.S.; Bowers, D.L.
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1996. This annual report is the thirteenth for the ACL. It describes effort on continuing and new projects and contributions of the ACL staff to various programs at ANL. The ACL operates in the ANL system as a full-cost-recovery service center, but has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients -- Argonne National Laboratory, the Department of Energy, and others -- and will conduct world-class research and development in analytical chemistry and its applications. Because of the diversity of research and development work at ANL, the ACL handles a wide range of analytical chemistry problems. Some routine or standard analyses are done, but the ACL usually works with commercial laboratories if our clients require high-volume, production-type analyses. It is common for ANL programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. Thus, much of the support work done by the ACL is very similar to our applied analytical chemistry research.
openECA Platform and Analytics Alpha Test Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Russell
The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.
openECA Platform and Analytics Beta Demonstration Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Russell
The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.
DOT National Transportation Integrated Search
2014-01-01
The objective of this project was to develop system designs for programs to monitor travel time reliability and to prepare a guidebook that practitioners and others can use to design, build, operate, and maintain such systems. Generally, such travel ...
Pilot testing of SHRP 2 reliability data and analytical products: Minnesota. [supporting datasets
DOT National Transportation Integrated Search
2014-01-01
The objective of this project was to develop system designs for programs to monitor travel time reliability and to prepare a guidebook that practitioners and others can use to design, build, operate, and maintain such systems. Generally, such travel ...
SWMM5 Application Programming Interface and PySWMM: A Python Interfacing Wrapper
In support of the OpenWaterAnalytics open source initiative, the PySWMM project encompasses the development of a Python interfacing wrapper to SWMM5 with parallel ongoing development of the USEPA Stormwater Management Model (SWMM5) application programming interface (API). ...
ERIC Educational Resources Information Center
Stanford Univ., CA.
Recognizing the need to balance generality and economy in system costs, the Project INFO team at Stanford University, in developing OASIS, has sought to provide generalized and powerful computer support within the normal range of operating and analytical requirements associated with university administration. The specific design objectives of the OASIS…
Cretini, K.F.; Steyer, G.D.
2011-01-01
The Coastwide Reference Monitoring System (CRMS) program was established to assess the effectiveness of individual coastal restoration projects and the cumulative effects of multiple projects at regional and coastwide scales. In order to make these assessments, analytical teams have been assembled for each of the primary data types sampled under the CRMS program, including vegetation, hydrology, landscape, and soils. These teams consist of scientists and support staff from the U.S. Geological Survey and other Federal agencies, the Louisiana Office of Coastal Protection and Restoration, and university academics. Each team is responsible for developing or identifying parameters, indices, or tools that can be used to assess coastal wetlands at various scales. The CRMS Vegetation Analytical Team has developed a Floristic Quality Index for coastal Louisiana to determine the quality of a wetland based on its plant species composition and abundance.
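The classic Floristic Quality Index on which such assessments build can be computed in a couple of lines. The species and coefficients of conservatism below are illustrative only, and the CRMS index adapts the classic formula for coastal Louisiana vegetation data in ways not shown here.

```python
import math

# Hypothetical coefficients of conservatism (CC, on a 0-10 scale) for the
# species recorded at one plot; values are illustrative, not from CRMS.
cc = {
    "Spartina alterniflora": 6,
    "Phragmites australis": 1,
    "Juncus roemerianus": 7,
}

n = len(cc)                        # species richness
mean_c = sum(cc.values()) / n      # mean coefficient of conservatism
fqi = mean_c * math.sqrt(n)        # classic FQI = mean CC x sqrt(richness)
```

Higher FQI values indicate a plant community dominated by conservative (disturbance-intolerant) species, which is the wetland-quality signal the analytical team's index is designed to capture.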
2010-04-01
available [11]. Additionally, Table 3 is a guide for the DMAIC methodology, including 29 different methods [12]. Table 3: DMAIC Methodology (5-Phase Methodology): Define, Measure, Analyze, Improve, Control; example tools include a Project Charter, a Prioritization Matrix, and 5 Whys Analysis [13]. Develop performance priorities: this is a preliminary stage that precedes specific improvement projects, and the aim
NASA Astrophysics Data System (ADS)
Fiore, Sandro; Williams, Dean; Aloisio, Giovanni
2016-04-01
In many scientific domains, such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives in order to be properly stored, accessed, analysed, and visualized. Moreover, new challenges arise in large-scale scenarios and ecosystems where petabytes (PB) of data can be available and data can be distributed and/or replicated (e.g., the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5 PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5)). Most of the tools currently available for scientific data analysis in the climate domain fail at large scale since they: (1) are desktop-based and need the data locally; (2) are sequential, so do not benefit from available multicore/parallel machines; (3) do not provide declarative languages to express scientific data analysis tasks; (4) are domain-specific, which ties their adoption to a specific domain; and (5) do not provide workflow support to enable the definition of complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes ("datacubes"). The project relies on a strong background in high-performance database management and OLAP systems to manage large scientific data sets. It also provides native workflow management support to define processing chains and workflows with tens to hundreds of data analytics operators to build real scientific use cases.
With regard to interoperability aspects, the talk will present the contribution provided both to the RDA Working Group on Array Databases, and the Earth System Grid Federation (ESGF) Compute Working Team. Also highlighted will be the results of large scale climate model intercomparison data analysis experiments, for example: (1) defined in the context of the EU H2020 INDIGO-DataCloud project; (2) implemented in a real geographically distributed environment involving CMCC (Italy) and LLNL (US) sites; (3) exploiting Ophidia as server-side, parallel analytics engine; and (4) applied on real CMIP5 data sets available through ESGF.
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.
2017-12-01
NASA's efforts to advance climate analytics-as-a-service are making new capabilities available to the research community: (1) a full-featured Reanalysis Ensemble Service (RES) comprising monthly means data from multiple reanalysis data sets, accessible through an enhanced set of extraction, analytic, arithmetic, and intercomparison operations. The operations are made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; (2) a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables. This near-real-time capability enables advanced technologies like Spark and Hadoop-based MapReduce analytics over native NetCDF files; and (3) a WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation systems such as ESGF. The Reanalysis Ensemble Service includes the following:
- A new API that supports full temporal, spatial, and grid-based resolution services with sample queries
- A Docker-ready RES application to deploy across platforms
- Extended capabilities that enable single- and multiple-reanalysis area average, vertical average, re-gridding, standard deviation, and ensemble averages
- Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic sub-setting and arithmetic operations (e.g., avg, sum, max, min, var, count, anomaly)
- Full support for the MERRA-2 reanalysis dataset in addition to ECMWF ERA-Interim, NCEP CFSR, JMA JRA-55, and NOAA/ESRL 20CR…
- A Jupyter notebook-based distribution mechanism designed for client use cases that combines CDSlib documentation with interactive scenarios and personalized project management
- Supporting analytic services for NASA GMAO Forward Processing datasets
- Basic uncertainty quantification services that combine heterogeneous ensemble products with comparative observational products (e.g., reanalysis, observational, visualization)
- The ability to compute and visualize multiple reanalyses for ease of intercomparison
- Automated tools to retrieve and prepare data collections for analytic processing
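Operations such as area averaging and anomaly computation, of the kind such a service exposes, can be sketched with plain NumPy. The grid, latitudes, and data below are synthetic stand-ins, not the service's actual API or datasets.

```python
import numpy as np

# Hypothetical monthly-means cube: (time=24 months, lat=3, lon=4).
rng = np.random.default_rng(2)
data = rng.random((24, 3, 4))

# Area average with cosine-latitude weighting, as a reanalysis service might do.
lats = np.deg2rad(np.array([-30.0, 0.0, 30.0]))
wts = np.cos(lats)
area_avg = (data * wts[None, :, None]).sum(axis=(1, 2)) / (wts.sum() * data.shape[2])

# Monthly-climatology anomaly: subtract each calendar month's multi-year mean.
clim = data.reshape(2, 12, 3, 4).mean(axis=0)   # 12-month climatology
anom = data - np.tile(clim, (2, 1, 1))
```

Server-side services perform exactly this kind of reduction next to the data, so that only the small `area_avg` or `anom` result, not the full cube, travels to the client.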
University of Michigan Physics Department: E[superscript2]Coach
ERIC Educational Resources Information Center
EDUCAUSE, 2014
2014-01-01
The E[superscript 2]Coach project from the Department of Physics at the University of Michigan (UM) addresses the challenge of providing individual student support in high-enrollment introductory science courses. This web application employs tailored communications technology, course experiences, student data, and analytics to deliver customized…
DOT National Transportation Integrated Search
2005-02-01
The response of a concrete filled, steel pipe pile-to-concrete pile cap connection subjected to extreme lateral loads was experimentally and analytically investigated in this project. This connection is part of a bridge support system used by the...
Cumulative biological impacts framework for solar energy projects in the California Desert
Davis, Frank W.; Kreitler, Jason R.; Soong, Oliver; Stoms, David M.; Dashiell, Stephanie; Hannah, Lee; Wilkinson, Whitney; Dingman, John
2013-01-01
This project developed analytical approaches, tools and geospatial data to support conservation planning for renewable energy development in the California deserts. Research focused on geographical analysis to avoid, minimize and mitigate the cumulative biological effects of utility-scale solar energy development. A hierarchical logic model was created to map the compatibility of new solar energy projects with current biological conservation values. The research indicated that the extent of compatible areas is much greater than the estimated land area required to achieve 2040 greenhouse gas reduction goals. Species distribution models were produced for 65 animal and plant species that were of potential conservation significance to the Desert Renewable Energy Conservation Plan process. These models mapped historical and projected future habitat suitability using 270 meter resolution climate grids. The results were integrated into analytical frameworks to locate potential sites for offsetting project impacts and evaluating the cumulative effects of multiple solar energy projects. Examples applying these frameworks in the Western Mojave Desert ecoregion show the potential of these publicly-available tools to assist regional planning efforts. Results also highlight the necessity to explicitly consider projected land use change and climate change when prioritizing areas for conservation and mitigation offsets. Project data, software and model results are all available online.
Advanced Design Features of APR1400 and Realization in Shin Kori Construction Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
OH, S.J.; Park, K.C.; Kim, H.G.
2006-07-01
APR1400 adopted several advanced design features. To ensure their proper operation as part of the ShinKori 3,4 project, both experimental and analytical work are continuing. In this paper, work on the advanced design features related to enhanced safety is examined. The APR1400 safety injection system consists of four independent trains, which include four safety injection pumps and tanks. A passive flow-regulating device, called a fluidic device, is installed in the safety injection tanks. Separate-effect tests, including full-scale fluidic device tests, have been conducted. Integral system tests are in progress. Combining this work with analytical work using RELAP5/Mod3 should ensure the proper operation of the new safety injection systems. To mitigate severe accidents, a hydrogen mitigation system using PARs and igniters is adopted. Also, an active injection system and a streamlined insulation design are adopted to enhance the in-vessel retention capability with the external cooling of the RPV strategy. Analytic work with supporting experiments is performed. We are confident that this preparatory work will help the successful adoption of the advanced design features (ADF) in the ShinKori project. (authors)
FY2017 Analysis Annual Progress Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
During fiscal year 2017, the U.S. Department of Energy Vehicle Technologies Office (VTO) funded analysis projects supportive of VTO’s goals to pursue early stage research in vehicle and mobility system technologies to reduce petroleum dependence, increase energy reliability and security, improve transportation affordability, and promote economic growth. VTO analysis projects result in a foundation of data, analytical models, and applied analyses that provide insights into critical transportation energy problems and assist in research investment prioritization and portfolio planning.
Technology to improve quality and accountability.
Kay, Jonathan
2006-01-01
A body of evidence has been accumulated to demonstrate that current practice is not sufficiently safe for several stages of central laboratory testing. In particular, while analytical and perianalytical steps that take place within the laboratory are subjected to quality control procedures, this is not the case for several pre- and post-analytical steps. The ubiquitous application of auto-identification technology seems to represent a valuable tool for reducing error rates. A series of projects in Oxford has attempted to improve processes which support several areas of laboratory medicine, including point-of-care testing, blood transfusion, delivery and interpretation of reports, and support of decision-making by clinicians. The key tools are auto-identification, Internet communication technology, process re-engineering, and knowledge management.
NASA Astrophysics Data System (ADS)
Arsenault, Louis-Francois; Neuberg, Richard; Hannah, Lauren A.; Millis, Andrew J.
We present a machine learning-based statistical regression approach to the inversion of Fredholm integrals of the first kind by studying an important example for the quantum materials community, the analytical continuation problem of quantum many-body physics. It involves reconstructing the frequency dependence of physical excitation spectra from data obtained at specific points in the complex frequency plane. The approach provides a natural regularization in cases where the inverse of the Fredholm kernel is ill-conditioned and yields robust error metrics. The stability of the forward problem permits the construction of a large database of input-output pairs. Machine learning methods applied to this database generate approximate solutions which are projected onto the subspace of functions satisfying relevant constraints. We show that for low input noise the method performs as well as or better than Maximum Entropy (MaxEnt) under standard error metrics, and is substantially more robust to noise. We expect the methodology to be similarly effective for any problem involving a formally ill-conditioned inversion, provided that the forward problem can be efficiently solved. AJM was supported by the Office of Science of the U.S. Department of Energy under Subcontract No. 3F-3138 and LFA by the Columbia University IDS-ROADS project, UR009033-05, which also provided partial support to RN and LH.
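The pipeline the abstract describes (stable forward problem → database of input-output pairs → learned regression → projection onto constraints) can be sketched with a toy smoothing kernel standing in for the analytic-continuation kernel. Everything below (kernel, spectrum model, ridge regression, noise levels) is an illustrative assumption, not the authors' actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discretized Fredholm kernel K (forward problem: g = K @ f).
# A Gaussian smoothing kernel stands in for the ill-conditioned physical kernel.
n = 50
x = np.linspace(0, 1, n)
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.01)

def random_spectrum():
    """Random smooth nonnegative 'spectrum' normalized to unit sum."""
    f = np.zeros(n)
    for _ in range(3):
        c, w = rng.uniform(0.2, 0.8), rng.uniform(0.02, 0.1)
        f += rng.uniform(0.5, 1.5) * np.exp(-((x - c) ** 2) / w**2)
    return f / f.sum()

# Database of input-output pairs via the (stable) forward problem.
F = np.array([random_spectrum() for _ in range(2000)])
G = F @ K.T + 1e-4 * rng.standard_normal((2000, n))

# Ridge regression learned from the database: f_hat = g @ W.
lam = 1e-3
W = np.linalg.solve(G.T @ G + lam * np.eye(n), G.T @ F)

# Predict on a fresh example, then project onto the constraint set
# (nonnegativity and unit normalization), as in the described approach.
f_true = random_spectrum()
g_obs = K @ f_true + 1e-4 * rng.standard_normal(n)
f_hat = np.clip(g_obs @ W, 0, None)
f_hat /= f_hat.sum()
```

The regularization here comes implicitly from the database: the regression only ever sees spectra from the physically plausible family, which tames the ill-conditioned directions of `K`.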
NASA Astrophysics Data System (ADS)
Xu, Yao; Zhang, Chun-Hui; Niebur, Ernst; Wang, Jun-Song
2018-04-01
Not available. Project supported by the National Natural Science Foundation of China (Grant No. 61473208), the Tianjin Research Program of Application Foundation and Advanced Technology, China (Grant No. 15JCYBJC47700), the National Institutes of Health, USA (Grant Nos. R01DA040990 and R01EY027544), and the Project of Humanities and Social Sciences from the Ministry of Education, China (Grant No. 17YJAZH092).
Phase retrieval with Fourier-weighted projections.
Guizar-Sicairos, Manuel; Fienup, James R
2008-03-01
In coherent lensless imaging, the presence of image sidelobes, which arise as a natural consequence of the finite nature of the detector array, was recognized early as a convergence issue for phase retrieval algorithms that rely on an object-support constraint. To mitigate the problem of truncated far-field measurements, a controlled analytic continuation by means of an iterative transform algorithm with weighted projections is proposed and tested. This approach avoids the use of sidelobe-reduction windows and achieves full-resolution reconstructions.
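The unweighted iterative transform algorithm that this work builds on can be sketched as an error-reduction loop alternating between the Fourier-magnitude projection and the object-support projection. The object, support, and iteration count below are synthetic; the paper's contribution, weighting the Fourier projection to handle truncated detector measurements, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic nonnegative object with known support; the "measurement" is the
# far-field (Fourier) magnitude.
n = 32
support = np.zeros((n, n), dtype=bool)
support[8:24, 8:24] = True
obj = np.where(support, rng.random((n, n)), 0.0)
measured_mag = np.abs(np.fft.fft2(obj))

def fourier_error(g):
    """Relative mismatch between the iterate's and the measured magnitudes."""
    return (np.linalg.norm(np.abs(np.fft.fft2(g)) - measured_mag)
            / np.linalg.norm(measured_mag))

g = rng.random((n, n)) * support        # random start inside the support
err_start = fourier_error(g)
for _ in range(200):
    G = np.fft.fft2(g)
    G = measured_mag * np.exp(1j * np.angle(G))    # impose measured magnitudes
    g = np.real(np.fft.ifft2(G))
    g = np.where(support, np.clip(g, 0, None), 0)  # impose support + nonnegativity
err_end = fourier_error(g)
```

With a truncated detector, some entries of `measured_mag` are simply unknown; weighting the magnitude projection so those frequencies float freely is the analytic-continuation idea the abstract refers to.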
Management Support Services. Volume 2.
1983-10-01
ANALYTICS, INC. Project title: Cost Analysis. Project number: DRCFM-0002. DTIC number: not reported. Narrative: the contractor will provide cost... Army Reserve Public Awareness Activity.
Linking climate change and fish conservation efforts using spatially explicit decision support tools
Douglas P. Peterson; Seth J. Wenger; Bruce E. Rieman; Daniel J. Isaak
2013-01-01
Fisheries professionals are increasingly tasked with incorporating climate change projections into their decisions. Here we demonstrate how a structured decision framework, coupled with analytical tools and spatial data sets, can help integrate climate and biological information to evaluate management alternatives. We present examples that link downscaled climate...
ERIC Educational Resources Information Center
Haneda, Mari; Teemant, Annela; Sherman, Brandon
2017-01-01
We investigate the instructional coaching interactions between a kindergarten teacher and an experienced coach using the analytic lens of dialogic teaching. The data were collected in the context of a US professional development project that supports urban elementary school teachers in enacting critical sociocultural teaching practices. We…
DOT National Transportation Integrated Search
1998-06-01
The response of a concrete filled, steel pipe pile-to-concrete pile cap connection subjected to extreme lateral loads was experimentally and analytically investigated in this project. This connection is part of a bridge support system used by the Mon...
Solid waste projection model: Model version 1. 0 technical reference manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilkins, M.L.; Crow, V.L.; Buska, D.E.
1990-11-01
The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC). The SWPM system provides a modeling and analysis environment that supports decisions in the process of evaluating various solid waste management alternatives. This document, one of a series describing the SWPM system, contains detailed information regarding the software utilized in developing Version 1.0 of the modeling unit of SWPM. This document is intended for use by experienced software engineers and supports programming, code maintenance, and model enhancement. Those interested in using SWPM should refer to the SWPM Model User's Guide. This document is available from either the PNL project manager (D. L. Stiles, 509-376-4154) or the WHC program monitor (B. C. Anderson, 509-373-2796). 8 figs.
The PANDA tests for SBWR certification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Varadi, G.; Dreier, J.; Bandurski, Th.
1996-03-01
The ALPHA project is centered around the experimental and analytical investigation of the long-term decay heat removal from the containments of the next generation of "passive" ALWRs. The project includes integral system tests in the large-scale (1:25 in volume) PANDA facility as well as several other series of tests and supporting analytical work. The first series of experiments to be conducted in PANDA have become a required experimental element in the certification process for the General Electric Simplified Boiling Water Reactor (SBWR). The PANDA general experimental philosophy, facility design, scaling, and instrumentation are described. Steady-state PCCS condenser performance tests and extensive facility characterization tests were already conducted. The transient system behavior tests are underway; preliminary results from the first transient test M3 are reviewed.
Pedwell, Rhianna K; Fraser, James A; Wang, Jack T H; Clegg, Jack K; Chartres, Jy D; Rowland, Susan L
2018-01-31
Course-integrated Undergraduate Research Experiences (CUREs) involve large numbers of students in real research. We describe a late-year microbiology CURE in which students use yeast to address a research question around beer brewing or synthesizing biofuel; the interdisciplinary student-designed project incorporates genetics, bioinformatics, biochemistry, analytical chemistry, and microbiology. Students perceived significant learning gains around multiple technical and "becoming a scientist" aspects of the project. The project is demanding for both the students and the academic implementers. We examine the rich landscape of support and interaction that this CURE both encourages and requires while also considering how we can support the exercise better and more sustainably. © 2018 The International Union of Biochemistry and Molecular Biology.
Semi-analytical method of calculating the electrostatic interaction of colloidal solutions
NASA Astrophysics Data System (ADS)
Tian, Hongqing; Lian, Zengju
2017-01-01
Project supported by the National Natural Science Foundation of China (Grant No. 11304169), the Natural Science Foundation of Ningbo City, China (Grant No. 2012A610178), the Open Foundation of the Most Important Subjects of Zhejiang Province, China (Grant No. xkzwl1505), and K. C. Wong Magna Fund in Ningbo University of China.
ERIC Educational Resources Information Center
Warash, Bobbie; Curtis, Reagan; Hursh, Dan; Tucci, Vicci
2008-01-01
We focus on integrating developmentally appropriate practices, the project approach of Reggio Emilia, and a behavior analytic model to support a quality preschool environment. While the above practices often are considered incompatible, we have found substantial overlap and room for integration of these perspectives in practical application. With…
Prioritizing sewer rehabilitation projects using AHP-PROMETHEE II ranking method.
Kessili, Abdelhak; Benmamar, Saadia
2016-01-01
The aim of this paper is to develop a methodology for prioritizing sewer rehabilitation projects in the Algiers (Algeria) sewer network, supporting the National Sanitation Office in deciding which projects to undertake first. The methodology applies multiple-criteria decision making. The study includes 47 projects (collectors) and 12 criteria to evaluate them. These criteria represent the different issues considered in the prioritization of the projects: structural, hydraulic, environmental, financial, social, and technical. The analytic hierarchy process (AHP) is used to determine the weights of the criteria, and the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE II) method is used to obtain the final ranking of the projects. The model was verified using the sewer data of Algiers. The results show that the method can be used for prioritizing sewer rehabilitation projects.
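The AHP-plus-PROMETHEE II pipeline described in this abstract can be sketched generically. This is not the authors' implementation: the pairwise comparison matrix, the project scores, the geometric-mean approximation of AHP priorities, and the "usual" (strict-dominance) preference function below are all illustrative assumptions.

```python
import numpy as np

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the geometric-mean method."""
    gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[0])
    return gm / gm.sum()

def promethee_ii(scores, weights):
    """Net outranking flows using the 'usual' preference function.

    scores: (n_projects, n_criteria), higher is better on every criterion.
    """
    n = scores.shape[0]
    phi_plus = np.zeros(n)
    phi_minus = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            # Weighted preference of a over b: 1 where a strictly beats b.
            pref = np.where(scores[a] > scores[b], 1.0, 0.0) @ weights
            phi_plus[a] += pref
            phi_minus[b] += pref
    return (phi_plus - phi_minus) / (n - 1)

# Hypothetical pairwise comparisons for three criteria (Saaty's 1-9 scale).
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])
w = ahp_weights(pairwise)

# Hypothetical normalized scores for four projects on the three criteria.
scores = np.array([[0.9, 0.2, 0.4],
                   [0.5, 0.8, 0.6],
                   [0.3, 0.4, 0.9],
                   [0.7, 0.6, 0.1]])
net_flow = promethee_ii(scores, w)
ranking = np.argsort(-net_flow)  # best project first
```

In a full PROMETHEE II study each criterion would get its own preference function and thresholds; the strict-dominance function here is the simplest member of that family.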
NASA Astrophysics Data System (ADS)
Mao, Bin-Bin; Liu, Maoxin; Wu, Wei; Li, Liangsheng; Ying, Zu-Jian; Luo, Hong-Gang
2018-05-01
Project supported by the National Natural Science Foundation of China (Grant Nos. 11674139, 11604009, and 11704025), the Program for Changjiang Scholars and Innovative Research Team in University, China (Grant No. IRT-16R35), and the Fundamental Research Funds for the Central Universities, China. ZJY also acknowledges the financial support of the Future and Emerging Technologies (FET) programme within the Seventh Framework Programme for Research of the European Commission, under FET-Open Grant No. 618083 (CNTQC).
Information support for decision making on dispatching control of water distribution in irrigation
NASA Astrophysics Data System (ADS)
Yurchenko, I. F.
2018-05-01
The research develops a technique for supporting decision making in the on-line, operational management of water allocation for interfarm irrigation projects, based on analytical patterns of dispatcher control. The technique increases labour productivity and management quality through a higher level of automation and through decision-making optimization that takes into account diagnostics of the issues, classification of solutions, and the information required by the decision makers.
The Earth Data Analytic Services (EDAS) Framework
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; Duffy, D.
2017-12-01
Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.
The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change
NASA Astrophysics Data System (ADS)
Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.
2015-12-01
The Ophidia project is a research effort on big data analytics addressing scientific data analysis challenges in multiple domains (e.g. climate change). It provides a "datacube-oriented" framework for atomically processing and manipulating scientific datasets, offering a common way to run distributed tasks on large sets of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background in high performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g. OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework in Python-based ecosystems/applications (e.g. IPython) and the straightforward adoption of a strong set of related libraries (e.g. SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented.
Such workflows will be distributed across multiple sites - according to the datasets distribution - and will include intercomparison, ensemble, and outlier analysis. The two-level workflow solution envisioned in INDIGO (coarse grain for distributed tasks orchestration, and fine grain, at the level of a single data analytics cluster instance) will be presented and discussed.
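The "datacube-oriented" processing described above can be illustrated with a toy reduction. This is not an Ophidia operator: it is a generic numpy sketch that averages a (time, lat, lon) cube over time, accumulating chunk by chunk the way a server-side operator might work across data fragments.

```python
import numpy as np

# Synthetic "datacube": 10 years of monthly fields on an 18x36 grid.
cube = np.random.default_rng(2).normal(size=(120, 18, 36))

def chunked_time_mean(cube, chunk=12):
    """Mean over the time axis, accumulated one chunk of fragments at a time."""
    acc = np.zeros(cube.shape[1:])
    for start in range(0, cube.shape[0], chunk):
        acc += cube[start:start + chunk].sum(axis=0)
    return acc / cube.shape[0]

climatology = chunked_time_mean(cube)  # equals cube.mean(axis=0)
```

The chunked accumulation is the point: each fragment can live on a different storage node, and only the small running sum crosses fragment boundaries.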
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindberg, Michael J.
2010-09-28
Between October 14, 2009 and February 22, 2010, sediment samples were received from the 100-BC Decision Unit for geochemical studies. This is an analytical data report for sediments received from CHPRC at the 100-BC-5 OU. The analyses for this project were performed at the 325 Building located in the 300 Area of the Hanford Site. The analyses were performed according to Pacific Northwest National Laboratory (PNNL) approved procedures and/or nationally recognized test procedures. The data sets include the sample identification numbers, analytical results, estimated quantification limits (EQL), and quality control data. The preparatory and analytical quality control requirements, calibration requirements, acceptance criteria, and failure actions are defined in the on-line QA plan 'Conducting Analytical Work in Support of Regulatory Programs' (CAW). This QA plan implements the Hanford Analytical Services Quality Assurance Requirements Documents (HASQARD) for PNNL.
Digital teaching tools and global learning communities.
Williams, Mary; Lockhart, Patti; Martin, Cathie
2015-01-01
In 2009, we started a project to support the teaching and learning of university-level plant sciences, called Teaching Tools in Plant Biology. Articles in this series are published by the plant science journal, The Plant Cell (published by the American Society of Plant Biologists). Five years on, we investigated how the published materials are being used through an analysis of the Google Analytics pageviews distribution and through a user survey. Our results suggest that this project has had a broad, global impact in supporting higher education, and also that the materials are used differently by individuals in terms of their role (instructor, independent learner, student) and geographical location. We also report on our ongoing efforts to develop a global learning community that encourages discussion and resource sharing.
Hanford Internal Dosimetry Project manual. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carbaugh, E.H.; Bihl, D.E.; MacLellan, J.A.
1994-07-01
This document describes the Hanford Internal Dosimetry Project as it is administered by Pacific Northwest Laboratory (PNL) in support of the US Department of Energy and its Hanford contractors. Project services include administering the bioassay monitoring program; evaluating and documenting assessments of potential intakes and internal dose; ensuring that analytical laboratories conform to requirements; selecting and applying appropriate models and procedures for evaluating radionuclide deposition and the resulting dose; and technically guiding and supporting Hanford contractors in matters regarding internal dosimetry. Specific chapters deal with the following subjects: practices of the project, including interpretation of applicable DOE Orders, regulations, and guidance into criteria for assessment, documentation, and reporting of doses; assessment of internal dose, including summary explanations of when and how assessments are performed; recording and reporting practices for internal dose; selection of workers for bioassay monitoring and establishment of type and frequency of bioassay measurements; capability and scheduling of bioassay monitoring services; recommended dosimetry response to potential internal exposure incidents; and quality control and quality assurance provisions of the program.
A New Project-Based Lab for Undergraduate Environmental and Analytical Chemistry
ERIC Educational Resources Information Center
Adami, Gianpiero
2006-01-01
A new project-based lab was developed for third year undergraduate chemistry students based on real world applications. The experience suggests that the total analytical procedure (TAP) project offers a stimulating alternative for delivering science skills and developing a greater interest for analytical chemistry and environmental sciences and…
ENVIRONMENTAL EVALUATION FOR UTILIZATION OF ASH IN SOIL STABILIZATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
David J. Hassett; Loreal V. Heebink
2001-08-01
The Minnesota Pollution Control Agency (MPCA) approved the use of coal ash in soil stabilization while indicating that supporting environmental data needed to be generated. The overall project goal is to evaluate the potential for release of constituents into the environment from ash used in soil stabilization projects. Supporting objectives are: (1) to ensure sample integrity through implementation of a sample collection, preservation, and storage protocol that avoids analyte concentration or loss; (2) to evaluate the potential of each component (ash, soil, water) of the stabilized soil to contribute to environmental release of analytes of interest; (3) to use laboratory leaching methods to evaluate the potential for release of constituents to the environment; and (4) to facilitate collection of, and to evaluate, samples from a field runoff demonstration effort. The results of this study indicated limited mobility of the coal combustion fly ash constituents in laboratory tests and the field runoff samples. The results presented support previous work showing little to negligible impact on water quality. This and past work indicate that soil stabilization is an environmentally beneficial CCB utilization application, as encouraged by the U.S. Environmental Protection Agency. This project addressed the regulatory-driven environmental aspect of fly ash use for soil stabilization, but the demonstrated engineering performance and economic advantages also indicate that the use of CCBs in soil stabilization can and should become an accepted engineering option.
NASA Astrophysics Data System (ADS)
Han, Shenchao; Yang, Yanchun; Liu, Yude; Zhang, Peng; Li, Siwei
2018-01-01
Changing the distributed heat supply system is an effective way to reduce winter haze, so a comprehensive index system and a scientific evaluation method for distributed heat supply projects are essential. First, the influence factors of the heating modes were studied, and an index system covering economic, environmental, risk, and flexibility dimensions was built, with all indexes quantified. Second, a comprehensive evaluation method based on AHP was put forward to analyze the proposed index system. Finally, a case study suggested that supplying heat with electricity has great advantages and promotional value. The comprehensive index system and evaluation method presented here can evaluate distributed heat supply projects effectively and provide scientific support for choosing among them.
Lee, Chi-Ching; Chen, Yi-Ping Phoebe; Yao, Tzu-Jung; Ma, Cheng-Yu; Lo, Wei-Cheng; Lyu, Ping-Chiang; Tang, Chuan Yi
2013-04-10
Sequencing of microbial genomes is important because microbes carry genes for antibiotic resistance and pathogenesis. However, even with the help of new assembly software, finishing a whole genome is a time-consuming task. In most bacteria, pathogenic or antibiotic genes are carried on genomic islands. Therefore, a quick genomic island (GI) prediction method is useful for ongoing sequencing genomes. In this work, we built a Web server called GI-POP (http://gipop.life.nthu.edu.tw) which integrates a sequence assembling tool, a functional annotation pipeline, and a high-performance GI predicting module, using a support vector machine (SVM)-based method called genomic island genomic profile scanning (GI-GPS). The draft genomes of ongoing genome projects, in contigs or scaffolds, can be submitted to our Web server, which returns functional annotation and highly probable GI predictions. GI-POP is a comprehensive annotation Web server designed for ongoing genome project analysis. Researchers can perform annotation and obtain pre-analytic information including possible GIs, coding/non-coding sequences, and functional analysis from their draft genomes. This pre-analytic system can provide useful information for finishing a genome sequencing project. Copyright © 2012 Elsevier B.V. All rights reserved.
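The SVM-based classification idea behind a GI predictor can be illustrated generically. The sketch below is not the GI-GPS algorithm: it trains a scikit-learn SVM on k-mer composition of synthetic sequence windows, exploiting the simplified observation that genomic islands often differ in nucleotide composition from the host backbone. All sequences and labels here are synthetic.

```python
from itertools import product

import numpy as np
from sklearn.svm import SVC

K = 3
KMERS = ["".join(p) for p in product("ACGT", repeat=K)]
INDEX = {km: i for i, km in enumerate(KMERS)}

def kmer_profile(seq):
    """Normalized k-mer frequency vector for one genomic window."""
    counts = np.zeros(len(KMERS))
    for i in range(len(seq) - K + 1):
        j = INDEX.get(seq[i:i + K])
        if j is not None:  # skip k-mers containing ambiguous bases
            counts[j] += 1
    total = counts.sum()
    return counts / total if total else counts

rng = np.random.default_rng(0)

def random_window(n, at_rich=False):
    """Synthetic window; 'island-like' windows are AT-rich in this toy."""
    p = [0.35, 0.15, 0.15, 0.35] if at_rich else [0.25] * 4
    return "".join(rng.choice(list("ACGT"), size=n, p=p))

# Training windows: label 1 = island-like, 0 = backbone-like.
X = [kmer_profile(random_window(500, at_rich=(i % 2 == 0))) for i in range(40)]
y = [1 if i % 2 == 0 else 0 for i in range(40)]

clf = SVC(kernel="rbf").fit(X, y)
pred = clf.predict([kmer_profile(random_window(500, at_rich=True))])
```

A real predictor would scan windows along the assembled contigs and add features beyond composition (mobility genes, tRNA proximity, phylogenetic atypicality), but the window-profile-to-SVM structure is the same.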
John Hof; Curtis Flather; Tony Baltic; Rudy King
2006-01-01
The 2005 Forest and Rangeland Condition Indicator Model is a set of classification trees for forest and rangeland condition indicators at the national scale. This report documents the development of the database and the nonparametric statistical estimation for this analytical structure, with emphasis on three special characteristics of condition indicator production...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aziz, Azizan; Lasternas, Bertrand; Alschuler, Elena
The American Recovery and Reinvestment Act stimulus funding of 2009 for smart grid projects resulted in the tripling of smart meter deployment. In 2012, the Green Button initiative provided utility customers with access to their real-time energy usage. The availability of finely granular data provides an enormous potential for energy data analytics and energy benchmarking. The sheer volume of time-series utility data from a large number of buildings also poses challenges in data collection, quality control, and database management for rigorous and meaningful analyses. In this paper, we describe a building portfolio-level data analytics tool for operational optimization, business investment, and policy assessment using utility data at intervals from 15 minutes to monthly. The analytics tool is developed on top of the U.S. Department of Energy’s Standard Energy Efficiency Data (SEED) platform, an open source software application that manages energy performance data of large groups of buildings. To support the significantly large volume of granular interval data, we integrated a parallel time-series database with the existing relational database. The time-series database improves on the current utility data input, focusing on real-time data collection, storage, analytics, and data quality control. The fully integrated data platform supports APIs for utility app development by third-party software developers. These apps will provide actionable intelligence for building owners and facilities managers. Unlike a commercial system, this platform is an open source platform funded by the U.S. Government, accessible to the public, researchers, and other developers, to support initiatives in reducing building energy consumption.
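The granular-interval workload described above can be illustrated with a small pandas sketch. This is a generic stand-in, not the SEED platform's actual data model or API: synthetic 15-minute meter readings are quality-checked and rolled up to the daily and monthly totals used for benchmarking.

```python
# Minimal sketch (assumed schema, synthetic data): quality-checking and
# rolling up 15-minute interval meter data, the kind of workload that
# motivates pairing a time-series store with a relational database.
import numpy as np
import pandas as pd

idx = pd.date_range("2024-01-01", periods=4 * 24 * 31, freq="15min")  # Jan 2024
kwh = pd.Series(np.random.default_rng(1).uniform(0.5, 2.0, len(idx)), index=idx)

# Quality control: count gaps and implausible spikes before aggregating.
n_gaps = int(kwh.isna().sum())
n_spikes = int((kwh > kwh.mean() + 5 * kwh.std()).sum())

# Roll 15-minute readings up to daily and monthly totals for benchmarking.
daily = kwh.resample("D").sum()
monthly = kwh.resample("MS").sum()
```

At portfolio scale the same resample/aggregate step runs per meter across thousands of buildings, which is where a parallel time-series backend earns its keep.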
Badri, Adel; Nadeau, Sylvie; Gbodossou, André
2012-09-01
Excluding occupational health and safety (OHS) from project management is no longer acceptable. Numerous industrial accidents have exposed the ineffectiveness of conventional risk evaluation methods as well as negligence of risk factors having major impact on the health and safety of workers and nearby residents. Lack of reliable and complete evaluations from the beginning of a project generates bad decisions that could end up threatening the very existence of an organization. This article supports a systematic approach to the evaluation of OHS risks and proposes a new procedure based on the number of risk factors identified and their relative significance. A new concept called risk factor concentration along with weighting of risk factor categories as contributors to undesirable events are used in the analytical hierarchy process multi-criteria comparison model with Expert Choice(©) software. A case study is used to illustrate the various steps of the risk evaluation approach and the quick and simple integration of OHS at an early stage of a project. The approach allows continual reassessment of criteria over the course of the project or when new data are acquired. It was thus possible to differentiate the OHS risks from the risk of drop in quality in the case of the factory expansion project. Copyright © 2011 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Toh, Chee-Seng
2007-01-01
A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akyol, Bora A.; Allwardt, Craig H.; Beech, Zachary W.
VOLTTRON is a flexible, reliable, and scalable platform for distributed control and sensing. VOLTTRON serves in four primary roles:
• A reference platform for researchers to quickly develop control applications for transactive energy.
• A reference platform with flexible data store support for energy analytics applications, whether in academia or in commercial enterprise.
• A platform from which commercial enterprises can develop products without license issues and easily integrate into their product lines.
• An accelerator to drive industry adoption of transactive energy and advanced building energy analytics.
Pacific Northwest National Laboratory, with funding from the U.S. Department of Energy’s Building Technologies Office, developed and maintains VOLTTRON as an open-source community project. VOLTTRON source code includes agent execution software; agents that perform critical services that enable and enhance VOLTTRON functionality; and numerous agents that utilize the platform to perform a specific function (fault detection, demand response, etc.). The platform supports energy, operational, and financial transactions between networked entities (equipment, organizations, buildings, grid, etc.) and enhances the control infrastructure of existing buildings through the use of open-source device communication, control protocols, and integrated analytics.
ERIC Educational Resources Information Center
Herber, Daniel R.; Deshmukh, Anand P.; Mitchell, Marlon E.; Allison, James T.
2016-01-01
This paper presents an effort to revitalize a large introductory engineering course for incoming freshman students that teaches them analytical design through a project-based curriculum. This course was completely transformed from a seminar-based to a project-based course that integrates hands-on experimentation with analytical work. The project…
Building sustainable multi-functional prospective electronic clinical data systems.
Randhawa, Gurvaneet S; Slutsky, Jean R
2012-07-01
A better alignment in the goals of the biomedical research enterprise and the health care delivery system can help fill the large gaps in our knowledge of the impact of clinical interventions on patient outcomes in the real world. There are several initiatives underway to align the research priorities of patients, providers, researchers, and policy makers. These include Agency for Healthcare Research and Quality (AHRQ)-supported projects to build flexible prospective clinical electronic data infrastructure that meet the needs of these diverse users. AHRQ has previously supported the creation of 2 distributed research networks as a new approach to conduct comparative effectiveness research (CER) while protecting a patient's confidential information and the proprietary needs of a clinical organization. It has applied its experience in building these networks in directing the American Recovery and Reinvestment Act funds for CER to support new clinical electronic infrastructure projects that can be used for several purposes including CER, quality improvement, clinical decision support, and disease surveillance. In addition, AHRQ has funded a new Electronic Data Methods forum to advance the methods in clinical informatics, research analytics, and governance by actively engaging investigators from the American Recovery and Reinvestment Act-funded projects and external stakeholders.
Yuasa, Motoyuki; Yamaguchi, Yoshie; Imada, Mihoko
2013-09-22
The Japan International Cooperation Agency (JICA) has focused its attention on appraising health development assistance projects and redirecting efforts towards health system strengthening. This study aimed to describe the type of project and targets of interest, and assess the contribution of JICA health-related projects to strengthening health systems worldwide. We collected a web-based Project Design Matrix (PDM) of 105 JICA projects implemented between January 2005 and December 2009. We developed an analytical matrix based on the World Health Organization (WHO) health system framework to examine the PDM data and thereby assess the projects' contributions to health system strengthening. The majority of JICA projects had prioritized workforce development, and improvements in governance and service delivery. Conversely, there was little assistance for finance or medical product development. The vast majority (87.6%) of JICA projects addressed public health issues, for example programs to improve maternal and child health, and the prevention and treatment of infectious diseases such as AIDS, tuberculosis and malaria. Nearly 90% of JICA technical healthcare assistance directly focused on improving governance as the most critical means of accomplishing its goals. Our study confirmed that JICA projects met the goals of bilateral cooperation by developing workforce capacity and governance. Nevertheless, our findings suggest that JICA assistance could be used to support financial aspects of healthcare systems, which is an area of increasing concern. We also showed that the analytical matrix methodology is an effective means of examining the component of health system strengthening to which the activity and output of a project contributes. This may help policy makers and practitioners focus future projects on priority areas.
NASA Astrophysics Data System (ADS)
Wirth, K. R.; Garver, J. I.; Greer, L.; Pollock, M.; Varga, R. J.; Davidson, C. M.; Frey, H. M.; Hubbard, D. K.; Peck, W. H.; Wobus, R. A.
2015-12-01
The Keck Geology Consortium, with support from the National Science Foundation (REU Program) and ExxonMobil, is a collaborative effort by 18 colleges to improve geoscience education through high-quality research experiences. Since its inception in 1987 more than 1350 undergraduate students and 145 faculty have been involved in 189 yearlong research projects. This non-traditional REU model offers exceptional opportunities for students to address research questions at a deep level, to learn and utilize sophisticated analytical methods, and to engage in authentic collaborative research that culminates in an undergraduate research symposium and published abstracts volume. The large numbers of student and faculty participants in Keck projects also affords a unique opportunity to study the impacts of program design on undergraduate research experiences in the geosciences. Students who participate in Keck projects generally report significant gains in personal and professional dimensions, as well as in clarification of educational and career goals. Survey data from student participants, project directors, and campus advisors identify mentoring as one of the most critical and challenging elements of successful undergraduate research experiences. Additional challenges arise from the distributed nature of Keck projects (i.e., participants, project directors, advisors, and other collaborators are at different institutions) and across the span of yearlong projects. In an endeavor to improve student learning about the nature and process of science, and to make mentoring practices more intentional, the Consortium has developed workshops and materials to support both project directors and campus research advisors (e.g., best practices for mentoring, teaching ethical professional conduct, benchmarks for progress, activities to support students during research process). The Consortium continues to evolve its practices to better support students from underrepresented groups.
Decision problems in management of construction projects
NASA Astrophysics Data System (ADS)
Szafranko, E.
2017-10-01
In a construction business, one must often make decisions during all stages of the building process, from planning a new construction project through its execution to the use of the finished structure. As a rule, the decision-making process is complicated by conditions specific to civil engineering. With such diverse decision situations, it is recommended to apply various decision-making support methods. Both literature and hands-on experience suggest several methods based on analytical and computational procedures, some less and some more complex. This article presents methods which can be helpful in supporting decision-making processes in the management of civil engineering projects: multi-criteria methods such as MCE and AHP, and indicator methods. Because the methods have different advantages and disadvantages, and decision situations have their own specific nature, a brief summary of the methods alongside recommendations regarding their practical application is given at the end of the paper. The main aim of this article is to review decision support methods and analyze their possible use in the construction industry.
Men'shikov, V V
2012-12-01
The article deals with the factors affecting the reliability of clinical laboratory information. The differences in quality among laboratory analysis tools produced by various manufacturers are discussed; these differences cause discrepancies between results of laboratory analyses of the same analyte. The role of the reference system in supporting the comparability of laboratory analysis results is demonstrated. A draft national standard is presented to regulate the requirements for standards and calibrators used in the analysis of qualitative and non-metrical characteristics of components of biomaterials.
NASA Astrophysics Data System (ADS)
Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.
2017-10-01
Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human-computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states, and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and describe how others may freely access the tool.
NASA Technical Reports Server (NTRS)
Gates, R. M.; Williams, J. E.
1974-01-01
Results are given of analytical studies performed in support of the design, implementation, checkout and use of NASA's dynamic docking test system (DDTS). Included are analyses of simulator components, a list of detailed operational test procedures, a summary of simulator performance, and an analysis and comparison of docking dynamics and loads obtained by test and analysis.
Analytic innovations for air quality modeling
The presentation provides an overview of ongoing research activities at the U.S. EPA, focusing on improving long-term emission projections and the development of decision support systems for coordinated environmental, climate and energy planning. This presentation will be given on October 10th, 2016, at the Johns Hopkins Dept. of Environmental Health and Engineering as part of the Environmental Science and Management Seminar Series.
2015-12-01
Army Training and Doctrine Command (TRADOC) Analysis Center (TRAC) to the Philippines for Operation ENDURING FREEDOM – Philippines (OEF-P). PROJECT... management, doctrine and force development, training management, system testing, system acquisition, decision analysis, and resource management, as... influenced procurement decisions and reshaped Army doctrine. Additionally, CAA itself has benefited in numerous ways. Combat experience provides analysts
Wetherbee, Gregory A.; Martin, RoseAnn
2017-02-06
The U.S. Geological Survey Branch of Quality Systems operates the Precipitation Chemistry Quality Assurance Project (PCQA) for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN) and National Atmospheric Deposition Program/Mercury Deposition Network (NADP/MDN). Since 1978, various programs have been implemented by the PCQA to estimate data variability and bias contributed by changing protocols, equipment, and sample submission schemes within NADP networks. These programs independently measure the field and laboratory components which contribute to the overall variability of NADP wet-deposition chemistry and precipitation depth measurements. The PCQA evaluates the quality of analyte-specific chemical analyses from the two currently (2016) contracted NADP laboratories, Central Analytical Laboratory and Mercury Analytical Laboratory, by comparing laboratory performance among participating national and international laboratories. Sample contamination and stability are evaluated for NTN and MDN by using externally field-processed blank samples provided by the Branch of Quality Systems. A colocated sampler program evaluates the overall variability of NTN measurements and bias between dissimilar precipitation gages and sample collectors. This report documents historical PCQA operations and general procedures for each of the external quality-assurance programs from 2007 to 2016.
2013-01-01
Background The Japan International Cooperation Agency (JICA) has focused its attention on appraising health development assistance projects and redirecting efforts towards health system strengthening. This study aimed to describe the type of project and targets of interest, and assess the contribution of JICA health-related projects to strengthening health systems worldwide. Methods We collected a web-based Project Design Matrix (PDM) of 105 JICA projects implemented between January 2005 and December 2009. We developed an analytical matrix based on the World Health Organization (WHO) health system framework to examine the PDM data and thereby assess the projects’ contributions to health system strengthening. Results The majority of JICA projects had prioritized workforce development, and improvements in governance and service delivery. Conversely, there was little assistance for finance or medical product development. The vast majority (87.6%) of JICA projects addressed public health issues, for example programs to improve maternal and child health, and the prevention and treatment of infectious diseases such as AIDS, tuberculosis and malaria. Nearly 90% of JICA technical healthcare assistance directly focused on improving governance as the most critical means of accomplishing its goals. Conclusions Our study confirmed that JICA projects met the goals of bilateral cooperation by developing workforce capacity and governance. Nevertheless, our findings suggest that JICA assistance could be used to support financial aspects of healthcare systems, which is an area of increasing concern. We also showed that the analytical matrix methodology is an effective means of examining the component of health system strengthening to which the activity and output of a project contributes. This may help policy makers and practitioners focus future projects on priority areas. PMID:24053583
Rankin, Kristin M; Kroelinger, Charlan D; Rosenberg, Deborah; Barfield, Wanda D
2012-12-01
The purpose of this article is to summarize the methodology, partnerships, and products developed as a result of a distance-based workforce development initiative to improve analytic capacity among maternal and child health (MCH) epidemiologists in state health agencies. This effort was initiated by the Centers for Disease Control's MCH Epidemiology Program and faculty at the University of Illinois at Chicago to encourage and support the use of surveillance data by MCH epidemiologists and program staff in state agencies. Beginning in 2005, distance-based training in advanced analytic skills was provided to MCH epidemiologists. To support participants, this model of workforce development included: lectures about the practical application of innovative epidemiologic methods, development of multidisciplinary teams within and across agencies, and systematic, tailored technical assistance. The goal of this initiative evolved to emphasize the direct application of advanced methods to the development of state data products using complex sample surveys, resulting in the articles published in this supplement to MCHJ. Innovative methods were applied by participating MCH epidemiologists, including regional analyses across geographies and datasets, multilevel analyses of state policies, and new indicator development. Support was provided for developing cross-state and regional partnerships and for developing and publishing the results of analytic projects. This collaboration was successful in building analytic capacity, facilitating partnerships and promoting surveillance data use to address state MCH priorities, and may have broader application beyond MCH epidemiology. In an era of decreasing resources, such partnership efforts between state and federal agencies and academia are essential for promoting effective data use.
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.
2016-12-01
We are extending climate analytics-as-a-service, including: (1) A high-performance Virtual Real-Time Analytics Testbed supporting six major reanalysis data sets using advanced technologies like the Cloudera Impala-based SQL and Hadoop-based MapReduce analytics over native NetCDF files. (2) A Reanalysis Ensemble Service (RES) that offers a basic set of commonly used operations over the reanalysis collections that are accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib. (3) An Open Geospatial Consortium (OGC) WPS-compliant Web service interface to CDSlib to accommodate ESGF's Web service endpoints. This presentation will report on the overall progress of this effort, with special attention to recent enhancements that have been made to the Reanalysis Ensemble Service, including the following: - A CDSlib Python library that supports full temporal, spatial, and grid-based resolution services - A new reanalysis collections reference model to enable operator design and implementation - An enhanced library of sample queries to demonstrate and develop use case scenarios - Extended operators that enable single- and multiple-reanalysis area average, vertical average, re-gridding, and trend, climatology, and anomaly computations - Full support for the MERRA-2 reanalysis and the initial integration of two additional reanalyses - A prototype Jupyter notebook-based distribution mechanism that combines CDSlib documentation with interactive use case scenarios and personalized project management - Prototyped uncertainty quantification services that combine ensemble products with comparative observational products - Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic subsetting and arithmetic operations over the data and extractions of trends, climatologies, and anomalies - The ability to compute and visualize multiple reanalysis intercomparisons
Big Data Analytics Methodology in the Financial Industry
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony
2017-01-01
Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…
Multiple Theoretical Lenses as an Analytical Strategy in Researching Group Discussions
ERIC Educational Resources Information Center
Berge, Maria; Ingerman, Åke
2017-01-01
Background: In science education today, there is an emerging focus on what is happening in situ, making use of an array of analytical traditions. Common practice is to use one specific analytical framing within a research project, but there are projects that make use of multiple analytical framings to further the understanding of the same data,…
Preliminary Results on Uncertainty Quantification for Pattern Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stracuzzi, David John; Brost, Randolph; Chen, Maximillian Gene
2015-09-01
This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.
Big data analytics workflow management for eScience
NASA Astrophysics Data System (ADS)
Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni
2015-04-01
In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities, and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks; and (iv) do not provide newer or more scalable storage models to better support data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats; and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims to address most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, and (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions, and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes.
Defining processing chains and workflows with tens or hundreds of data analytics operators is the real challenge in many practical scientific use cases. This talk will specifically address the main needs, requirements, and challenges of data analytics workflow management applied to large scientific datasets. Three real use cases concerning analytics workflows for sea situational awareness, fire danger prevention, and climate change and biodiversity will be discussed in detail.
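The sub-setting and aggregation operator classes listed above can be sketched on a toy data cube. This is an illustration of the concepts only, not Ophidia's implementation; the nested-list cube, its values, and the function names are all invented for the example.

```python
# Toy 3-D data cube indexed as [time][lat][lon] (invented values).
cube = [
    [[1.0, 2.0], [3.0, 4.0]],    # t = 0
    [[5.0, 6.0], [7.0, 8.0]],    # t = 1
    [[9.0, 10.0], [11.0, 12.0]], # t = 2
]

def subset_time(cube, t0, t1):
    """Sub-setting (dicing): keep a contiguous slab along the time axis."""
    return cube[t0:t1]

def aggregate_time_avg(cube):
    """Aggregation: collapse the time axis by averaging each grid cell."""
    nt = len(cube)
    nlat, nlon = len(cube[0]), len(cube[0][0])
    return [[sum(cube[t][i][j] for t in range(nt)) / nt
             for j in range(nlon)] for i in range(nlat)]

# Chain two operators, as a workflow engine would: subset, then aggregate.
print(aggregate_time_avg(subset_time(cube, 0, 2)))  # [[3.0, 4.0], [5.0, 6.0]]
```

Chaining many such operators over terabyte-scale cubes, in parallel and with monitoring, is exactly the workflow-management problem the talk addresses.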
NASA Astrophysics Data System (ADS)
Mattmann, C. A.
2013-12-01
A wave of open source big data analytic infrastructure is currently shaping government, the private sector, and academia. Projects are consuming, adapting, and contributing back to various ecosystems of software: e.g., the Apache Hadoop project and its ecosystem of related efforts, including Hive, HBase, Pig, Oozie, Ambari, Knox, Tez, and Yarn, to name a few; the Berkeley AMPLab stack, which includes Spark, Shark, Mesos, Tachyon, BlinkDB, MLBase, and other emerging efforts; MapR and its related stack of technologies; and offerings from commercial companies building products around these tools, e.g., Hortonworks Data Platform (HDP) and Cloudera's CDH. Though the technologies all offer different capabilities, including low-latency/in-memory support versus record-oriented file I/O, high availability, and support for the MapReduce programming paradigm or other dataflow/workflow constructs, there is a common thread that binds these products: they are all released under an open source license (e.g., Apache 2, MIT, BSD, GPL/LGPL); all thrive in various ecosystems, such as Apache or the Berkeley AMPLab; all are developed collaboratively; and all provide plug-in architecture models and methodologies for allowing others to contribute and participate via various community models. This talk will cover the open source and governance aspects of the aforementioned Big Data ecosystems and point out the differences, subtleties, and implications of those differences. The discussion will be by example, using several national deployments and Big Data initiatives stemming from the Administration, including DARPA's XDATA program, NASA's CMAC program, and NSF's EarthCube and geosciences Big Data projects. Lessons learned from these efforts in terms of the open source aspects of these technologies will help guide the AGU community in their use, deployment, and understanding.
McCulloh, Russell J; Fouquet, Sarah D; Herigon, Joshua; Biondi, Eric A; Kennedy, Brandan; Kerns, Ellen; DePorre, Adrienne; Markham, Jessica L; Chan, Y Raymond; Nelson, Krista; Newland, Jason G
2018-06-07
Implementing evidence-based practices requires a multi-faceted approach. Electronic clinical decision support (ECDS) tools may encourage evidence-based practice adoption. However, data regarding the role of mobile ECDS tools in pediatrics are scant. Our objective is to describe the development, distribution, and usage patterns of a smartphone-based ECDS tool within a national practice standardization project. We developed a smartphone-based ECDS tool for use in the American Academy of Pediatrics, Value in Inpatient Pediatrics Network project entitled "Reducing Excessive Variation in the Infant Sepsis Evaluation (REVISE)." The mobile application (app), PedsGuide, was developed using evidence-based recommendations created by an interdisciplinary panel. App workflow and content were aligned with clinical benchmarks; app interface was adjusted after usability heuristic review. Usage patterns were measured using Google Analytics. Overall, 3805 users across the United States downloaded PedsGuide from December 1, 2016, to July 31, 2017, leading to 14,256 use sessions (average 3.75 sessions per user). Users engaged in 60,442 screen views, including 37,424 (61.8%) screen views that displayed content related to the REVISE clinical practice benchmarks, including hospital admission appropriateness (26.8%), length of hospitalization (14.6%), and diagnostic testing recommendations (17.0%). Median user touch depth was 5 [IQR 5]. We observed rapid dissemination and in-depth engagement with PedsGuide, demonstrating feasibility for using smartphone-based ECDS tools within national practice improvement projects. ECDS tools may prove valuable in future national practice standardization initiatives. Work should next focus on developing robust analytics to determine ECDS tools' impact on medical decision making, clinical practice, and health outcomes.
Survivability as a Tool for Evaluating Open Source Software
2015-06-01
the thesis limited the program development, so it is only able to process project issues (bugs or feature requests), which is an important metric for... Ideally, these insights may provide an analytic framework to generate guidance for decision makers that may support the inclusion of OSS to more... refine their efforts to build quality software and to strengthen their software development communities. 1.4 Research Questions This thesis addresses
Rapid Building Assessment Project
2014-05-01
ongoing management of commercial energy efficiency. No other company offers all of these proven services on a seamless, integrated Software-as-a-Service ...FirstFuel has added a suite of additional Software-as-a-Service analytics capabilities to support the entire energy efficiency lifecycle, including... the client side. In this document, we refer to the service-side software as "BUILDER" and the client software as "BuilderRED," following the Army
DEMONSTRATION OF THE ANALYTIC ELEMENT METHOD FOR WELLHEAD PROTECTION - PROJECT SUMMARY
A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a ground-water fl...
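The superposition of closed-form analytic solutions that WhAEM relies on can be sketched briefly. The example below is a simplified, hedged illustration of the idea only, not the WhAEM implementation: the uniform-flow rate, well locations, pumping rates, and function name are all invented.

```python
import math

UNIFORM_FLOW = 1e-4      # regional discharge per unit width (invented value)
WELLS = [                # (well location as complex x + iy, pumping rate Q)
    (complex(100.0, 0.0), 5e-3),
    (complex(-50.0, 80.0), 2e-3),
]

def discharge_potential(z):
    """Superpose a uniform-flow element and point-well elements at point z.

    Each term is itself an exact solution of the governing flow equation,
    so their sum is one too; that is the analytic element method's core idea.
    """
    phi = -UNIFORM_FLOW * z.real                              # uniform flow toward +x
    for zw, q in WELLS:
        phi += (q / (2 * math.pi)) * math.log(abs(z - zw))    # point-well term
    return phi

# The potential drops toward a pumping well; tracing gradients of this
# field is the basis for delineating time-of-travel capture zones.
print(discharge_potential(complex(100.0, 1.0)) <
      discharge_potential(complex(100.0, 200.0)))  # True
```

A real model adds more element types (line sinks for streams, inhomogeneities, and so on), but each remains a closed-form building block combined by the same superposition.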
Toward a Systematic Approach for Selection of NASA Technology Portfolios
NASA Technical Reports Server (NTRS)
Weisbin, Charles R.; Rodriguez, Guillermo; Elfes, Alberto; Smith, Jeffrey H.
2004-01-01
There is an important need for a consistent analytical foundation to support the selection and monitoring of R&D tasks that enable new system concepts for future NASA missions. This capability should be applicable at various degrees of abstraction, depending upon whether one is interested in formulation, development, or operations. It should also be applicable to a single project, a program comprised of a group of projects, an enterprise typically including multiple programs, and the overall agency itself. Emphasis here is on technology selection and new initiatives, but the same approach can be generalized to other applications, dealing, for example, with new system architectures, risk reduction, and task allocation among humans and machines. The purpose of this paper is to describe one such approach, which is in its early stages of implementation within NASA programs, and to discuss several illustrative examples.
Technical requirements for bioassay support services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hickman, D.P.; Anderson, A.L.
1991-05-01
This document provides the technical basis for the Chem-Nuclear Geotech (Geotech) bioassay program. It includes information and details that can be used as a model in providing technical contents and requirements for bioassay laboratory support, either internally or in solicitations by Geotech to obtain subcontractor laboratory support. It provides a detailed summary and description of the types of bioassay samples to be expected in support of Geotech remedial projects for the US Department of Energy and the bioassay services and analytical requirements necessary to process such samples, including required limits of sensitivity. General responsibilities of the bioassay laboratory are also addressed, including quality assurance. Peripheral information of importance to the program is included in the appendices of this document. 7 tabs.
NASA Astrophysics Data System (ADS)
Vatcha, Rashna; Lee, Seok-Won; Murty, Ajeet; Tolone, William; Wang, Xiaoyu; Dou, Wenwen; Chang, Remco; Ribarsky, William; Liu, Wanqiu; Chen, Shen-en; Hauser, Edd
2009-05-01
Infrastructure management (and its associated processes) is complex to understand and perform, which makes efficient, effective, and informed decisions hard to reach. It is a multi-faceted operation that requires robust data fusion, visualization, and decision making. In order to protect and build sustainable critical assets, we present our on-going multi-disciplinary large-scale project that establishes the Integrated Remote Sensing and Visualization (IRSV) system with a focus on supporting bridge structure inspection and management. This project involves specific expertise from civil engineers, computer scientists, geographers, and real-world practitioners from industry, local and federal government agencies. IRSV is being designed to accommodate the essential needs from the following aspects: 1) Better understanding and enforcement of complex inspection processes that can bridge the gap between evidence gathering and decision making through the implementation of an ontological knowledge engineering system; 2) Aggregation, representation and fusion of complex multi-layered heterogeneous data (i.e. infrared imaging, aerial photos and ground-mounted LIDAR etc.) with domain application knowledge to support a machine-understandable recommendation system; 3) Robust visualization techniques with large-scale analytical and interactive visualizations that support users' decision making; and 4) Integration of these needs through the flexible Service-oriented Architecture (SOA) framework to compose and provide services on-demand. IRSV is expected to serve as a management and data visualization tool for construction deliverable assurance and infrastructure monitoring both periodically (annually, monthly, even daily if needed) as well as after extreme events.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engels, J.
The Environmental Restoration (ER) Program was established for the investigation and remediation of inactive US Department of Energy (DOE) sites and facilities that have been declared surplus in terms of their previous uses. The purpose of this document is to specify ER requirements for quality control (QC) of analytical data. Activities throughout all phases of the investigation may affect the quality of the final data product and thus are subject to control specifications. Laboratory control is emphasized in this document, and field concerns will be addressed in a companion document. Energy Systems, in its role of technical coordinator and at the request of DOE-OR, extends the application of these requirements to all participants in ER activities. Because every instance and concern may not be addressed in this document, participants are encouraged to discuss any questions with the ER Quality Assurance (QA) Office, the Analytical Environmental Support Group (AESG), or the Analytical Project Office (APO).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ohi, J.
Supporting analysis and assessments can provide a sound analytic foundation and focus for program planning, evaluation, and coordination, particularly if issues of hydrogen production, distribution, storage, safety, and infrastructure can be analyzed in a comprehensive and systematic manner. The overall purpose of this activity is to coordinate all key analytic tasks (technology and market status, opportunities, and trends; environmental costs and benefits; and regulatory constraints and opportunities) within a long-term and systematic analytic foundation for program planning and evaluation. Within this context, the purpose of the project is to help develop and evaluate programmatic pathway options that incorporate near- and mid-term strategies to achieve the long-term goals of the Hydrogen Program. In FY 95, NREL will develop a comprehensive effort with industry, state and local agencies, and other federal agencies to identify and evaluate programmatic pathway options to achieve the long-term goals of the Program. Activity to date is reported.
Analytic hierarchy process helps select site for limestone quarry expansion in Barbados.
Dey, Prasanta Kumar; Ramcharan, Eugene K
2008-09-01
Site selection is a key activity for quarry expansion to support cement production, and is governed by factors such as resource availability, logistics, costs, and socio-economic-environmental factors. Adequate consideration of all the factors facilitates both industrial productivity and sustainable economic growth. This study illustrates the site selection process that was undertaken for the expansion of limestone quarry operations to support cement production in Barbados. First, alternate sites with adequate resources to support a 25-year development horizon were identified. Second, technical and socio-economic-environmental factors were identified. Third, a database was developed for each site with respect to each factor. Fourth, a hierarchical model in an analytic hierarchy process (AHP) framework was developed. Fifth, the relative ranking of the alternate sites was derived through pairwise comparison at all levels and subsequent synthesis of the results across the hierarchy using computer software (Expert Choice). The study reveals that an integrated framework using the AHP can help select a site for the quarry expansion project in Barbados.
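The pairwise-comparison step of the AHP can be sketched as follows. This is a hedged illustration using the row geometric-mean approximation of the priority vector; the three candidate sites and all judgment values are invented, and the study itself used the Expert Choice software rather than code like this.

```python
import math

def ahp_priorities(matrix):
    """Approximate the AHP priority vector via row geometric means."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Pairwise judgments on the Saaty 1-9 scale for a single criterion
# (say, resource availability); all values are hypothetical.
comparisons = [
    [1.0, 3.0, 5.0],    # Site A compared with A, B, C
    [1/3, 1.0, 2.0],    # Site B
    [1/5, 1/2, 1.0],    # Site C
]

weights = ahp_priorities(comparisons)
print([round(w, 3) for w in weights])  # [0.648, 0.23, 0.122]
```

In the full method, one such vector is computed for each factor and for the factors themselves, and the site rankings are synthesized by weighting the site vectors across the hierarchy.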
33 CFR 385.33 - Revisions to models and analytical tools.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...
New EVSE Analytical Tools/Models: Electric Vehicle Infrastructure Projection Tool (EVI-Pro)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Eric W; Rames, Clement L; Muratori, Matteo
This presentation addresses the fundamental question of how much charging infrastructure is needed in the United States to support PEVs. It complements ongoing EVSE initiatives by providing a comprehensive analysis of national PEV charging infrastructure requirements. The result is a quantitative estimate for a U.S. network of non-residential (public and workplace) EVSE that would be needed to support broader PEV adoption. The analysis provides guidance to public and private stakeholders who are seeking to provide nationwide charging coverage, improve the EVSE business case by maximizing station utilization, and promote effective use of private/public infrastructure investments.
NASA Astrophysics Data System (ADS)
Irving, D. H.; Rasheed, M.; Hillman, C.; O'Doherty, N.
2012-12-01
Oilfield management is moving to a more operational footing, with near-realtime seismic and sensor monitoring governing drilling, fluid injection and hydrocarbon extraction workflows within safety, productivity and profitability constraints. To date, the geoscientific analytical architectures employed are configured for large data volumes, computational power, or low analytical latency, and compromises in system design must be made to achieve all three at once. These challenges are encapsulated by the phrase 'Big Data', which has been employed for over a decade in the IT industry to describe the challenges presented by data sets that are too large, volatile and diverse for existing computational architectures and paradigms. We present a data-centric architecture developed to support a geoscientific and geotechnical workflow whereby: ● scientific insight is continuously applied to fresh data ● insights and derived information are incorporated into engineering and operational decisions ● data governance and provenance are routine within a broader data management framework Strategic decision support systems in large infrastructure projects such as oilfields are typically relational data environments; data modelling is pervasive across analytical functions. However, subsurface data and models are typically non-relational (i.e. file-based), taking the form of large volumes of seismic imaging data or rapid streams of sensor feeds, and are analysed and interpreted using niche applications. The key architectural challenge is to move data and insight from a non-relational to a relational, or structured, data environment for faster and more integrated analytics. We describe how a blend of MapReduce and relational database technologies can be applied in geoscientific decision support, and the strengths and weaknesses of each in such an analytical ecosystem.
In addition we discuss hybrid technologies that use aspects of both, and translational technologies for moving data and analytics across these platforms. Moving to a data-centric architecture requires data management methodologies to be overhauled by default, and we show how end-to-end data provenance and dependency management are implicit in such an environment and how they benefit system administration as well as the user community. Whilst the architectural experiences are drawn from the oil industry, we believe that they are more broadly applicable in academic and government settings where large volumes of data are added to incrementally and require revisiting with low analytical latency; we suggest application to earthquake monitoring and remote sensing networks.
2005-04-01
RTO-MP-SAS-055. İpekkan, Z.; Özkil, A. (2005) Analytical Support Capabilities of Turkish General Staff Scientific Decision Support Centre (SDSC) ...the end failed to achieve anything commensurate with the effort. The analytical support capabilities of Turkish Scientific Decision Support Center to...
Controlled ecological life support system: Transportation analysis
NASA Technical Reports Server (NTRS)
Gustan, E.; Vinopal, T.
1982-01-01
This report discusses a study utilizing a systems analysis approach to determine which NASA missions would benefit from controlled ecological life support system (CELSS) technology. The study focuses on manned missions selected from NASA planning forecasts covering the next half century. Comparison of various life support scenarios for the selected missions and characteristics of projected transportation systems provided data for cost evaluations. This approach identified missions that derived benefits from a CELSS, showed the magnitude of the potential cost savings, and indicated which system or combination of systems would apply. This report outlines the analytical approach used in the evaluation, describes the missions and systems considered, and sets forth the benefits derived from CELSS when applicable.
Using Analytics to Support Petabyte-Scale Science on the NASA Earth Exchange (NEX)
NASA Astrophysics Data System (ADS)
Votava, P.; Michaelis, A.; Ganguly, S.; Nemani, R. R.
2014-12-01
NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data where a focused community can come together to address large-scale challenges in Earth sciences. Analytics within NEX occurs at several levels - data, workflows, science and knowledge. At the data level, we are focusing on collecting and analyzing any information that is relevant to efficient acquisition, processing and management of data at the smallest granularity, such as files or collections. This includes processing and analyzing all local and many external metadata that are relevant to data quality, size, provenance, usage and other attributes. This then helps us better understand usage patterns and improve efficiency of data handling within NEX. When large-scale workflows are executed on NEX, we capture information that is relevant to processing and that can be analyzed in order to improve efficiencies in job scheduling, resource optimization, or data partitioning that would improve processing throughput. At this point we also collect data provenance as well as basic statistics of intermediate and final products created during the workflow execution. These statistics and metrics form basic process and data QA that, when combined with analytics algorithms, helps us identify issues early in the production process. We have already seen impact in some petabyte-scale projects, such as global Landsat processing, where we were able to reduce processing times from days to hours and enhance process monitoring and QA. While the focus so far has been mostly on support of NEX operations, we are also building a web-based infrastructure that enables users to perform direct analytics on science data - such as climate predictions or satellite data. 
Finally, as one of the main goals of NEX is knowledge acquisition and sharing, we began gathering and organizing information that associates users and projects with data, publications, locations and other attributes that can then be analyzed as a part of the NEX knowledge graph and used to greatly improve advanced search capabilities. Overall, we see data analytics at all levels as an important part of NEX as we are continuously seeking improvements in data management, workflow processing, use of resources, usability and science acceleration.
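The data-level analytics described above (collecting file-granularity usage metadata to understand access patterns) can be illustrated with a minimal sketch. The record layout, collection names, and file names below are hypothetical, not NEX's actual schema:

```python
from collections import Counter

# Hypothetical access-log records: (timestamp, user, collection, file, bytes_read).
log = [
    ("2014-07-01T10:02", "alice", "MODIS",   "MOD13Q1.A2014.hdf", 2_400_000),
    ("2014-07-01T10:05", "bob",   "Landsat", "LC8_2014_031.tif",  900_000_000),
    ("2014-07-02T09:12", "alice", "Landsat", "LC8_2014_032.tif",  850_000_000),
    ("2014-07-02T11:40", "carol", "Landsat", "LC8_2014_031.tif",  900_000_000),
]

def usage_by_collection(records):
    """Aggregate access counts and bytes served per data collection."""
    counts, volume = Counter(), Counter()
    for _, _, coll, _, nbytes in records:
        counts[coll] += 1
        volume[coll] += nbytes
    return counts, volume

counts, volume = usage_by_collection(log)
# Rank collections by demand to guide staging and caching decisions.
hottest = counts.most_common(1)[0][0]
```

Aggregates like these are the raw material for the usage-pattern analysis the abstract describes; a production system would of course persist and window them rather than hold them in memory.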
Monitoring and Evaluation: Statistical Support for Life-cycle Studies, Annual Report 2003.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skalski, John
2003-11-01
The ongoing mission of this project is the development of statistical tools for analyzing fisheries tagging data in the most precise and appropriate manner possible. This mission also includes providing statistical guidance on the best ways to design large-scale tagging studies. This mission continues because the technologies for conducting fish tagging studies continuously evolve. In just the last decade, fisheries biologists have seen the evolution from freeze-brands and coded wire tags (CWT) to passive integrated transponder (PIT) tags, balloon-tags, radiotelemetry, and now, acoustic-tags. With each advance, the technology holds the promise of more detailed and precise information. However, the technology for analyzing and interpreting the data also becomes more complex as the tagging techniques become more sophisticated. The goal of the project is to develop the analytical tools in parallel with the technical advances in tagging studies, so that maximum information can be extracted on a timely basis. Associated with this mission is the transfer of these analytical capabilities to the field investigators to assure consistency and the highest levels of design and analysis throughout the fisheries community. Consequently, this project provides detailed technical assistance on the design and analysis of tagging studies to groups requesting assistance throughout the fisheries community. Ideally, each project and each investigator would invest in the statistical support needed for the successful completion of their study. However, this is an ideal that is rarely if ever attained. Furthermore, there is only a small pool of highly trained scientists in this specialized area of tag analysis here in the Northwest. Project 198910700 provides the financial support to sustain this local expertise on the statistical theory of tag analysis at the University of Washington and make it available to the fisheries community.
Piecemeal and fragmented support from various agencies and organizations would be incapable of maintaining a center of expertise. The mission of the project is to help assure tagging studies are designed and analyzed from the onset to extract the best available information using state-of-the-art statistical methods. The overarching goal of the project is to assure statistically sound survival studies so that fish managers can focus on the management implications of their findings and not be distracted by concerns about whether the studies are statistically reliable. Specific goals and objectives of the study include the following: (1) Provide consistent application of statistical methodologies for survival estimation across all salmon life cycle stages to assure comparable performance measures and assessment of results through time, to maximize learning and adaptive management opportunities, and to improve and maintain the ability to responsibly evaluate the success of implemented Columbia River FWP salmonid mitigation programs and identify future mitigation options. (2) Improve analytical capabilities to conduct research on survival processes of wild and hatchery chinook and steelhead during smolt outmigration, to improve monitoring and evaluation capabilities and assist in-season river management to optimize operational and fish passage strategies to maximize survival. (3) Extend statistical support to estimate ocean survival and in-river survival of returning adults. Provide statistical guidance in implementing a river-wide adult PIT-tag detection capability.
(4) Develop statistical methods for survival estimation for all potential users and make this information available through peer-reviewed publications, statistical software, and technology transfers to organizations such as NOAA Fisheries, the Fish Passage Center, US Fish and Wildlife Service, US Geological Survey (USGS), US Army Corps of Engineers (USACE), Public Utility Districts (PUDs), the Independent Scientific Advisory Board (ISAB), and other members of the Northwest fisheries community. (5) Provide and maintain statistical software for tag analysis and user support. (6) Provide improvements in statistical theory and software as requested by user groups. These improvements include extending software capabilities to address new research issues, adapting tagging techniques to new study designs, and extending the analysis capabilities to new technologies such as radio-tags and acoustic-tags.
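The survival estimation underlying PIT-tag studies like these can be sketched with the classic single-release estimator: detection probability at an upstream site is estimated from fish known to be alive there (those seen again downstream), and survival follows by expansion. The release and detection counts below are hypothetical:

```python
def survival_estimate(released, det_site1, det_site2, det_both):
    """Single-release estimator for survival to detection site 1.

    released  - tagged fish released upstream
    det_site1 - fish detected at site 1
    det_site2 - fish detected at site 2 (downstream of site 1)
    det_both  - fish detected at both sites

    Detection probability at site 1 is estimated from fish known alive
    there (those later seen at site 2): p1 = det_both / det_site2.
    Survival to site 1 is then phi1 = det_site1 / (released * p1).
    """
    p1 = det_both / det_site2
    phi1 = det_site1 / (released * p1)
    return p1, phi1

# Hypothetical study: 1000 smolts released, 450 detected at site 1,
# 300 at site 2, 180 at both.
p1, phi1 = survival_estimate(1000, 450, 300, 180)
```

Full Cormack-Jolly-Seber models generalize this to many detection sites and provide variance estimates, which is where software like that maintained by this project comes in.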
Converse, Sarah J.; Shelley, Kevin J.; Morey, Steve; Chan, Jeffrey; LaTier, Andrea; Scafidi, Carolyn; Crouse, Deborah T.; Runge, Michael C.
2011-01-01
The resources available to support conservation work, whether time or money, are limited. Decision makers need methods to help them identify the optimal allocation of limited resources to meet conservation goals, and decision analysis is uniquely suited to assist with the development of such methods. In recent years, a number of case studies have been described that examine optimal conservation decisions under fiscal constraints; here we develop methods to look at other types of constraints, including limited staff and regulatory deadlines. In the US, Section Seven consultation, an important component of protection under the federal Endangered Species Act, requires that federal agencies overseeing projects consult with federal biologists to avoid jeopardizing species. A benefit of consultation is negotiation of project modifications that lessen impacts on species, so staff time allocated to consultation supports conservation. However, some offices have experienced declining staff, potentially reducing the efficacy of consultation. This is true of the US Fish and Wildlife Service's Washington Fish and Wildlife Office (WFWO) and its consultation work on federally-threatened bull trout (Salvelinus confluentus). To improve effectiveness, WFWO managers needed a tool to help allocate this work to maximize conservation benefits. We used a decision-analytic approach to score projects based on the value of staff time investment, and then identified an optimal decision rule for how scored projects would be allocated across bins, where projects in different bins received different time investments. We found that, given current staff, the optimal decision rule placed 80% of informal consultations (those where expected effects are beneficial, insignificant, or discountable) in a short bin where they would be completed without negotiating changes. The remaining 20% would be placed in a long bin, warranting an investment of seven days, including time for negotiation. 
For formal consultations (those where expected effects are significant), 82% of projects would be placed in a long bin, with an average time investment of 15 days. The WFWO is using this decision-support tool to help allocate staff time. Because workload allocation decisions are iterative, we describe a monitoring plan designed to increase the tool's efficacy over time. This work has general application beyond Section Seven consultation, in that it provides a framework for efficient investment of staff time in conservation when such time is limited and when regulatory deadlines prevent an unconstrained approach. © 2010.
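The staffing implications of a binning rule like the one described can be sketched as an expected-workload calculation. The long-bin fractions and durations below follow the abstract (20% of informal consultations at 7 days; 82% of formal consultations at 15 days); the short-bin cost of 0.5 day and the consultation counts are made-up placeholders:

```python
def expected_staff_days(n_informal, n_formal,
                        informal_long_frac=0.20, informal_long_days=7,
                        formal_long_frac=0.82, formal_long_days=15,
                        short_days=0.5):
    """Expected staff-days implied by a two-bin allocation rule."""
    informal = n_informal * ((1 - informal_long_frac) * short_days
                             + informal_long_frac * informal_long_days)
    formal = n_formal * ((1 - formal_long_frac) * short_days
                         + formal_long_frac * formal_long_days)
    return informal + formal

# Hypothetical annual workload: 100 informal and 20 formal consultations.
total = expected_staff_days(n_informal=100, n_formal=20)
```

Comparing such totals against available staff time is what lets a decision rule be tuned until the workload fits the office's capacity.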
A big data approach for climate change indicators processing in the CLIP-C project
NASA Astrophysics Data System (ADS)
D'Anca, Alessandro; Conte, Laura; Palazzo, Cosimo; Fiore, Sandro; Aloisio, Giovanni
2016-04-01
Defining and implementing processing chains with multiple (e.g. tens or hundreds of) data analytics operators can be a real challenge in many practical scientific use cases such as climate change indicators. This is usually done via scripts (e.g. bash) on the client side and requires climate scientists to take care of, implement and replicate workflow-like control logic aspects (which may be error-prone too) in their scripts, along with the expected application-level part. Moreover, the large data volumes and heavy I/O demand pose additional performance challenges. In this regard, production-level tools for climate data analysis are mostly sequential, and there is a lack of big data analytics solutions implementing fine-grain data parallelism or adopting stronger parallel I/O strategies, data locality, workflow optimization, etc. High-level solutions leveraging workflow-enabled big data analytics frameworks for eScience could help scientists define and implement the workflows related to their experiments through a more declarative, efficient and powerful approach. This talk will start by introducing the main needs and challenges regarding big data analytics workflow management for eScience, and will then provide insights into the implementation of real use cases involving climate change indicators computed on large datasets produced in the context of the CLIP-C project - an EU FP7 project aiming to provide access to climate information of direct relevance to a wide variety of users, from scientists to policy makers and private sector decision makers. All the proposed use cases have been implemented using the Ophidia big data analytics framework. The software stack includes an internal workflow management system, which coordinates, orchestrates, and optimises the execution of multiple scientific data analytics and visualization tasks.
Real-time monitoring of workflow execution is also supported through a graphical user interface. In order to address the challenges of the use cases, the implemented data analytics workflows include parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. The use cases have been implemented on an 8-node HPC cluster (16 cores/node) of the Athena system available at the CMCC Supercomputing Centre. Benchmark results will also be presented during the talk.
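A climate change indicator of the kind these workflows compute can be sketched in a few lines. The example below implements the standard ETCCDI "summer days" (SU) index - chosen as an illustration, not necessarily one of the CLIP-C indicators - over a toy daily-maximum-temperature grid; a real workflow would read NetCDF and run the reduction in parallel through the framework:

```python
def summer_days(tasmax, threshold=25.0):
    """ETCCDI 'summer days' (SU): per grid cell, the number of days whose
    daily maximum temperature exceeds `threshold` (degrees Celsius).

    tasmax: nested list indexed [day][row][col].
    """
    ndays = len(tasmax)
    nrow, ncol = len(tasmax[0]), len(tasmax[0][0])
    su = [[0] * ncol for _ in range(nrow)]
    for d in range(ndays):
        for i in range(nrow):
            for j in range(ncol):
                if tasmax[d][i][j] > threshold:
                    su[i][j] += 1
    return su

# Toy field: 4 days over a 2x2 grid (degrees C).
tasmax = [
    [[26.0, 24.0], [30.0, 10.0]],
    [[27.0, 26.0], [24.0, 11.0]],
    [[25.0, 23.0], [31.0, 12.0]],
    [[28.0, 22.0], [29.0,  9.0]],
]
su = summer_days(tasmax)
```

The per-cell reduction is embarrassingly parallel over the spatial dimensions, which is exactly the fine-grain data parallelism the abstract argues production tools should exploit.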
Post, Andrew R.; Kurc, Tahsin; Cholleti, Sharath; Gao, Jingjing; Lin, Xia; Bornstein, William; Cantrell, Dedra; Levine, David; Hohmann, Sam; Saltz, Joel H.
2013-01-01
Objective To create an analytics platform for specifying and detecting clinical phenotypes and other derived variables in electronic health record (EHR) data for quality improvement investigations. Materials and Methods We have developed an architecture for an Analytic Information Warehouse (AIW). It supports transforming data represented in different physical schemas into a common data model, specifying derived variables in terms of the common model to enable their reuse, computing derived variables while enforcing invariants and ensuring correctness and consistency of data transformations, long-term curation of derived data, and export of derived data into standard analysis tools. It includes software that implements these features and a computing environment that enables secure high-performance access to and processing of large datasets extracted from EHRs. Results We have implemented and deployed the architecture in production locally. The software is available as open source. We have used it as part of hospital operations in a project to reduce rates of hospital readmission within 30 days. The project examined the association of over 100 derived variables representing disease and co-morbidity phenotypes with readmissions in five years of data from our institution’s clinical data warehouse and the UHC Clinical Database (CDB). The CDB contains administrative data from over 200 hospitals that are in academic medical centers or affiliated with such centers. Discussion and Conclusion A widely available platform for managing and detecting phenotypes in EHR data could accelerate the use of such data in quality improvement and comparative effectiveness studies. PMID:23402960
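The notion of a reusable derived variable over a common data model can be sketched simply: a phenotype is a predicate evaluated against coded observations. The rows, code family, and single-criterion definition below are purely illustrative (ICD-9 250.xx as a diabetes stand-in); the AIW's actual phenotype definitions combine codes, labs, medications, and temporal logic:

```python
# Hypothetical rows from a common data model: (patient_id, diagnosis_code).
observations = [
    ("p1", "250.00"), ("p1", "401.9"),
    ("p2", "401.9"),
    ("p3", "250.50"), ("p3", "428.0"),
]

def derive_phenotype(rows, code_prefix):
    """Derived variable: the set of patients with any code in the family."""
    return {pid for pid, code in rows if code.startswith(code_prefix)}

diabetic = derive_phenotype(observations, "250.")
```

Expressing the definition once against the common model, rather than against each source schema, is what makes a derived variable reusable across warehouses.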
Update on SLD Engineering Tools Development
NASA Technical Reports Server (NTRS)
Miller, Dean R.; Potapczuk, Mark G.; Bond, Thomas H.
2004-01-01
The airworthiness authorities (FAA, JAA, Transport Canada) will be releasing a draft rule in the 2006 timeframe concerning the operation of aircraft in a Supercooled Large Droplet (SLD) environment aloft. The draft rule will require aircraft manufacturers to demonstrate that their aircraft can operate safely in an SLD environment for a period of time to facilitate a safe exit from the condition. It is anticipated that aircraft manufacturers will require a capability to demonstrate compliance with this rule via experimental means (icing tunnels or tankers) and by analytical means (ice prediction codes). Since existing icing research facilities and analytical codes were not developed to account for SLD conditions, current engineering tools are not adequate to support compliance activities in SLD conditions. Therefore, existing capabilities need to be augmented to include SLD conditions. In response to this need, NASA and its partners conceived a strategy or Roadmap for developing experimental and analytical SLD simulation tools. Following review and refinement by the airworthiness authorities and other international research partners, this technical strategy has been crystallized into a project plan to guide the SLD Engineering Tool Development effort. This paper will provide a brief overview of the latest version of the project plan and technical rationale, and provide a status of selected SLD Engineering Tool Development research tasks which are currently underway.
Kouri, T T; Gant, V A; Fogazzi, G B; Hofmann, W; Hallander, H O; Guder, W G
2000-07-01
Improved standardized performance is needed because urinalysis continues to be one of the most frequently requested laboratory tests. Since 1997, the European Confederation of Laboratory Medicine (ECLM) has been supporting an interdisciplinary project aiming to produce European urinalysis guidelines. More than seventy clinical chemists, microbiologists and ward-based clinicians, as well as representatives of manufacturers are taking part. These guidelines aim to improve the quality and consistency of chemical urinalysis, particle counting and bacterial culture by suggesting optimal investigative processes that could be applied in Europe. The approach is based on medical needs for urinalysis. The importance of the pre-analytical stage for total quality is stressed by detailed illustrative advice for specimen collection. Attention is also given to emerging automated technology. For cost containment reasons, both optimum (ideal) procedures and minimum analytical approaches are suggested. Since urinalysis mostly lacks genuine reference methods (primary reference measurement procedures; Level 4), a novel classification of the methods is proposed: comparison measurement procedures (Level 3), quantitative routine procedures (Level 2), and ordinal scale examinations (Level 1). Stepwise strategies are suggested to save costs, applying different rules for general and specific patient populations. New analytical quality specifications have been created. After a consultation period, the final written text will be published in full as a separate document.
2015-06-01
public release; distribution is unlimited. The US Army Engineer Research and Development Center (ERDC) solves the nation's toughest engineering and ... Framework (PIAF). Timothy K. Perkins and Chris C. Rewerts, Construction Engineering Research Laboratory, U.S. Army Engineer Research and Development Center. Prepared for the U.S. Army Corps of Engineers, Washington, DC 20314-1000, under Project P2 335530, "Cultural Reasoning and Ethnographic Analysis for the ...
Long, H. Keith; Daddow, Richard L.; Farrar, Jerry W.
1998-01-01
Since 1962, the U.S. Geological Survey (USGS) has operated the Standard Reference Sample Project to evaluate the performance of USGS, cooperator, and contractor analytical laboratories that analyze chemical constituents of environmental samples. The laboratories are evaluated by using performance evaluation samples, called Standard Reference Samples (SRSs). SRSs are submitted to laboratories semi-annually for round-robin laboratory performance comparison purposes. Currently, approximately 100 laboratories are evaluated for their analytical performance on six SRSs for inorganic and nutrient constituents. As part of the SRS Project, a surplus of homogeneous, stable SRSs is maintained for purchase by USGS offices and participating laboratories for use in continuing quality-assurance and quality-control activities. Statistical evaluation of the laboratories' results provides information to compare the analytical performance of the laboratories and to determine possible analytical deficiencies and problems. SRS results also provide information on the bias and variability of different analytical methods used in the SRS analyses.
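A round-robin evaluation like this can be sketched with robust z-scores: each laboratory's reported value is compared against the consensus median, scaled by a robust spread estimate. The scoring convention (median, 1.4826 x MAD, |z| > 2 flag) is a common interlaboratory practice rather than necessarily the SRS Project's exact procedure, and the reported concentrations are hypothetical:

```python
from statistics import median

def robust_z_scores(results):
    """Score each laboratory's result against the round-robin consensus.

    Consensus = median of all labs; scale = 1.4826 * median absolute
    deviation (a robust stand-in for the standard deviation).
    """
    center = median(results.values())
    mad = median(abs(v - center) for v in results.values())
    scale = 1.4826 * mad
    return {lab: (v - center) / scale for lab, v in results.items()}

# Hypothetical reported concentrations (mg/L) for one SRS constituent.
reported = {"lab_a": 10.1, "lab_b": 9.9, "lab_c": 10.0, "lab_d": 12.5}
z = robust_z_scores(reported)
flagged = [lab for lab, score in z.items() if abs(score) > 2]
```

Using the median and MAD keeps the consensus from being pulled toward the very outliers the exercise is meant to detect.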
Putting the New Spectrometer to Work (Part II)
NASA Astrophysics Data System (ADS)
Menke, John
2013-05-01
Having shown at the 2012 Symposium on Telescope Science the capability of the home-built medium-resolution (R=lambda/delta_lambda=3000) f/3.5 spectrometer on the 0.45-meter Newtonian, I collaborated on three new science observation projects. Project 1 investigated the spectacular Doppler signal of the mag 6 pulsating star BW Vul (velocities vary by >200 km/sec in a 4.8 hr period). Project 2 searched for a velocity variation in a faint (mag 9) suspected spectroscopic binary (SAO186171 = Zug 1 in NGC6520) which reaches barely 25° above the southern horizon. Project 3 observed Sig Ori E, a mag 6 apparent Be star with a 1.2-day period, where the problem was to measure the rapidly changing amplitude and shape of the H-alpha line to support modeling. Each project presented unique observational and analytic challenges that will be discussed. The projects have benefited from a close pro-am collaboration both in selecting targets and interpreting results.
Abernethy, Amy P; Wheeler, Jane L; Bull, Janet
2011-05-01
Few hospice and palliative care organizations use health information technology (HIT) for data collection and management; the feasibility and utility of a HIT-based approach in this multi-faceted, interdisciplinary context is unclear. To develop a HIT-based data infrastructure that serves multiple hospice and palliative care sites, meeting clinical and administrative needs with data, technical, and analytic support. Through a multi-site academic/community partnership, a data infrastructure was collaboratively developed, pilot-tested at a community-based site, refined, and demonstrated for data collection and preliminary analysis. Additional sites, which participated in system development, became prepared to contribute data to the growing aggregate database. Electronic data collection proved feasible in community-based hospice and palliative care. The project highlighted "success factors" for implementing HIT in this field: engagement of site-based project "champions" to promote the system from within; involvement of stakeholders at all levels of the organization, to promote culture change and buy-in; attention to local needs (e.g., data for quality reporting) and requirements (e.g., affordable cost, efficiency); consideration of practical factors (e.g., potential to interfere with clinical flow); provision of adequate software, technical, analytic, and statistical support; availability of flexible HIT options (e.g., different data-collection platforms); and adoption of a consortium approach in which sites can support one another, learn from each other's experiences, pool data, and benefit from economies of scale. In hospice and palliative care, HIT-based data collection/management has potential to generate better understanding of populations and outcomes, support quality assessment/quality improvement, and prepare sites to participate in research. Copyright © 2011 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Basford, R. C.
1977-01-01
Analytical studies supported by experimental testing indicate that solar energy can be utilized to heat and cool commercial buildings. In a 50,000 square foot one-story office building at the Langley Research Center, 15,000 square feet of solar collectors are designed to provide the energy required to supply 79 percent of the building heating needs and 52 percent of its cooling needs. The experience gained from the space program is providing the technology base for this project. Included are some of the analytical studies made to make the building design changes necessary to utilize solar energy, the basic solar collector design, collector efficiencies, and the integrated system design.
Managing laboratory automation in a changing pharmaceutical industry
Rutherford, Michael L.
1995-01-01
The health care reform movement in the USA and increased requirements by regulatory agencies continue to have a major impact on the pharmaceutical industry and the laboratory. Laboratory management is expected to improve efficiency by providing more analytical results at a lower cost, increasing customer service, and reducing cycle time, while ensuring accurate results and more effective use of staff. To achieve these expectations, many laboratories are using robotics and automated work stations. Establishing automated systems presents many challenges for laboratory management, including project and hardware selection, budget justification, implementation, validation, training, and support. To address these management challenges, the rationale for project selection and implementation, the obstacles encountered, project outcomes, and learning points for several automated systems recently implemented in the Quality Control Laboratories at Eli Lilly are presented. PMID:18925014
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The Medical Applications and Biophysical Research Division of the Office of Biological and Environmental Research supports and manages research in several distinct areas of science and technology. The projects described in this book are grouped by the main budgetary areas: General Life Sciences (structural molecular biology), Medical Applications (primarily nuclear medicine) and Measurement Science (analytical chemistry instrumentation), Environmental Management Science Program, and the Small Business Innovation Research Program. The research funded by this division complements that of the other two divisions in the Office of Biological and Environmental Research (OBER): Health Effects and Life Sciences Research, and Environmental Sciences. Most of the OBER programs are planned and administered jointly by the staff of two or all three of the divisions. This summary book provides information on research supported in these program areas during Fiscal Years 1996 and 1997.
Science-Driven Computing: NERSC's Plan for 2006-2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Horst D.; Kramer, William T.C.; Bailey, David H.
NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.
Critical factors for assembling a high volume of DNA barcodes
Hajibabaei, Mehrdad; deWaard, Jeremy R; Ivanova, Natalia V; Ratnasingham, Sujeevan; Dooh, Robert T; Kirk, Stephanie L; Mackie, Paula M; Hebert, Paul D.N
2005-01-01
Large-scale DNA barcoding projects are now moving toward activation, while the creation of a comprehensive barcode library for eukaryotes will ultimately require the acquisition of some 100 million barcodes. To satisfy this need, analytical facilities must adopt protocols that can support the rapid, cost-effective assembly of barcodes. In this paper we discuss the prospects for establishing high-volume DNA barcoding facilities by evaluating key steps in the analytical chain from specimens to barcodes. Alliances with members of the taxonomic community represent the most effective strategy for provisioning the analytical chain with specimens. The optimal protocols for DNA extraction and subsequent PCR amplification of the barcode region depend strongly on specimen condition, but production targets of 100K barcode records per year are now feasible for facilities working with compliant specimens. The analysis of museum collections is currently challenging, but PCR cocktails that combine polymerases with repair enzyme(s) promise future success. Barcode analysis is already a cost-effective option for species identification in some situations, and this will increasingly be the case as reference libraries are assembled and analytical protocols are simplified. PMID:16214753
A dashboard-based system for supporting diabetes care.
Dagliati, Arianna; Sacchi, Lucia; Tibollo, Valentina; Cogni, Giulia; Teliti, Marsida; Martinez-Millana, Antonio; Traver, Vicente; Segagni, Daniele; Posada, Jorge; Ottaviano, Manuel; Fico, Giuseppe; Arredondo, Maria Teresa; De Cata, Pasquale; Chiovato, Luca; Bellazzi, Riccardo
2018-05-01
To describe the development, as part of the European Union MOSAIC (Models and Simulation Techniques for Discovering Diabetes Influence Factors) project, of a dashboard-based system for the management of type 2 diabetes and assess its impact on clinical practice. The MOSAIC dashboard system is based on predictive modeling, longitudinal data analytics, and the reuse and integration of data from hospitals and public health repositories. Data are merged into an i2b2 data warehouse, which feeds a set of advanced temporal analytic models, including temporal abstractions, care-flow mining, drug exposure pattern detection, and risk-prediction models for type 2 diabetes complications. The dashboard has 2 components, designed for (1) clinical decision support during follow-up consultations and (2) outcome assessment on populations of interest. To assess the impact of the clinical decision support component, a pre-post study was conducted considering visit duration, number of screening examinations, and lifestyle interventions. A pilot sample of 700 Italian patients was investigated. Judgments on the outcome assessment component were obtained via focus groups with clinicians and health care managers. The use of the decision support component in clinical activities produced a reduction in visit duration (P < .01) and an increase in the number of screening exams for complications (P < .01). We also observed a relevant, although not statistically significant, increase in the proportion of patients receiving lifestyle interventions (from 69% to 77%). Regarding the outcome assessment component, focus groups highlighted the system's capability of identifying and understanding the characteristics of patient subgroups treated at the center.
Our study demonstrates that decision support tools based on the integration of multiple-source data and visual and predictive analytics do improve the management of a chronic disease such as type 2 diabetes by enacting a successful implementation of the learning health care system cycle.
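The "relevant but not statistically significant" proportion change (69% to 77%) can be illustrated with a two-proportion z-test. The abstract reports only the proportions, so the group sizes of 100 below are hypothetical placeholders chosen purely for illustration:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test (normal approximation).

    x1/n1 and x2/n2 are successes/trials in the pre and post groups.
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical denominators; the abstract gives only 69% -> 77%.
z, p = two_proportion_z(x1=69, n1=100, x2=77, n2=100)
```

With groups of this size the difference indeed fails to reach significance, consistent with the abstract's characterization; larger samples would be needed to detect an 8-point shift reliably.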
Deployment of Analytics into the Healthcare Safety Net: Lessons Learned.
Hartzband, David; Jacobs, Feygele
2016-01-01
As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation's largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and, in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis deploying the analytic stack was carried out using those common definitions established for the Uniform Data System (UDS) by the Health Resources and Services Administration [1]. In addition, interviews with health center leadership and staff were completed to understand the context for the findings. The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data.
At a population level, apparent underreporting of a number of diagnoses, specifically obesity and heart disease, was also evident in the results of the data quality exercise, for both the EHR-derived and stack analytic results. Data awareness, that is, an appreciation of the importance of data integrity, data hygiene [2] and the potential uses of data, needs to be prioritized and developed by health centers and other healthcare organizations if analytics are to be used in an effective manner to support strategic objectives. While this analysis was conducted exclusively with community health center organizations, its conclusions and recommendations may be more broadly applicable.
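A cross-source data quality exercise of the kind described can be sketched as a count-reconciliation check: run the same measures through both pipelines and flag the ones that disagree beyond a tolerance. The measure names, counts, and 5% tolerance below are hypothetical, not the project's actual figures:

```python
def discrepancies(ehr_counts, stack_counts, tolerance=0.05):
    """Flag measures whose two independently computed counts differ by
    more than `tolerance`, relative to the EHR-derived figure."""
    flagged = {}
    for measure, ehr_n in ehr_counts.items():
        stack_n = stack_counts.get(measure)
        if stack_n is None:
            flagged[measure] = "missing from analytic stack"
        elif ehr_n and abs(stack_n - ehr_n) / ehr_n > tolerance:
            flagged[measure] = f"EHR={ehr_n}, stack={stack_n}"
    return flagged

# Hypothetical UDS-style measure counts from the two pipelines.
ehr = {"patients": 12000, "encounters": 48000, "obesity_dx": 2100}
stack = {"patients": 11980, "encounters": 47950, "obesity_dx": 1500}
issues = discrepancies(ehr, stack)
```

A large gap on a single measure (here the illustrative obesity count) is typically a definitional inconsistency rather than a computation error, which matches the core-term problems the abstract reports.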
Deployment of Analytics into the Healthcare Safety Net: Lessons Learned
Hartzband, David; Jacobs, Feygele
2016-01-01
Background As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation’s largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. Methods To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis deploying the analytic stack was carried out using those common definitions established for the Uniform Data System (UDS) by the Health Resources and Service Administration.1 In addition, interviews with health center leadership and staff were completed to understand the context for the findings. Results The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data. 
At a population level, apparent underreporting of a number of diagnoses, specifically obesity and heart disease, was also evident in the results of the data quality exercise, for both the EHR-derived and stack analytic results. Conclusion Data awareness, that is, an appreciation of the importance of data integrity, data hygiene and the potential uses of data, needs to be prioritized and developed by health centers and other healthcare organizations if analytics are to be used in an effective manner to support strategic objectives. While this analysis was conducted exclusively with community health center organizations, its conclusions and recommendations may be more broadly applicable. PMID:28210424
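The parallel-query exercise described above can be sketched as a simple consistency check: the same measure is computed independently from the EHR report and from the analytic stack, and divergent results are flagged for review. The function, tolerance, and counts below are invented for illustration, not taken from the study.

```python
# Hypothetical sketch of a data-quality comparison between two pipelines
# (EHR report vs. analytic stack) computing the same measure.

def compare_measure(ehr_count: int, stack_count: int, tolerance: float = 0.05):
    """Flag a measure whose two independently computed counts diverge
    by more than `tolerance` relative difference."""
    if ehr_count == 0 and stack_count == 0:
        return ("agree", 0.0)
    base = max(ehr_count, stack_count)
    rel_diff = abs(ehr_count - stack_count) / base
    return ("agree" if rel_diff <= tolerance else "flag", rel_diff)

# Example: obesity diagnosis counts from the two pipelines (invented numbers)
status, diff = compare_measure(ehr_count=1180, stack_count=1425)
print(status, round(diff, 3))  # → flag 0.172
```

A flagged measure then triggers the kind of follow-up the study describes: checking term definitions, formatting, and missing data.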
An Analysis of Rocket Propulsion Testing Costs
NASA Technical Reports Server (NTRS)
Ramirez-Pagan, Carmen P.; Rahman, Shamim A.
2009-01-01
The primary mission at NASA Stennis Space Center (SSC) is rocket propulsion testing. Such testing is generally performed within two arenas: (1) production testing for certification and acceptance, and (2) developmental testing for prototype or experimental purposes. The customer base consists of NASA programs, DOD programs, and commercial programs. Resources in place to perform on-site testing include both civil servants and contractor personnel, hardware and software including data acquisition and control, and 6 test stands with a total of 14 test positions/cells. For several business reasons there is the need to augment understanding of the test costs for all the various types of test campaigns. Historical propulsion test data was evaluated and analyzed in many different ways with the intent to find any correlation or statistics that could help produce more reliable and accurate cost estimates and projections. The analytical efforts included timeline trends, statistical curve fitting, average cost per test, cost per test second, test cost timeline, and test cost envelopes. Further, the analytical effort included examining the test cost from the perspective of thrust level and test article characteristics. Some of the analytical approaches did not produce evidence strong enough for further analysis. Other analytical approaches yielded promising results and are candidates for further development and focused study. Information was organized into three elements: a Project Profile, a Test Cost Timeline, and a Cost Envelope. The Project Profile is a snapshot of the project life cycle in timeline fashion, which includes various statistical analyses. The Test Cost Timeline shows the cumulative average test cost, for each project, at each month where there was test activity. The Test Cost Envelope shows a range of cost for a given number of test(s).
The supporting information upon which this study was performed came from diverse sources, and thus it was necessary to build several intermediate databases in order to understand, validate, and manipulate the data. These intermediate databases (validated historical accounts of schedule, test activity, and cost) are by themselves of great value and utility. For example, for the Project Profile, we were able to merge schedule, cost, and test activity. This kind of historical account conveys important information about sequence of events, lead time, and opportunities for improvement in future propulsion test projects. The Product Requirement Document (PRD) file is a collection of data extracted from each project PRD (technical characteristics, test requirements, and projections of cost, schedule, and test activity). This information could help expedite the development of future PRDs (or equivalent documents) on similar projects, and could also, when compared to the actual results, help improve projections around cost and schedule. Also, this file can be sorted by the parameter of interest to perform a visual review of potential common themes or trends. The process of searching, collecting, and validating propulsion test data encountered many difficulties, which led to a set of recommendations for improvement to facilitate future data gathering and analysis.
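The Test Cost Timeline described above (cumulative average test cost per project, recomputed at each month with test activity) can be illustrated in a few lines. The data values and function name below are invented; this is only a sketch of the computation, not SSC's actual tooling.

```python
# Illustrative sketch of a "Test Cost Timeline": the cumulative average test
# cost for one project, recomputed at each month in which tests occurred.

def cumulative_avg_cost(monthly):
    """monthly: list of (month, cost, n_tests) tuples in chronological order.
    Returns (month, cumulative_average_cost_per_test) pairs."""
    total_cost = total_tests = 0
    timeline = []
    for month, cost, n_tests in monthly:
        if n_tests == 0:
            continue  # only months with test activity appear on the timeline
        total_cost += cost
        total_tests += n_tests
        timeline.append((month, total_cost / total_tests))
    return timeline

# Invented project history: 2 tests in Jan, none in Feb, 1 in Mar
print(cumulative_avg_cost([("Jan", 300.0, 2), ("Feb", 0.0, 0), ("Mar", 180.0, 1)]))
# → [('Jan', 150.0), ('Mar', 160.0)]
```

The Cost Envelope in the report would then bound such per-test averages across projects for a given number of tests.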
2006-10-01
high probability for success. Estimated Time to Complete: 31 May 2007. 4. Support and Upgrade of Armed Forces-CARES to integrate Chaplain ...Excellence (ORCEN) is to provide a small, full-time analytical capability to both the Academy and the United States Army and the Department of...complete significant research projects in this time as they usually require little train-up as they are exposed to many military and academic
Wetherbee, Gregory A.; Martin, RoseAnn
2016-07-05
The Mercury Deposition Network programs include the system blank program and an interlaboratory comparison program. System blank results indicated that maximum total mercury contamination concentrations in samples were less than the third percentile of all Mercury Deposition Network sample concentrations. The Mercury Analytical Laboratory produced chemical concentration results with low bias and variability compared with other domestic and international laboratories that support atmospheric-deposition monitoring.
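The blank-sample comparison described above (maximum blank contamination below the third percentile of all network samples) amounts to a simple percentile test. The concentrations below are invented, and a nearest-rank percentile is used for illustration; the network's actual statistical procedure may differ.

```python
# Minimal sketch of the system-blank check: is the maximum contamination seen
# in blanks below a low percentile of all network sample concentrations?

def percentile(values, p):
    """Nearest-rank percentile (0 < p <= 100) of a list of numbers."""
    s = sorted(values)
    k = max(0, int(round(p / 100 * len(s))) - 1)
    return s[k]

# Invented network sample concentrations (ng/L) and blank maximum
network = [2.1, 3.4, 4.8, 5.0, 6.2, 7.7, 8.1, 9.3, 10.4, 12.0]
blank_max = 2.0
print(blank_max < percentile(network, 3))  # → True
```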
Non-Thermal, On-Site Decontamination and Destruction of Practice Bombs
2006-06-12
extraction (SW1311) followed by analysis (SW6010B with analysis for Mercury SW7470). The samples taken from the process tanks indicated in Table 3...A: Analytical Methods Supporting Project CD-ROM 7471A - 1 Revision 1 September 1994 METHOD 7471A MERCURY IN SOLID OR SEMISOLID WASTE (MANUAL...COLD-VAPOR TECHNIQUE) 1.0 SCOPE AND APPLICATION 1.1 Method 7471 is approved for measuring total mercury (organic and inorganic) in soils, sediments
NASA Redox Storage System Development Project
NASA Technical Reports Server (NTRS)
Hagedorn, N. H.
1984-01-01
The Redox Storage System Technology Project was jointly supported by the U.S. Department of Energy and NASA. The objectives of the project were to develop the Redox flow battery concept and to probe its technical and economic viability. The iron and chromium redox couples were selected as the reactants. Membranes and electrodes were developed for the original mode of operating at 25 C with the reactants separated by an ion-exchange membrane. Analytical capabilities and system-level operating concepts were developed and verified in a 1-kW, 13-kWh preprototype system. A subsequent change was made in operating mode, going to 65 C and using mixed reactants. New membranes and a new electrode catalyst were developed, resulting in single cell operation as high as 80 mA/sq cm with energy efficiencies greater than 80 percent. Studies indicate a likely system cost of about $75/kWh. Standard Oil of Ohio (Sohio) has undertaken further development of the Redox system. An exclusive patent license was obtained from NASA by Sohio. Transfer of Redox technology to Sohio is supported by the NASA Technology Utilization Office.
Seamless Digital Environment – Plan for Data Analytics Use Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxstrand, Johanna Helene; Bly, Aaron Douglas
The U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program initiated research into what is needed in order to provide a roadmap or model for Nuclear Power Plants to reference when building an architecture that can support the growing data supply and demand flowing through their networks. The Digital Architecture project's published report, Digital Architecture Planning Model (Oxstrand et al., 2016), discusses things to consider when building an architecture to support the increasing needs and demands of data throughout the plant. Once the plant is able to support the data demands, it still needs to be able to provide the data in an easy, quick, and reliable manner. A common method is to create a “one stop shop” application that a user can go to for all the data they need. This leads to the need to create a Seamless Digital Environment (SDE) to integrate all the “siloed” data. An SDE is the desired perception that should be presented to users by gathering the data from any data source (e.g., legacy applications and work management systems) without effort by the user. The goal for FY16 was to complete a feasibility study for data mining and analytics for employing information from computer-based-procedure-enabled technologies for use in developing improved business analytics. The research team collaborated with multiple organizations to identify use cases or scenarios that could be beneficial to investigate in a feasibility study. Many interesting potential use cases were identified throughout the FY16 activity. Unfortunately, due to factors outside the research team's control, none of the studies were initiated this year. However, the insights gained and the relationships built with both PVNGS and NextAxiom will be valuable when moving forward with future research. 
During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, it was identified that it would be very beneficial to the industry to support a research effort focused on data analytics. It was suggested that the effort would develop and evaluate use cases for data mining and analytics for employing information from plant sensors and databases for use in developing improved business analytics.
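The "one stop shop" idea behind an SDE can be sketched as a facade that gathers whatever each siloed system knows about a query and merges it for the user. The class, source names, and records below are all invented for illustration; they are not part of the LWRS project's software.

```python
# Hedged sketch of a Seamless Digital Environment facade: one query interface
# over several siloed data sources.

class SeamlessDigitalEnvironment:
    def __init__(self):
        self._sources = {}

    def register(self, name, lookup):
        """lookup: callable mapping an equipment id to a record dict (or None)."""
        self._sources[name] = lookup

    def get(self, equipment_id):
        """Merge whatever each siloed source knows about one equipment id,
        without the user having to query each system separately."""
        merged = {}
        for name, lookup in self._sources.items():
            record = lookup(equipment_id)
            if record:
                merged[name] = record
        return merged

sde = SeamlessDigitalEnvironment()
# Invented sources standing in for a work management system and a
# computer-based procedures system
sde.register("work_mgmt", lambda eid: {"open_orders": 2} if eid == "P-101" else None)
sde.register("procedures", lambda eid: {"last_executed": "2016-05-01"})
print(sde.get("P-101"))
```

Real deployments would put data-access services (as with the NextAxiom collaboration mentioned above) behind each `lookup`.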
Rye, Robert O.; Johnson, Craig A.; Landis, Gary P.; Hofstra, Albert H.; Emsbo, Poul; Stricker, Craig A.; Hunt, Andrew G.; Rusk, Brian G.
2010-01-01
Principal functions of the U.S. Geological Survey (USGS) Mineral Resources Program are providing assessments of the location, quantity, and quality of undiscovered mineral deposits, and predicting the environmental impacts of exploration and mine development. The mineral and environmental assessments of domestic deposits are used by planners and decisionmakers to improve the stewardship of public lands and public resources. Assessments of undiscovered mineral deposits on a global scale reveal the potential availability of minerals to the United States and other countries that manufacture goods imported to the United States. These resources are of fundamental relevance to national and international economic and security policy in our globalized world economy. Performing mineral and environmental assessments requires that predictions be made of the likelihood of undiscovered deposits. The predictions are based on geologic and geoenvironmental models that are constructed for the diverse types of mineral deposits from detailed descriptions of actual deposits and detailed understanding of the processes that formed them. Over the past three decades the understanding of ore-forming processes has benefited greatly from the integration of laboratory-based geochemical tools with field observations and other data sources. Under the aegis of the Evolution of Ore Deposits and Technology Transfer Project (referred to hereinafter as the Project), a 5-year effort that terminated in 2008, the Mineral Resources Program provided state-of-the-art analytical capabilities to support applications of several related geochemical tools to ore-deposit-related studies. The analytical capabilities and scientific approaches developed within the Project have wide applicability within Earth-system science. 
For this reason the Project Laboratories represent a valuable catalyst for interdisciplinary collaborations of the type that should be formed in the coming years for the United States to meet its natural-resources and natural-science needs. This circular presents an overview of the Project. Descriptions of the Project laboratories are given first including descriptions of the types of chemical or isotopic analyses that are made and the utility of the measurements. This is followed by summaries of select measurements that were carried out by the Project scientists. The studies are grouped by science direction. Virtually all of them were collaborations with USGS colleagues or with scientists from other governmental agencies, academia, or the private sector.
INVESTIGATING ENVIRONMENTAL SINKS OF MACROLIDE ...
Possible environmental sinks (wastewater effluents, biosolids, sediments) of macrolide antibiotics (i.e., azithromycin, roxithromycin and clarithromycin) are investigated using state-of-the-art analytical chemistry techniques. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at concentrations less than parts per billion, ppb). IAG with USGS ends in FY05. APM 20 due in FY05. Subtask 2: Coordination of interagency research and public outreach activities for PPCPs. Participate on NSTC Health and Environment subcommittee working group on PPCPs. Web site maintenance and expansion, invited technical presentations, invited articles for peer-reviewed journals, interviews
The Mochi project: a field theory approach to plasma dynamics and self-organization
NASA Astrophysics Data System (ADS)
You, Setthivoine; von der Linden, Jens; Lavine, Eric Sander; Card, Alexander; Carroll, Evan
2016-10-01
The Mochi project is designed to study the interaction between plasma flows and magnetic fields from the point of view of canonical flux tubes. The Mochi Labjet experiment is being commissioned after achieving first plasma. Analytical and numerical tools are being developed to visualize canonical flux tubes. One analytical tool described here is a field theory approach to plasma dynamics and self-organization. A redefinition of the Lagrangian of a multi-particle system in fields reformulates the single-particle, kinetic, and fluid equations governing fluid and plasma dynamics as a single set of generalized Maxwell's equations and Ohm's law for canonical force-fields. The Lagrangian includes new terms representing the coupling between the motion of particle distributions, and between distributions and electromagnetic fields, with relativistic contributions. The formulation shows that the concepts of self-organization and canonical helicity transport are applicable across single-particle, kinetic, and fluid regimes, at classical and relativistic scales. The theory gives the basis for comparing canonical helicity change to energy change in general systems. This work is supported by US DOE Grant DE-SC0010340.
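For readers unfamiliar with the canonical flux tube picture, the standard definitions underlying it can be stated briefly. These are the usual forms from the canonical helicity literature, not equations quoted from this abstract:

```latex
% Canonical momentum, canonical vorticity, and canonical helicity
% for a species \sigma with mass m_\sigma, charge q_\sigma, flow u_\sigma:
\mathbf{P}_\sigma = m_\sigma \mathbf{u}_\sigma + q_\sigma \mathbf{A}, \qquad
\boldsymbol{\Omega}_\sigma = \nabla \times \mathbf{P}_\sigma, \qquad
K_\sigma = \int_V \mathbf{P}_\sigma \cdot \boldsymbol{\Omega}_\sigma \, dV
```

Canonical flux tubes are flux tubes of $\boldsymbol{\Omega}_\sigma$; in the massless limit they reduce to ordinary magnetic flux tubes, and in the charge-free limit to vorticity tubes of the flow.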
Building Virtual Watersheds: A Global Opportunity to Strengthen Resource Management and Conservation
NASA Astrophysics Data System (ADS)
Benda, Lee; Miller, Daniel; Barquin, Jose; McCleary, Richard; Cai, TiJiu; Ji, Y.
2016-03-01
Modern land-use planning and conservation strategies at landscape to country scales worldwide require complete and accurate digital representations of river networks, encompassing all channels including the smallest headwaters. The digital river networks, integrated with widely available digital elevation models, also need to have analytical capabilities to support resource management and conservation, including attributing river segments with key stream and watershed data, characterizing topography to identify landforms, discretizing land uses at scales necessary to identify human-environment interactions, and connecting channels downstream and upstream, and to terrestrial environments. We investigate the completeness and analytical capabilities of national to regional scale digital river networks that are available in five countries: Canada, China, Russia, Spain, and United States using actual resource management and conservation projects involving 12 university, agency, and NGO organizations. In addition, we review one pan-European and one global digital river network. Based on our analysis, we conclude that the majority of the regional, national, and global scale digital river networks in our sample lack in network completeness, analytical capabilities or both. To address this limitation, we outline a general framework to build as complete as possible digital river networks and to integrate them with available digital elevation models to create robust analytical capabilities (e.g., virtual watersheds). We believe this presents a global opportunity for in-country agencies, or international players, to support creation of virtual watersheds to increase environmental problem solving, broaden access to the watershed sciences, and strengthen resource management and conservation in countries worldwide.
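One analytical capability the authors require of digital river networks, connecting channels downstream and upstream, reduces to graph traversal over the segment topology. The sketch below, with an invented four-segment network, walks upstream from an outlet to collect all contributing reaches; real virtual watersheds would attach stream and watershed attributes to each segment.

```python
# Minimal sketch of upstream connectivity in a digital river network.

def upstream_segments(downstream_of, outlet):
    """downstream_of: dict mapping each segment to the segment it flows into.
    Returns the set of all segments whose flow eventually reaches `outlet`."""
    # invert the flow relation: segment -> list of its direct tributaries
    tributaries = {}
    for child, parent in downstream_of.items():
        tributaries.setdefault(parent, []).append(child)
    found, stack = set(), [outlet]
    while stack:
        seg = stack.pop()
        for trib in tributaries.get(seg, []):
            if trib not in found:
                found.add(trib)
                stack.append(trib)
    return found

# Tiny invented network: s1 and s2 join into s3; s3 and s4 join into s5 (outlet)
net = {"s1": "s3", "s2": "s3", "s3": "s5", "s4": "s5"}
print(sorted(upstream_segments(net, "s5")))  # → ['s1', 's2', 's3', 's4']
```

Network completeness matters here: if the smallest headwater segments are missing from `downstream_of`, every downstream accumulation computed this way is undercounted.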
The Climate Data Analytic Services (CDAS) Framework.
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; Duffy, D.
2016-12-01
Faced with unprecedented growth in climate data volume and demand, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute data processing workflows combining common analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted climate data analysis tools (ESMF, CDAT, NCO, etc.). A dynamic caching architecture enables interactive response times. CDAS utilizes Apache Spark for parallelization and a custom array framework for processing huge datasets within limited memory spaces. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a python script, a unix-like shell client, or a javascript-based web application. Client packages in python, scala, or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends and variability, and compare multiple reanalysis datasets.
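The abstract mentions a WPS-style API callable from a python script. The sketch below shows only the general shape of such a server-side-analytics request; the operation name, parameter names, and dataset identifier are hypothetical and do not reflect CDAS's actual interface.

```python
# Hypothetical shape of a WPS-like "execute" request for a server-side
# analytic operation (e.g. an average over a space-time domain).

import json

def build_wps_request(dataset, variable, operation, domain):
    """Assemble a WPS-like execute payload as JSON."""
    return json.dumps({
        "identifier": operation,   # the server-side operation to run
        "inputs": {
            "dataset": dataset,    # which collection to read server-side
            "variable": variable,  # which variable within it
            "domain": domain,      # spatial and temporal bounds
        },
    })

payload = build_wps_request(
    dataset="reanalysis-example",  # hypothetical dataset id
    variable="tas",
    operation="average",
    domain={"lat": [-90, 90], "lon": [0, 360], "time": ["1980", "2010"]},
)
print(json.loads(payload)["identifier"])  # → average
```

The key design point, regardless of exact request syntax, is that only this small payload and the reduced result cross the network; the bulk data never leaves the data center.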
ERIC Educational Resources Information Center
Williamson, Nicholas C.
2001-01-01
Describes Export Odyssey (EO), a structured, Internet-intensive, team-based undergraduate student project in international marketing. Presents an analytical review of articles in the literature that relate to three key teaching-learning dimensions of student projects (experiential versus non-experiential active learning, team-based versus…
ERIC Educational Resources Information Center
Nic Giolla Mhichíl, Mairéad; van Engen, Jeroen; Ó Ciardúbháin, Colm; Ó Cléircín, Gearóid; Appel, Christine
2014-01-01
This paper sets out to construct and present the evolving conceptual framework of the SpeakApps projects to consider the application of learning analytics to facilitate synchronous and asynchronous oral language skills within this CALL context. Drawing from both the CALL and wider theoretical and empirical literature of learner analytics, the…
NASA Astrophysics Data System (ADS)
Lee, S. S.; Kim, H. J.; Kim, M. O.; Lee, K.; Lee, K. K.
2016-12-01
A study seeking evidence of remediation in monitoring data collected before and after intensive on-site remedial action was performed with various quantitative evaluation methods, such as mass discharge analysis, tracer data, statistical trend analysis, and analytical solutions, at a DNAPL-contaminated site in Wonju, Korea. Remediation technologies such as soil vapor extraction, soil flushing, biostimulation, and pump-and-treat have been applied to eliminate the contaminant sources of trichloroethylene (TCE) and to prevent the migration of the TCE plume from remediation target zones. Prior to the remedial action, the concentration and mass discharges of TCE at all transects were affected by seasonal recharge variation and residual DNAPL sources. After the remediation, the effect of remediation took place clearly at the main source zone and industrial complex. By tracing a time series of plume evolution, a greater variation in TCE concentrations was detected in the plumes near the source zones compared to the relatively stable plumes downstream. The amount of residual source mass removed during the intensive remedial action was estimated using an analytical solution in order to evaluate the action's efficiency. The results of this quantitative evaluation indicate that the intensive remedial action was performed effectively, with a removal efficiency of 70% for the residual source mass during the remediation period. Analytical solutions that can quantify the impacts of partial mass reduction have proven to be useful tools for quantifying unknown contaminant source mass, verifying dissolved concentrations at the DNAPL-contaminated site, and evaluating the efficiency of remediation using long-term monitoring data.
Acknowledgement: This work was supported by the Korea Ministry of Environment under the "GAIA Project (173-092-009 and 201400540010)" and the "R&D Project on Environmental Management of Geologic CO2 Storage" from the KEITI (Project number: 2014001810003).
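The 70% removal-efficiency figure above is, at its core, a comparison of estimated source mass before and after the remedial action. A one-line sketch of that calculation, with invented mass estimates (the study's actual masses come from its analytical-solution fits):

```python
# Sketch of the removal-efficiency calculation behind the 70% figure.

def removal_efficiency(mass_initial, mass_remaining):
    """Fraction of the residual source mass removed during remediation."""
    return (mass_initial - mass_remaining) / mass_initial

# Invented estimates: 500 kg of residual source mass before, 150 kg after
print(round(removal_efficiency(mass_initial=500.0, mass_remaining=150.0), 2))
# → 0.7
```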
Bussery, Justin; Denis, Leslie-Alexandre; Guillon, Benjamin; Liu, Pengfeï; Marchetti, Gino; Rahal, Ghita
2018-04-01
We describe the genesis, design and evolution of a computing platform designed and built to improve the success rate of biomedical translational research. The eTRIKS project platform was developed with the aim of building a platform that can securely host heterogeneous types of data and provide an optimal environment to run tranSMART analytical applications. Many types of data can now be hosted, including multi-OMICS data, preclinical laboratory data and clinical information, including longitudinal data sets. During the last two years, the platform has matured into a robust translational research knowledge management system that is able to host other data mining applications and support the development of new analytical tools. Copyright © 2018 Elsevier Ltd. All rights reserved.
Advances in Medical Analytics Solutions for Autonomous Medical Operations on Long-Duration Missions
NASA Technical Reports Server (NTRS)
Thompson, David E.; Lindsey, Antonia Edward
2017-01-01
A review will be presented on the progress made under STMD Game Changing Development Program funding toward the development of a Medical Decision Support System (MDSS) for augmenting crew capabilities during long-duration missions, such as Mars transit. To create an MDSS, initial work requires acquiring images and developing models that analyze and assess the features in medical biosensor images that support medical assessment of pathologies. For FY17, the project has focused on ultrasound images of cardiac pathologies: namely, evaluation and assessment of pericardial effusion identification and its discrimination from related pneumothorax and even bladder-induced infections that cause inflammation around the heart. This identification is substantially complicated by uncertainty in fluid behavior under microgravity. This talk will present and discuss the work to date in this project, recognizing conditions under which various machine learning technologies, deep learning via convolutional neural nets, and statistical learning methods for feature identification and classification can be employed and conditioned into graphical format in preparation for attachment to an inference engine that eventually creates decision support recommendations for a remote crew in a triage setting.
2012-01-01
Background The global initiative ‘Treatment 2.0’ calls for expanding the evidence base of optimal HIV service delivery models to maximize HIV case detection and retention in care. However limited systematic assessment has been conducted in countries with concentrated HIV epidemic. We aimed to assess HIV service availability and service connectedness in Vietnam. Methods We developed a new analytical framework of the continuum of prevention and care (COPC). Using the framework, we examined HIV service delivery in Vietnam. Specifically, we analyzed HIV service availability including geographical distribution and decentralization and service connectedness across multiple services and dimensions. We then identified system-related strengths and constraints in improving HIV case detection and retention in care. This was accomplished by reviewing related published and unpublished documents including existing service delivery data. Results Identified strengths included: decentralized HIV outpatient clinics that offer comprehensive care at the district level particularly in high HIV burden provinces; functional chronic care management for antiretroviral treatment (ART) with the involvement of people living with HIV and the links to community- and home-based care; HIV testing and counseling integrated into tuberculosis and antenatal care services in districts supported by donor-funded projects, and extensive peer outreach networks that reduce barriers for the most-at-risk populations to access services. 
Constraints included: fragmented local coordination mechanisms for HIV-related health services; lack of systems to monitor the expansion of HIV outpatient clinics that offer comprehensive care; underdevelopment of pre-ART care; insufficient linkage from HIV testing and counseling to pre-ART care; inadequate access to HIV-related services in districts not supported by donor-funded projects particularly in middle and low burden provinces and in mountainous remote areas; and no systematic monitoring of referral services. Conclusions Our COPC analytical framework was instrumental in identifying system-related strengths and constraints that contribute to HIV case detection and retention in care. The national HIV program plans to strengthen provincial programming by re-defining various service linkages and accelerate the transition from project-based approach to integrated service delivery in line with the ‘Treatment 2.0’ initiative. PMID:23272730
Fujita, Masami; Poudel, Krishna C; Do, Thi Nhan; Bui, Duc Duong; Nguyen, Van Kinh; Green, Kimberly; Nguyen, Thi Minh Thu; Kato, Masaya; Jacka, David; Cao, Thi Thanh Thuy; Nguyen, Thanh Long; Jimba, Masamine
2012-12-29
The global initiative 'Treatment 2.0' calls for expanding the evidence base of optimal HIV service delivery models to maximize HIV case detection and retention in care. However limited systematic assessment has been conducted in countries with concentrated HIV epidemic. We aimed to assess HIV service availability and service connectedness in Vietnam. We developed a new analytical framework of the continuum of prevention and care (COPC). Using the framework, we examined HIV service delivery in Vietnam. Specifically, we analyzed HIV service availability including geographical distribution and decentralization and service connectedness across multiple services and dimensions. We then identified system-related strengths and constraints in improving HIV case detection and retention in care. This was accomplished by reviewing related published and unpublished documents including existing service delivery data. Identified strengths included: decentralized HIV outpatient clinics that offer comprehensive care at the district level particularly in high HIV burden provinces; functional chronic care management for antiretroviral treatment (ART) with the involvement of people living with HIV and the links to community- and home-based care; HIV testing and counseling integrated into tuberculosis and antenatal care services in districts supported by donor-funded projects, and extensive peer outreach networks that reduce barriers for the most-at-risk populations to access services. 
Constraints included: fragmented local coordination mechanisms for HIV-related health services; lack of systems to monitor the expansion of HIV outpatient clinics that offer comprehensive care; underdevelopment of pre-ART care; insufficient linkage from HIV testing and counseling to pre-ART care; inadequate access to HIV-related services in districts not supported by donor-funded projects, particularly in middle and low burden provinces and in mountainous remote areas; and no systematic monitoring of referral services. Our COPC analytical framework was instrumental in identifying system-related strengths and constraints that contribute to HIV case detection and retention in care. The national HIV program plans to strengthen provincial programming by re-defining various service linkages and to accelerate the transition from a project-based approach to integrated service delivery in line with the 'Treatment 2.0' initiative.
Retaining U.S. Air Force Pilots When the Civilian Demand for Pilots Is Growing
2016-01-01
pilot retention and determine the changes in ARP and AP that could offset those effects. It also simulates the effects of eliminating AP for pilots...array of compensation policies for pilots, thereby providing the USAF with an empirically based analytical platform to determine the special and...greatly from the input and support of our project monitor, Maj Ryan Theiss, Chief, Rated Force Policy-Mobility Forces (HQ USAF/A1PPR), as well as Lt
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melaina, Marc; Warner, Ethan; Sun, Yongling
The Alternative and Renewable Fuel and Vehicle Technologies Program (ARFVTP) supports a wide range of alternative, low-carbon fuel and vehicle projects in California. This report focuses on two types of ARFVTP benefits. Expected benefits reflect successful deployment of vehicles and fuels supported through program projects. Market transformation benefits result from project influences on future market conditions that accelerate technology adoption rates. Data collected directly from ARFVTP projects funded from 2009 to the first quarter of 2014 are used as inputs to the benefits analysis, where possible. Expected benefit estimation methods rely primarily upon project-level data and result in single-point annual estimates within the 2011 to 2025 analysis period. Results suggest that the 178 projects evaluated for expected benefits, representing an investment of $351.3 million in ARFVTP funds, could reduce petroleum fuel use by 236 million gallons per year and greenhouse gases (GHGs) by 1.7 million metric tonnes carbon dioxide equivalent (MMTCO2e) per year by 2025. Market transformation benefits are described as accruing in addition to expected benefits. They are inherently more uncertain and theoretical than expected benefits and are therefore reported as high and low ranges, with results suggesting GHG reductions of 1.1 to 2.5 MMTCO2e per year and petroleum fuel reductions of 102 million to 330 million gallons per year by 2025. Taking both benefit types into account, results suggest that ARFVTP projects have the potential to make substantial progress toward meeting California's long-term GHG and petroleum fuel use reduction goals. As additional project data become available and market success with alternative and renewable fuels and vehicles grows, the analytic framework relied upon to develop these estimates will become more rigorous and will have a greater capacity to inform future ARFVTP activities.
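Taken at face value, the headline figures in this abstract support a quick cost-effectiveness check. The snippet below simply recombines the reported numbers ($351.3 million invested, 236 million gallons per year, and 1.7 MMTCO2e per year by 2025); the derived ratios are illustrative back-of-envelope values, not results from the report.

```python
# Figures taken from the abstract; derived ratios are illustrative only.
investment_usd = 351.3e6                      # ARFVTP funds invested
petroleum_reduction_gal_per_yr = 236e6        # expected by 2025
ghg_reduction_mmtco2e_per_yr = 1.7            # expected by 2025

# Gallons of petroleum avoided per program dollar, per year
gal_per_dollar_per_yr = petroleum_reduction_gal_per_yr / investment_usd

# Tonnes of CO2e avoided per million dollars invested, per year
tonnes_per_million_usd_per_yr = (
    ghg_reduction_mmtco2e_per_yr * 1e6) / (investment_usd / 1e6)
```

A back-of-envelope check like this only gauges scale; it ignores project lifetimes, discounting, and the market-transformation ranges the report treats separately.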
Big data in health care: using analytics to identify and manage high-risk and high-cost patients.
Bates, David W; Saria, Suchi; Ohno-Machado, Lucila; Shah, Anand; Escobar, Gabriel
2014-07-01
The US health care system is rapidly adopting electronic health records, which will dramatically increase the quantity of clinical data that are available electronically. Simultaneously, rapid progress has been made in clinical analytics--techniques for analyzing large quantities of data and gleaning new insights from that analysis--which is part of what is known as big data. As a result, there are unprecedented opportunities to use big data to reduce the costs of health care in the United States. We present six use cases--that is, key examples--where some of the clearest opportunities exist to reduce costs through the use of big data: high-cost patients, readmissions, triage, decompensation (when a patient's condition worsens), adverse events, and treatment optimization for diseases affecting multiple organ systems. We discuss the types of insights that are likely to emerge from clinical analytics, the types of data needed to obtain such insights, and the infrastructure--analytics, algorithms, registries, assessment scores, monitoring devices, and so forth--that organizations will need to perform the necessary analyses and to implement changes that will improve care while reducing costs. Our findings have policy implications for regulatory oversight, ways to address privacy concerns, and the support of research on analytics. Project HOPE—The People-to-People Health Foundation, Inc.
NASA Astrophysics Data System (ADS)
He, Li; Song, Xuan
2018-03-01
In recent years, ceramic fabrication using stereolithography (SLA) has gained in popularity because of the high accuracy and density that can be achieved in the final part. One of the key challenges in ceramic SLA is that support structures are required for building overhanging features, whereas removing these support structures without damaging the components is difficult. In this research, a suspension-enclosing projection-stereolithography process is developed to overcome this challenge. This process uses a high-yield-stress ceramic slurry as the feedstock material and exploits the elastic force of the material to support overhanging features without the need for building additional support structures. Ceramic slurries with different solid loadings are studied to identify the rheological properties most suitable for supporting overhanging features. An analytical model of a double doctor-blade module is established to obtain uniform and thin recoating layers from a high-yield-stress slurry. Several test cases highlight the feasibility of using a high-yield-stress slurry to support overhanging features in SLA.
Analytical model for describing ion guiding through capillaries in insulating polymers
NASA Astrophysics Data System (ADS)
Liu, Shi-Dong; Zhao, Yong-Tao; Wang, Yu-Yu; Stolterfoht, N.; Cheng, Rui; Zhou, Xian-Ming; Xu, Hu-Shan; Xiao, Guo-Qing
2015-08-01
An analytical description of ion guiding through nanocapillaries is given on the basis of previous work. The current entering the capillary is assumed to be divided into a fraction transmitted through the capillary, a fraction flowing away via the capillary conductivity, and a fraction remaining within the capillary, which is responsible for its charge-up. The discharging current is assumed to be governed by the Frenkel-Poole process. At higher conductivities the analytical model shows a blocking of the ion transmission, which is in agreement with recent simulations. Also, it is shown that ion blocking observed in experiments is well reproduced by the analytical formula. Furthermore, the asymptotic fraction of transmitted ions is determined. Apart from the key controlling parameter (charge-to-energy ratio), the ratio of the capillary conductivity to the incident current is included in the model. Differences resulting from the nonlinear and linear limits of the Frenkel-Poole discharge are pointed out. Project supported by the Major State Basic Research Development Program of China (Grant No. 2010CB832902) and the National Natural Science Foundation of China (Grant Nos. 11275241, 11275238, 11105192, and 11375034).
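The charge balance this abstract describes (an incident current split into transmitted, conducted, and charge-accumulating fractions, with discharge governed by a Frenkel-Poole-type law) can be caricatured as a one-variable rate equation. The functional form and parameter values below are illustrative assumptions, not the paper's actual model:

```python
import math

def frenkel_poole_current(Q, sigma0=0.1, beta=2.0):
    """Toy Frenkel-Poole-like discharge current: the effective
    conductivity grows exponentially with the square root of the
    deposited charge Q, which stands in for the internal field.
    sigma0 and beta are illustrative placeholders."""
    return sigma0 * math.exp(beta * math.sqrt(Q))

def evolve_charge(I_in, sigma0=0.1, beta=2.0, dt=1e-3, steps=20000):
    """Forward-Euler integration of dQ/dt = I_in - I_discharge(Q):
    deposited charge grows until discharge balances the input."""
    Q = 0.0
    for _ in range(steps):
        Q += dt * (I_in - frenkel_poole_current(Q, sigma0, beta))
        Q = max(Q, 0.0)  # charge cannot go negative
    return Q
```

At steady state the discharge current equals the injected current, echoing the asymptotic behavior the abstract discusses; raising sigma0 drives the stored charge (and hence the guiding field) toward zero, qualitatively matching the blocking of transmission at high conductivity.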
Collaborative visual analytics of radio surveys in the Big Data era
NASA Astrophysics Data System (ADS)
Vohl, Dany; Fluke, Christopher J.; Hassan, Amr H.; Barnes, David G.; Kilborn, Virginia A.
2017-06-01
Radio survey datasets comprise an increasing number of individual observations stored as sets of multidimensional data. In large survey projects, astronomers commonly face limitations regarding: 1) interactive visual analytics of sufficiently large subsets of data; 2) synchronous and asynchronous collaboration; and 3) documentation of the discovery workflow. To support collaborative data inquiry, we present encube, a large-scale comparative visual analytics framework. encube can utilise advanced visualization environments such as the CAVE2 (a hybrid 2D and 3D virtual reality environment powered by a 100 Tflop/s GPU-based supercomputer and 84 million pixels) for collaborative analysis of large subsets of data from radio surveys. It can also run on standard desktops, providing a capable visual analytics experience across the display ecology. encube is composed of four primary units enabling compute-intensive processing, advanced visualisation, dynamic interaction, and parallel data query, along with data management. Its modularity will make it simple to incorporate astronomical analysis packages and Virtual Observatory capabilities developed within our community. We discuss how encube builds a bridge between high-end display systems (such as CAVE2) and the classical desktop, preserving all traces of the work completed on either platform, allowing the research process to continue wherever you are.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nehrir, M. Hashem
In this project we collaborated with two DOE National Laboratories, Pacific Northwest National Lab (PNNL) and Lawrence Berkeley National Lab (LBL). Dr. Hammerstrom of PNNL initially supported our project and was on the graduate committee of one of the Ph.D. students (graduated in 2014) who was supported by this project. He is also a committee member of a current graduate student of the PI who was supported by this project in the last two years (August 2014-July 2016). The graduate student is now supported by the Electrical and Computer Engineering (ECE) Department at Montana State University (MSU). Dr. Chris Marnay of LBL provided actual load data and the software WebOpt, developed at LBL for microgrid (MG) design, for our project. NEC Labs America, a private company, also supported our project, providing expertise and modest financial support. We also used the software "HOMER," originally developed at the National Renewable Energy Laboratory (NREL), the most recent version of which was made available to us by HOMER Energy, Inc., for MG (hybrid energy system) unit sizing. We compared the findings from WebOpt and HOMER and designed appropriately sized hybrid systems for our case studies. The objective of the project was to investigate real-time power management strategies for MGs using intelligent control, considering maximum feasible energy sustainability, reliability, and efficiency while minimizing cost and undesired environmental impact (emissions). Through analytic and simulation studies, we evaluated the suitability of several heuristic and artificial-intelligence (AI)-based optimization techniques that had potential for real-time MG power management, including genetic algorithms (GA), ant colony optimization (ACO), particle swarm optimization (PSO), and multi-agent systems (MAS), which are based on the negotiation of smart software-based agents. We found that PSO and MAS, in particular distributed MAS, were more efficient and better suited for our work.
We investigated the following:
• Intelligent load control - demand response (DR) - for frequency stabilization in islanded MGs (partially supported by PNNL).
• The impact of high penetration of solar photovoltaic (PV)-generated power at the distribution level (partially supported by PNNL).
• The application of AI approaches to renewable (wind, PV) power forecasting (proposed by the reviewers of our proposal).
• The application of AI approaches and DR for real-time MG power management (partially supported by NEC Labs America).
• The application of DR in dealing with the variability of wind power.
• Real-time MG power management using DR and storage (partially supported by NEC Labs America).
• The application of DR in enhancing the performance of the load-frequency controller.
• MAS-based wholesale and retail power market design for the smart grid.
The Earth Microbiome Project and Global Systems Biology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, Jack A.; Jansson, Janet K.; Knight, Rob
Recently, we published the first large-scale analysis of data from the Earth Microbiome Project (1, 2), a truly multidisciplinary research program involving more than 500 scientists and 27,751 samples acquired from 43 countries. These samples represent myriad specimen types and span a wide range of biotic and abiotic factors, geographic locations, and physicochemical properties. The database (https://qiita.ucsd.edu/emp/) is still growing, with over 90,000 amplicon datasets, >500 metagenomic runs, and metabolomics datasets from a similar number of samples. Importantly, the techniques, data, and analytical tools are all standardized and publicly accessible, providing a framework to support research at a scale of integration that just 7 years ago seemed impossible.
Learning Dilemmas in Undergraduate Student Independent Essays
ERIC Educational Resources Information Center
Wendt, Maria; Åse, Cecilia
2015-01-01
Essay-writing is generally viewed as the primary learning activity to foster independence and analytical thinking. In this article, we show that independent research projects do not necessarily lead to critical thinking. University-level education on conducting independent projects can, in several respects, counteract enhanced analytical skills.…
A Rigorous Investigation on the Ground State of the Penson-Kolb Model
NASA Astrophysics Data System (ADS)
Yang, Kai-Hua; Tian, Guang-Shan; Han, Ru-Qi
2003-05-01
By using either numerical calculations or analytical methods, such as the bosonization technique, the ground state of the Penson-Kolb model has been previously studied by several groups. Some physicists argued that, as far as the existence of superconductivity in this model is concerned, it is canonically equivalent to the negative-U Hubbard model. However, others did not agree. In the present paper, we investigate this model by an independent and rigorous approach. We show that the ground state of the Penson-Kolb model is nondegenerate and has a nonvanishing overlap with the ground state of the negative-U Hubbard model. Furthermore, we also show that the ground states of both models have the same good quantum numbers and may have superconducting long-range order at the same momentum q = 0. Our results support the equivalence between these models. The project was partially supported by the Special Funds for Major State Basic Research Projects (G20000365) and the National Natural Science Foundation of China under Grant No. 10174002.
NASA Technical Reports Server (NTRS)
Rickman, Doug; Shire, J.; Qualters, J.; Mitchell, K.; Pollard, S.; Rao, R.; Kajumba, N.; Quattrochi, D.; Estes, M., Jr.; Meyer, P.;
2009-01-01
Objectives. To provide an overview of four environmental public health surveillance projects developed by CDC and its partners for the Health and Environment Linked for Information Exchange, Atlanta (HELIX-Atlanta) and to illustrate common issues and challenges encountered in developing an environmental public health tracking system. Methods. HELIX-Atlanta, initiated in October 2003 to develop data linkage and analysis methods that can be used by the National Environmental Public Health Tracking Network (Tracking Network), conducted four projects. We highlight the projects' work, assess attainment of the HELIX-Atlanta goals, and discuss three surveillance attributes. Results. Among the major challenges was the complexity of analytic issues, which required multidisciplinary teams with technical expertise. This expertise and the data resided across multiple organizations. Conclusions. Establishing formal procedures for sharing data, defining data analysis standards, automating analyses, and committing staff with appropriate expertise are needed to support wide implementation of environmental public health tracking.
Person-centred web-based support--development through a Swedish multi-case study.
Josefsson, Ulrika; Berg, Marie; Koinberg, Ingalill; Hellström, Anna-Lena; Nolbris, Margaretha Jenholt; Ranerup, Agneta; Lundin, Carina Sparud; Skärsäter, Ingela
2013-10-19
Departing from the widespread use of the internet in modern society and the emerging use of web applications in healthcare, this project captures persons' needs and expectations in order to develop highly usable web resources. The purpose of this paper is to outline a multi-case research project focused on the development and evaluation of person-centred web-based support for people with long-term illness. To support the underlying idea of moving beyond the illness, we approach the development of web support from the perspective of the emergent area of person-centred care. The project aims to contribute to the ongoing development of web-based supports in health care and to the emerging field of person-centred care. The research design uses a meta-analytical approach through its focus on synthesizing experiences from four Swedish regional and national cases of design and use of web-based support in long-term illness. The cases include children (bladder dysfunction and urogenital malformation), young adults (living close to persons with mental illness), and two different cases of adults (women with breast cancer and childbearing women with type 1 diabetes). All of the cases are ongoing, though in different stages of design, implementation, and analysis. This, we argue, will lead to a synthesis of results on a meta-level not yet described. To allow valid comparisons between the four cases we explore and problematize them in relation to four main aspects: 1) the use of people's experiences and needs; 2) the role of theories in the design of person-centred web-based supports; 3) the evaluation of the effects on health outcomes for the informants involved; and 4) the development of a generic person-centred model for learning and social support for people with long-term illness and their significant others. Person-centred web-based support is a new area, and few studies focus on how web-based interventions can contribute to the development of person-centred care.
In summary, the main intention of the project outlined here is to contribute a synthesis of results on a meta-level from the four cases and thereby make a substantial contribution to the field of person-centred care.
Mirel, Barbara; Luo, Airong; Harris, Marcelline
2015-05-01
Collaborative research has many challenges. One under-researched challenge is how to align collaborators' research practices and evolving analytical reasoning with technologies, and configurations of technologies, that best support them. The goal of such alignment is to enhance collaborative problem-solving capabilities in research. Toward this end, we draw on our own research and a synthesis of the literature to characterize the workflow of collaborating scientists in systems-level renal disease research. We describe the various phases of a hypothetical workflow among diverse collaborators within and across laboratories, extending from their primary analysis through secondary analysis. For each phase, we highlight required technology supports and, at times, complementary organizational supports. This preliminary survey, which matches collaborators' analysis practices and needs in research projects to technological supports, aims ultimately at developing a research capability framework that can help scientists and technologists mutually understand workflows and the technologies that can enable and enhance them. Copyright © 2015 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coram, Jamie L.; Morrow, James D.; Perkins, David Nikolaus
2015-09-01
This document describes the PANTHER R&D Application, a proof-of-concept user interface application developed under the PANTHER Grand Challenge LDRD. The purpose of the application is to explore interaction models for graph analytics, drive algorithmic improvements from an end-user point of view, and support demonstration of PANTHER technologies to potential customers. The R&D Application implements a graph-centric interaction model that exposes analysts to the algorithms contained within the GeoGraphy graph analytics library. Users define geospatial-temporal semantic graph queries by constructing search templates based on nodes, edges, and the constraints among them. Users then analyze the results of the queries using both geo-spatial and temporal visualizations. Development of this application has made user experience an explicit driver for project and algorithmic level decisions that will affect how analysts one day make use of PANTHER technologies.
NASA Astrophysics Data System (ADS)
Xu, Huifang; Dai, Yuehua
2017-02-01
A two-dimensional analytical model of double-gate (DG) tunneling field-effect transistors (TFETs) with interface trapped charges is proposed in this paper. The influence of the channel mobile charges on the potential profile is also taken into account in order to improve the accuracy of the model. On the basis of the potential profile, the electric field is derived and the expression for the drain current is obtained by integrating the BTBT generation rate. The model can be used to study the impact of interface trapped charges on the surface potential, the shortest tunneling length, the drain current, and the threshold voltage for varying interface trapped charge densities, lengths of the damaged region, and structural parameters of the DG TFET, and can also be utilized to design charge-trapped memory devices based on TFETs. The main advantage of this model is its accuracy: its expression contains no fitting parameters and requires little computation. Very good agreement between the model calculations and the simulated results is observed for the potential, drain current, and threshold voltage. Project supported by the National Natural Science Foundation of China (No. 61376106), the University Natural Science Research Key Project of Anhui Province (No. KJ2016A169), and the Introduced Talents Project of Anhui Science and Technology University.
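The drain-current construction described here (integrate a band-to-band-tunneling generation rate over the tunneling region) can be illustrated with a generic Kane-type rate. The constants A and B and the field profile below are placeholders for illustration, not values or functional forms taken from this paper's model.

```python
import math

# Placeholder Kane-type material constants, in arbitrary units.
A, B = 1.0, 5.0

def btbt_rate(E):
    """Generic Kane-type BTBT generation rate G(E) = A*E^2*exp(-B/E)
    for a local electric field E > 0; zero otherwise."""
    return A * E * E * math.exp(-B / E) if E > 0 else 0.0

def drain_current(field_profile, dx):
    """Trapezoidal integration of the generation rate along the
    channel, mirroring 'integrate G over the tunneling region'.
    field_profile is a list of field samples spaced dx apart."""
    rates = [btbt_rate(E) for E in field_profile]
    return sum((rates[i] + rates[i + 1]) * 0.5 * dx
               for i in range(len(rates) - 1))
```

Because the rate depends exponentially on the field, a uniformly higher field profile yields a sharply larger integrated current, which is why trapped interface charge that reshapes the field has such a strong effect on the device characteristics.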
ON-SITE SOLID-PHASE EXTRACTION AND LABORATORY ...
Fragrance materials, such as synthetic musks in aqueous samples, are normally analyzed by GC/MS in the selected ion monitoring (SIM) mode to provide maximum sensitivity after liquid-liquid extraction of 1-L samples. A 1-L sample, however, usually provides too little analyte for full-scan data acquisition. An on-site extraction method for extracting synthetic musks from 60 L of wastewater effluent has been developed. Such a large sample volume permits high-quality, full-scan mass spectra to be obtained for various synthetic musk compounds. Quantification of these compounds was conveniently achieved from the full-scan data directly, without preparing SIM descriptors for each compound to acquire SIM data. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-sol
IN SITU SOLID-PHASE EXTRACTION AND ANALYSIS OF ...
Fragrance materials, such as synthetic musks in aqueous samples, are normally analyzed by GC/MS in the selected ion monitoring (SIM) mode to provide maximum sensitivity after liquid-liquid extraction of 1-L samples. A 1-L sample, however, usually provides too little analyte for full-scan data acquisition. We have developed an on-site extraction method for extracting synthetic musks from 60 L of wastewater effluent. Such a large sample volume permits high-quality, full-scan mass spectra to be obtained for various synthetic musk compounds. Quantification of these compounds was conveniently achieved from the full-scan data directly, without preparing SIM descriptors for each compound to acquire SIM data. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-s
ANALYTICAL CHEMISTRY RESEARCH NEEDS FOR ...
The consensus among environmental scientists and risk assessors is that the fate and effects of pharmaceutical and personal care products (PPCPs) in the environment are poorly understood. Many classes of PPCPs have yet to be investigated. Acquisition of trends data for a suite of PPCPs (representatives from each of numerous significant classes), shown to recur amongst municipal wastewater treatment plants across the country, may prove of key importance. The focus of this paper is an overview of some of the analytical methods being developed at the Environmental Protection Agency and their application to wastewater and surface water samples. Because PPCPs are generally micro-pollutants, emphasis is on development of enrichment and pre-concentration techniques using various means of solid-phase extraction. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCP
Helios: Understanding Solar Evolution Through Text Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Randazzese, Lucien
This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand-curated set of full-text documents from Thomson Scientific and other sources.
D'Souza, Mark; Sulakhe, Dinanath; Wang, Sheng; Xie, Bing; Hashemifar, Somaye; Taylor, Andrew; Dubchak, Inna; Conrad Gilliam, T; Maltsev, Natalia
2017-01-01
Recent technological advances in genomics allow the production of biological data at unprecedented tera- and petabyte scales. Efficient mining of these vast and complex datasets for the needs of biomedical research critically depends on a seamless integration of the clinical, genomic, and experimental information with prior knowledge about genotype-phenotype relationships. Such experimental data accumulated in publicly available databases should be accessible to a variety of algorithms and analytical pipelines that drive computational analysis and data mining. We present an integrated computational platform Lynx (Sulakhe et al., Nucleic Acids Res 44:D882-D887, 2016) (http://lynx.cri.uchicago.edu), a web-based database and knowledge extraction engine. It provides advanced search capabilities and a variety of algorithms for enrichment analysis and network-based gene prioritization. It gives public access to the Lynx integrated knowledge base (LynxKB) and its analytical tools via user-friendly web services and interfaces. The Lynx service-oriented architecture supports annotation and analysis of high-throughput experimental data. Lynx tools assist the user in extracting meaningful knowledge from LynxKB and experimental data, and in the generation of weighted hypotheses regarding the genes and molecular mechanisms contributing to human phenotypes or conditions of interest. The goal of this integrated platform is to support the end-to-end analytical needs of various translational projects.
Fusion plasma theory project summaries
NASA Astrophysics Data System (ADS)
1993-10-01
This Project Summary book is a published compilation consisting of short descriptions of each project supported by the Fusion Plasma Theory and Computing Group of the Advanced Physics and Technology Division of the Department of Energy, Office of Fusion Energy. The summaries contained in this volume were written by the individual contractors with minimal editing by the Office of Fusion Energy. Previous summaries were published in February of 1982 and December of 1987. The Plasma Theory program is responsible for the development of concepts and models that describe and predict the behavior of a magnetically confined plasma. Emphasis is given to the modelling and understanding of the processes controlling transport of energy and particles in a toroidal plasma and to supporting the design of the International Thermonuclear Experimental Reactor (ITER). A tokamak transport initiative was begun in 1989 to improve understanding of how energy and particles are lost from the plasma by mechanisms that transport them across field lines. The Plasma Theory program has actively participated in this initiative. Recently, increased attention has been given to issues of importance to the proposed Tokamak Physics Experiment (TPX). Particular attention has been paid to containment and thermalization of fast alpha particles produced in a burning fusion plasma, as well as control of sawteeth, current drive, impurity control, and design of improved auxiliary heating. In addition, general models of plasma behavior are developed from physics features common to different confinement geometries. This work uses both analytical and numerical techniques. The Fusion Theory program supports research projects at U.S. government laboratories, universities, and industrial contractors. Its support of theoretical work at universities contributes to the Office of Fusion Energy mission of training scientific manpower for the U.S. Fusion Energy Program.
Nurses' perspectives on the care provided to cancer patients.
Watts, Rosemary; Botti, Mari; Hunter, Marion
2010-01-01
Optimal care for patients with cancer involves the provision of effective physical and psychological care. Nurses are key providers of this care; however, the effectiveness of care is dependent on the nurses' training, skills, attitudes, and beliefs. The study reported in this article explored cancer nurses' perceptions of their ability to provide psychosocial care to adults with cancer and their subsequent evaluation of the effectiveness of the care provided. This study was the first part of a larger project that evaluated the effectiveness of Proctor's model of clinical supervision in an acute care oncology environment. An exploratory qualitative design was used for this study. One focus group interview was conducted with 10 randomly selected registered nurses working within the oncology units at a major Melbourne tertiary referral hospital. Analytic themes were developed from the coded data using content analysis. The 4 analytic themes to emerge from the data were frustration, difficult to look after yourself, inadequate communication processes, and anger. The findings from this study indicate that, although informal mechanisms of support are available for oncology nurses, most of these services are not accessed. Leaders in cancer care hospital settings need to urgently develop and implement a model of support for their oncology nurses who are attempting to provide psychosocial support to oncology patients.
McCahill, Peter W; Noste, Erin E; Rossman, A J; Callaway, David W
2014-12-01
Disasters create major strain on energy infrastructure in affected communities. Advances in microgrid technology offer the potential to improve "off-grid" mobile disaster medical response capabilities beyond traditional diesel generation. The Carolinas Medical Center's mobile emergency medical unit (MED-1) Green Project (M1G) is a multi-phase project designed to demonstrate the benefits of integrating distributed generation (DG), high-efficiency batteries, and "smart" energy utilization in support of major out-of-hospital medical response operations. Carolinas MED-1 is a mobile medical facility composed of a fleet of vehicles and trailers that provides comprehensive medical care capabilities to support disaster response and special-event operations. The M1G project partnered with local energy companies to deploy energy analytics and an energy microgrid in support of mobile clinical operations for the 2012 Democratic National Convention (DNC) in Charlotte, North Carolina (USA). Energy use data recorded throughout the DNC were analyzed to create energy utilization models that integrate advanced battery technology, solar photovoltaics (PV), and energy conservation measures (ECM) to improve future disaster response operations. The generators that supply power for MED-1 have a minimum loading ratio (MLR) of 30 kVA, meaning that loads below 30 kW consume diesel fuel at the same rate as a 30 kW load. Data gathered from the two DNC training and support deployments showed the maximum load of MED-1 to be around 20 kW. This discrepancy between the MLR and the actual load leads to significant energy waste. The lack of an energy storage system reduces generator efficiency and limits integration of alternative energy generation strategies. A storage system would also allow alternative generation sources, such as PV, to be incorporated.
Modeling with a 450 kWh battery bank and 13.5 kW PV array showed a 2-fold increase in potential deployment times using the same amount of fuel versus the current conventional system. The M1G Project demonstrated that the incorporation of a microgrid energy management system and a modern battery system maximizes the MED-1 generators' output. Using a 450 kWh battery bank and 13.5 kW PV array, deployment operations time could be more than doubled before refueling. This marks a dramatic increase in patient care capabilities and has significant public health implications. The results highlight the value of smart-microgrid technology in developing energy-independent mobile medical capabilities and expanding cost-effective, high-quality medical response.
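The minimum-loading-ratio argument above can be sketched numerically. In this illustrative model, only the 30 kW MLR, the ~20 kW observed load, and the 13.5 kW PV array come from the report; the tank size, burn rate, and PV capacity factor are hypothetical placeholders, not figures from the M1G study:

```python
# Illustrative sketch of the MLR fuel-waste argument. Values marked
# ASSUMED are hypothetical; the 30 kW MLR, ~20 kW load, and 13.5 kW PV
# come from the abstract.

FUEL_TANK_GAL = 200.0      # ASSUMED on-board diesel capacity
BURN_GPH_AT_MLR = 2.2      # ASSUMED gallons/hour at the 30 kW minimum load
MLR_KW = 30.0

def conventional_hours(load_kw: float) -> float:
    """Generator only: any load at or below the MLR burns fuel as if
    it were a 30 kW load, so runtime is fixed by the MLR burn rate."""
    return FUEL_TANK_GAL / BURN_GPH_AT_MLR

def microgrid_hours(load_kw: float, pv_kw: float = 13.5,
                    pv_capacity_factor: float = 0.25) -> float:
    """With a battery, the generator runs only at its efficient 30 kW
    point (charging the battery), so the tank delivers its full energy;
    PV further offsets part of the average load."""
    deliverable_kwh = conventional_hours(load_kw) * MLR_KW
    net_load_kw = load_kw - pv_kw * pv_capacity_factor
    return deliverable_kwh / net_load_kw

print(round(conventional_hours(20.0), 1))  # runtime capped by MLR waste
print(round(microgrid_hours(20.0), 1))     # roughly doubled, as reported
```

Even with these placeholder figures, eliminating idling below the MLR and adding a modest PV offset roughly doubles runtime per tank, consistent with the modeling result described in the abstract.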
Quality indicators in laboratory medicine: a fundamental tool for quality and patient safety.
Plebani, Mario; Sciacovelli, Laura; Marinova, Mariela; Marcuccitti, Jessica; Chiozza, Maria Laura
2013-09-01
The identification of reliable quality indicators (QIs) is a crucial step in enabling users to quantify the quality of laboratory services. The current lack of attention to extra-laboratory factors is in stark contrast with the body of evidence pointing to the multitude of errors that continue to occur in the pre- and post-analytical phases. Different QIs and terminologies are currently in use, and there is therefore a need to harmonize the proposed QIs. A model of quality indicators (MQI) has been consensually developed by a group of clinical laboratories within a project launched by a working group of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC). The model includes 57 QIs related to key processes (35 pre-, 7 intra-, and 15 post-analytical) and 3 related to support processes. The developed MQI and the data collected provide evidence of the feasibility of the project to harmonize currently available QIs, but further efforts should be made to involve more clinical laboratories and to collect a more consistent body of data. Copyright © 2012 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Preparation of Morpheus Vehicle for Vacuum Environment Testing
NASA Technical Reports Server (NTRS)
Sandoval, Armando
2016-01-01
The main objective for this summer 2016 tour was to prepare the Morpheus vehicle for its upcoming test inside the Plum Brook Station vacuum chamber at NASA's John H. Glenn Research Center. My contributions to this project were mostly analytical in nature: providing numerical models to validate test data, generating computer-aided analyses for the structural support of the vehicle's engine, and designing a vacuum can to protect the high-speed camera used during testing. Furthermore, I was also tasked with designing a tank toroidal spray bar system.
Hot-spot investigations of utility scale panel configurations
NASA Technical Reports Server (NTRS)
Arnett, J. C.; Dally, R. B.; Rumburg, J. P.
1984-01-01
The causes of array faults and efforts to mitigate their effects are examined. Research is concentrated on the panel for the 900 kW second phase of the Sacramento Municipal Utility District (SMUD) project. The panel is designed for hot-spot tolerance without compromising efficiency under normal operating conditions. Series/paralleling internal to each module improves tolerance in the power quadrant to cell short or open circuits. Analytical methods are developed for predicting worst-case shade patterns and calculating the resultant cell temperature. Experiments conducted on a prototype panel support the analytical calculations.
Let's Talk Learning Analytics: A Framework for Implementation in Relation to Student Retention
ERIC Educational Resources Information Center
West, Deborah; Heath, David; Huijser, Henk
2016-01-01
This paper presents a dialogical tool for the advancement of learning analytics implementation for student retention in Higher Education institutions. The framework was developed as an outcome of a project commissioned and funded by the Australian Government's "Office for Learning and Teaching". The project took a mixed-method approach…
Web Analytics Reveal User Behavior: TTU Libraries' Experience with Google Analytics
ERIC Educational Resources Information Center
Barba, Ian; Cassidy, Ryan; De Leon, Esther; Williams, B. Justin
2013-01-01
Proper planning and assessment surveys of projects for academic library Web sites will not always be predictive of real world use, no matter how many responses they might receive. In this case, multiple-phase development, librarian focus groups, and patron surveys performed before implementation of such a project inaccurately overrated utility and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melaina, Marc; Warner, Ethan; Sun, Yongling
The Alternative and Renewable Fuel and Vehicle Technologies Program (ARFVTP) supports a wide range of alternative, low-carbon fuel and vehicle projects in California. This report focuses on two types of ARFVTP benefits. Expected benefits reflect successful deployment of vehicles and fuels supported through program projects. Market transformation benefits represent benefits resulting from project influences on future market conditions that accelerate technology adoption rates. Data collected directly from ARFVTP projects funded from 2009 to the first quarter of 2014 are used as inputs to the benefits analysis, where possible. Expected benefit estimation methods rely primarily upon project-level data and result in single-point yearly estimates within the 2011 to 2025 analysis period. Results suggest that the 178 projects evaluated for expected benefits, representing an investment of $351.3 million in ARFVTP funds, could reduce petroleum fuel use by 236 million gallons per year and greenhouse gases (GHGs) by 1.7 million metric tonnes of carbon dioxide equivalent (MMTCO2e) per year by 2025. Market transformation benefits accrue in addition to expected benefits. They are inherently more uncertain and theoretical than expected benefits, and are therefore reported as high and low ranges, with results suggesting reductions of 1.1 to 2.5 MMTCO2e per year in GHGs and 102 million to 330 million gallons per year in petroleum fuel use by 2025. Taking both benefit types into account, results suggest that ARFVTP projects have the potential to make substantial progress toward meeting California's long-term GHG and petroleum fuel use reduction goals. As additional project data become available and market success with alternative and renewable fuels and vehicles grows, the analytic framework relied upon to develop these estimates will become more rigorous and will have a greater capacity to inform future ARFVTP activities.
VAST Challenge 2016: Streaming Visual Analytics
2016-10-25
understand rapidly evolving situations. To support such tasks, visual analytics solutions must move well beyond systems that simply provide real-time... received. Mini-Challenge 1: Design Challenge. Mini-Challenge 1 focused on systems to support security and operational analytics at the Euybia... Challenge 1 was to solicit novel approaches for streaming visual analytics that push the boundaries of what constitutes a visual analytics system, and to...
ERIC Educational Resources Information Center
Jolley, Dianne F.; Wilson, Stephen R.; Kelso, Celine; O'Brien, Glennys; Mason, Claire E.
2016-01-01
This project utilizes visual and critical thinking approaches to develop a higher-education synergistic prelab training program for a large second-year undergraduate analytical chemistry class, directing more of the cognitive learning to the prelab phase. This enabled students to engage in more analytical thinking prior to engaging in the…
Neumann, Cedric; Ramotowski, Robert; Genessay, Thibault
2011-05-13
Forensic examinations of ink have been performed since the beginning of the 20th century. Since the 1960s, the International Ink Library, maintained by the United States Secret Service, has supported those analyses. Until 2009, the search and identification of inks were essentially performed manually. This paper describes the results of a project designed to improve ink samples' analytical and search processes. The project focused on the development of improved standardization procedures to ensure the best possible reproducibility between analyses run on different HPTLC plates. The successful implementation of this new calibration method enabled the development of mathematical algorithms and of a software package to complement the existing ink library. Copyright © 2010 Elsevier B.V. All rights reserved.
IPEDS Analytics: Delta Cost Project Database 1987-2010. Data File Documentation. NCES 2012-823
ERIC Educational Resources Information Center
Lenihan, Colleen
2012-01-01
The IPEDS Analytics: Delta Cost Project Database was created to make data from the Integrated Postsecondary Education Data System (IPEDS) more readily usable for longitudinal analyses. Currently spanning the period from 1987 through 2010, it has a total of 202,800 observations on 932 variables derived from the institutional characteristics,…
Assessment of Learning in Digital Interactive Social Networks: A Learning Analytics Approach
ERIC Educational Resources Information Center
Wilson, Mark; Gochyyev, Perman; Scalise, Kathleen
2016-01-01
This paper summarizes initial field-test results from data analytics used in the work of the Assessment and Teaching of 21st Century Skills (ATC21S) project, on the "ICT Literacy--Learning in digital networks" learning progression. This project, sponsored by Cisco, Intel and Microsoft, aims to help educators around the world enable…
A Big Data Analytics Methodology Program in the Health Sector
ERIC Educational Resources Information Center
Lawler, James; Joseph, Anthony; Howell-Barber, H.
2016-01-01
The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…
Temporal abstraction-based clinical phenotyping with Eureka!
Post, Andrew R; Kurc, Tahsin; Willard, Richie; Rathod, Himanshu; Mansour, Michel; Pai, Akshatha Kalsanka; Torian, William M; Agravat, Sanjay; Sturm, Suzanne; Saltz, Joel H
2013-01-01
Temporal abstraction, a method for specifying and detecting temporal patterns in clinical databases, is very expressive and performs well, but it is difficult for clinical investigators and data analysts to understand. Such patterns are critical in phenotyping patients using their medical records in research and quality improvement. We have previously developed the Analytic Information Warehouse (AIW), which computes such phenotypes using temporal abstraction but requires software engineers to use it. We have extended the AIW's web user interface, Eureka! Clinical Analytics, to support specifying phenotypes using an alternative model that we developed with clinical stakeholders. The software converts phenotypes from this model to that of temporal abstraction prior to data processing. The model can represent all phenotypes in a quality improvement project and a growing set of phenotypes in a multi-site research study. Phenotyping that is accessible to investigators and IT personnel may enable its broader adoption.
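As a hedged illustration of what a temporal abstraction looks like (this is not the AIW/Eureka! implementation; the threshold, gap, and data are hypothetical), raw timestamped values can be merged into interval-based phenotype episodes:

```python
# Minimal sketch of an interval abstraction: merge consecutive
# above-threshold glucose readings separated by at most `max_gap`
# into "episode" intervals. All values are hypothetical.
from datetime import date, timedelta

def abstract_intervals(observations, threshold=200, max_gap=timedelta(days=90)):
    """observations: iterable of (date, value). Returns a list of
    (start, end) episodes covering above-threshold readings."""
    highs = sorted(d for d, v in observations if v > threshold)
    episodes = []
    for d in highs:
        if episodes and d - episodes[-1][1] <= max_gap:
            episodes[-1] = (episodes[-1][0], d)   # extend current episode
        else:
            episodes.append((d, d))               # start a new episode
    return episodes

obs = [(date(2012, 1, 5), 250), (date(2012, 2, 1), 230),
       (date(2012, 9, 1), 180), (date(2013, 1, 10), 260)]
print(abstract_intervals(obs))
# two episodes: Jan-Feb 2012 and Jan 2013 (the 180 reading is below threshold)
```

A derived interval like this can then feed further pattern logic (e.g., "episode longer than 30 days"), which is the kind of layered specification that makes temporal abstraction expressive but hard for non-engineers to author directly.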
Waska, R T
1999-01-01
Certain patients, through projective identification and splitting mechanisms, test the boundaries of the analytic situation. These patients are usually experiencing overwhelming paranoid-schizoid anxieties and view the object as ruthless and persecutory. Using a Kleinian perspective, the author advocates greater analytic flexibility with these difficult patients who seem unable to use the standard analytic environment. The concept of self-disclosure is examined, and the author discusses certain technical situations where self-disclosure may be helpful. (The Journal of Psychotherapy Practice and Research 1999; 8:225-233)
Phonon dispersion on Ag (100) surface: A modified analytic embedded atom method study
NASA Astrophysics Data System (ADS)
Xiao-Jun, Zhang; Chang-Le, Chen
2016-01-01
Within the harmonic approximation, the analytic expression of the dynamical matrix is derived based on the modified analytic embedded atom method (MAEAM) and the dynamics theory of the surface lattice. The surface phonon dispersions along the three major symmetry directions Γ̄X̄, Γ̄M̄, and X̄M̄ are calculated for the clean Ag (100) surface by using our derived formulas. We then discuss the polarization and localization of surface modes at the points X̄ and M̄ by plotting the squared polarization vectors as a function of the layer index. The phonon frequencies of the surface modes calculated by MAEAM are compared with the available experimental and other theoretical data. It is found that the present results are generally in agreement with the referenced experimental or theoretical results, with a maximum deviation of 10.4%. The agreement shows that the modified analytic embedded atom method is a reasonable many-body potential model for quickly describing surface lattice vibration. It also lays a significant foundation for studying surface lattice vibration in other metals. Project supported by the National Natural Science Foundation of China (Grant Nos. 61471301 and 61078057), the Scientific Research Program Funded by the Shaanxi Provincial Education Department, China (Grant No. 14JK1301), and the Specialized Research Fund for the Doctoral Program of Higher Education, China (Grant No. 20126102110045).
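For readers unfamiliar with dynamical-matrix calculations, the simplest analogue is the 1D monatomic chain (lattice constant a, atomic mass m, nearest-neighbour force constant K), whose dynamical matrix reduces to a scalar and yields the textbook dispersion. This illustrates the method only; it is not the MAEAM surface result:

```latex
\omega(k) \;=\; 2\sqrt{\frac{K}{m}}\,\left|\sin\!\left(\frac{ka}{2}\right)\right|
```

The surface calculation generalizes this: diagonalizing the slab dynamical matrix at each wavevector along Γ̄X̄, Γ̄M̄, and X̄M̄ gives one frequency per mode, and modes whose eigenvector weight concentrates in the top layers are the surface modes discussed above.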
A Goal Programming R&D Project Funding Model of the U.S. Army Strategic Defense Command Using the Analytic Hierarchy Process
Anderson, Steven M.
Naval Postgraduate School, Monterey, CA
1987-09-01
A goal programming research and development (R&D) project funding model of the U.S. Army Strategic Defense Command, developed using the Analytic Hierarchy Process.
KOLAM: a cross-platform architecture for scalable visualization and tracking in wide-area imagery
NASA Astrophysics Data System (ADS)
Fraser, Joshua; Haridas, Anoop; Seetharaman, Guna; Rao, Raghuveer M.; Palaniappan, Kannappan
2013-05-01
KOLAM is an open, cross-platform, interoperable, scalable and extensible framework supporting a novel multiscale spatiotemporal dual-cache data structure for big data visualization and visual analytics. This paper focuses on the use of KOLAM for target tracking in high-resolution, high-throughput wide-format video, also known as wide-area motion imagery (WAMI). It was originally developed for the interactive visualization of extremely large geospatial imagery of high spatial and spectral resolution. KOLAM is platform, operating system and (graphics) hardware independent, and supports embedded datasets scalable from hundreds of gigabytes to feasibly petabytes in size on clusters, workstations, desktops and mobile computers. In addition to rapid roam, zoom and hyper-jump spatial operations, a large number of simultaneously viewable embedded pyramid layers (also referred to as multiscale or sparse imagery), interactive colormap and histogram enhancement, spherical projection and terrain maps are supported. The KOLAM software architecture was extended to support airborne wide-area motion imagery by organizing spatiotemporal tiles in very large format video frames using a temporal cache of tiled pyramid cached data structures. The current version supports WAMI animation, fast intelligent inspection, trajectory visualization and target tracking (digital tagging); the latter by interfacing with external automatic tracking software. One of the critical needs for working with WAMI is a supervised tracking and visualization tool that allows analysts to digitally tag multiple targets, quickly review and correct tracking results and apply geospatial visual analytic tools to the generated trajectories. One-click manual tracking combined with multiple automated tracking algorithms is available to assist the analyst and increase human effectiveness.
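The tiled pyramid cache described above can be sketched as a small LRU structure keyed by pyramid level, tile coordinates, and frame time. This is a hypothetical minimal sketch of the general technique, not KOLAM's actual dual-cache design:

```python
from collections import OrderedDict

class TileCache:
    """Minimal LRU cache for multiscale spatiotemporal tiles, keyed by
    (pyramid_level, tile_x, tile_y, frame). Hypothetical sketch only."""
    def __init__(self, capacity: int = 1024):
        self.capacity = capacity
        self._tiles = OrderedDict()   # key -> tile bytes, LRU order

    def get(self, level: int, x: int, y: int, t: int):
        key = (level, x, y, t)
        if key not in self._tiles:
            return None               # cache miss: caller fetches from disk
        self._tiles.move_to_end(key)  # mark most recently used
        return self._tiles[key]

    def put(self, level: int, x: int, y: int, t: int, data: bytes):
        key = (level, x, y, t)
        self._tiles[key] = data
        self._tiles.move_to_end(key)
        if len(self._tiles) > self.capacity:
            self._tiles.popitem(last=False)   # evict least recently used

cache = TileCache(capacity=2)
cache.put(0, 0, 0, 0, b"tile-a")
cache.put(0, 1, 0, 0, b"tile-b")
cache.get(0, 0, 0, 0)                 # touch tile-a
cache.put(0, 2, 0, 0, b"tile-c")      # evicts tile-b, not tile-a
print(cache.get(0, 1, 0, 0))          # None
```

Keeping frame time in the key is what lets a spatial pyramid cache double as a temporal one: animating WAMI frames and roaming/zooming within a frame both become lookups against the same structure.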
Cost-effectiveness of the Health X Project for tuberculosis control in China.
Wang, W-B; Zhang, H; Petzold, M; Zhao, Q; Xu, B; Zhao, G-M
2014-08-01
Between 2002 and 2008, China's National Tuberculosis Control Programme created the Health X Project, financed in part by a World Bank loan, with additional funding from the UK Department for International Development. The objective was to assess the cost-effectiveness of the Project and its financial impact on tuberculosis (TB) control in China. A decision-analytic model was used to evaluate the cost-effectiveness of the Project. Sensitivity analysis was used to assess the impact of different scenarios and assumptions on the results. The primary outcomes of the study were cost per disability-adjusted life-year (DALY) saved and incremental DALYs saved. In comparison with alternative scenario 1, the Project detected 1.6 million additional cases, prevented 44,000 deaths, and saved a total of 18.4 million DALYs. The Project strategies cost approximately 953 Chinese yuan (CNY) per DALY saved (vs. CNY1140 in the control areas), and saved an estimated CNY17.5 billion in comparison with the unchanged alternative scenario (scenario 1) or CNY10.8 billion with the control scenario (scenario 2). The Project strategies were affordable and of comparable cost-effectiveness to those of other developing countries. The results also provide strong support for the existing policy of scaling up DOTS in China.
Negotiating Story Entry: A Micro-Analytic Study of Storytelling Projection in English and Japanese
ERIC Educational Resources Information Center
Yasui, Eiko
2011-01-01
This dissertation offers a micro-analytic study of the use of language and body during storytelling in American English and Japanese conversations. Specifically, I focus on its beginning and explore how a story is "projected." A beginning of an action or activity is where an incipient speaker negotiates the floor with co-participants; they…
Signals: Applying Academic Analytics
ERIC Educational Resources Information Center
Arnold, Kimberly E.
2010-01-01
Academic analytics helps address the public's desire for institutional accountability with regard to student success, given the widespread concern over the cost of higher education and the difficult economic and budgetary conditions prevailing worldwide. Purdue University's Signals project applies the principles of analytics widely used in…
Do knowledge translation (KT) plans help to structure KT practices?
Tchameni Ngamo, Salomon; Souffez, Karine; Lord, Catherine; Dagenais, Christian
2016-06-17
A knowledge translation (KT) planning template is a roadmap laying out the core elements to be considered when structuring the implementation of KT activities by researchers and practitioners. Since 2010, the Institut national de santé publique du Québec (INSPQ; Québec Public Health Institute) has provided tools and guidance to in-house project teams to help them develop KT plans. This study sought to identify which dimensions were included in those plans and how they were integrated. The results will be of interest to funding agencies and scientific organizations that provide frameworks for KT planning. The operationalization of KT planning dimensions was assessed in a mixed methods case study of 14 projects developed at the INSPQ between 2010 and 2013. All plans were assessed (rated) using an analytical tool developed for this study and data from interviews with the planning coordinators. The analytical tool and interview guide were based on eight core KT dimensions identified in the literature. Analysis of the plans and interviews revealed that the dimensions best integrated into the KT plans were 'analysis of the context (barriers and facilitators) and of users' needs', 'knowledge to be translated', 'KT partners', 'KT strategies' and, to a lesser extent, 'overall KT approach'. The least well integrated dimensions were 'knowledge about knowledge users', 'KT process evaluation' and 'resources'. While the planning coordinators asserted that a plan did not need to include all the dimensions to ensure its quality and success, the dimensions that received less attention might have been better incorporated had they been supported with more related instruments and sustained methodological guidance. Overall, KT planning templates appear to be an appreciated mechanism for supporting reflexive KT practices. Based on this study and our experience, we recommend using KT plans cautiously when assessing project efficacy and funding.
Sedimentary Geothermal Feasibility Study: October 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustine, Chad; Zerpa, Luis
The objective of this project is to analyze the feasibility of commercial geothermal projects using numerical reservoir simulation, considering a sedimentary reservoir with low permeability that requires productivity enhancement. A commercial thermal reservoir simulator (STARS, from the Computer Modeling Group, CMG) is used in this work for numerical modeling. In the first stage of this project (FY14), a hypothetical numerical reservoir model was developed and validated against an analytical solution. The following model parameters were considered to obtain an acceptable match between the numerical and analytical solutions: grid block size, time step, and reservoir areal dimensions; the latter related to boundary effects on the numerical solution. Systematic model runs showed that insufficient grid sizing generates numerical dispersion that causes the numerical model to underestimate the thermal breakthrough time compared to the analytic model. As grid sizing is decreased, the model results converge on a solution. Likewise, insufficient reservoir model area introduces boundary effects in the numerical solution that cause the model results to differ from the analytical solution.
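The grid-convergence finding (coarse grids underestimate thermal breakthrough time because numerical dispersion smears the front) can be reproduced with a toy model. The first-order upwind advection scheme below is a generic stand-in, not the STARS simulator, and all parameters are hypothetical:

```python
def breakthrough_time(nx: int, L: float = 1.0, v: float = 1.0,
                      cfl: float = 0.5, threshold: float = 0.1) -> float:
    """Advect a thermal front through nx grid blocks with first-order
    upwind differencing and return the time at which the outlet first
    exceeds `threshold`. The scheme's numerical diffusion smears the
    front, so coarse grids report breakthrough too early."""
    dx = L / nx
    dt = cfl * dx / v
    T = [0.0] * nx
    T[0] = 1.0           # injected front held at the inlet
    t = 0.0
    while T[-1] < threshold:
        prev = T[:]
        for i in range(1, nx):
            T[i] = prev[i] - v * dt / dx * (prev[i] - prev[i - 1])
        t += dt
    return t

exact = 1.0                       # analytic breakthrough time L / v
coarse = breakthrough_time(25)    # strong numerical dispersion
fine = breakthrough_time(400)     # converging toward the analytic value
print(coarse, fine)               # coarse < fine < 1.0
```

As nx grows, the computed breakthrough time converges on the analytic value L/v, mirroring the grid-refinement behavior described in the abstract.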
Ortíz, Miguel A; Felizzola, Heriberto A; Nieto Isaza, Santiago
2015-01-01
The project selection process is a crucial step for healthcare organizations at the moment of implementing six sigma programs in both administrative and caring processes. However, six sigma project selection is often defined as a decision-making process with interaction and feedback between criteria, so it is necessary to explore different methods to help healthcare companies determine the six sigma projects that provide the maximum benefits. This paper describes the application of both ANP (Analytic Network Process) and DEMATEL (Decision Making Trial and Evaluation Laboratory)-ANP in a public medical centre to establish the most suitable six sigma project; finally, these methods were compared to evaluate their performance in the decision-making process. ANP and DEMATEL-ANP were used to evaluate 6 six sigma project alternatives under an evaluation model composed of 3 strategies, 4 criteria and 15 sub-criteria. Judgement matrices were completed by the six sigma team, whose participants worked in different departments of the medical centre. The improvement of care opportunity in obstetric outpatients was selected as the most suitable six sigma project, with a score of 0.117 as its contribution to the organization's goals. DEMATEL-ANP performed better in the decision-making process since it reduced the error probability due to interactions and feedback. ANP and DEMATEL-ANP effectively supported six sigma project selection processes, helping to create a complete framework that guarantees the prioritization of projects that provide maximum benefits to healthcare organizations. As DEMATEL-ANP performed better, it should be used by practitioners involved in decisions related to the implementation of six sigma programs in the healthcare sector, accompanied by the adequate identification of the evaluation criteria that support the decision-making model. Thus, this comparative study contributes to choosing more effective approaches in this field. Suggestions for further work are also proposed so that these methods can be applied more adequately in six sigma project selection processes in healthcare.
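The priority scores produced by AHP-family methods (ANP generalizes the same machinery to networks with feedback) are principal-eigenvector weights of pairwise-comparison matrices. A minimal sketch with a hypothetical 3x3 matrix, not the study's actual judgement data:

```python
def ahp_priorities(A, iters: int = 100):
    """Power iteration for the principal eigenvector of a pairwise
    comparison matrix A, normalized so the weights sum to 1."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical judgements: criterion 1 is 3x as important as criterion 2
# and 5x as important as criterion 3 (reciprocals fill the lower triangle).
A = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   2.0],
     [1/5.0, 1/2.0, 1.0]]
w = ahp_priorities(A)
print([round(x, 3) for x in w])   # weights ordered w[0] > w[1] > w[2]
```

In ANP these weights populate a supermatrix whose columns are then made stochastic and raised to powers until they converge, which is how interdependence and feedback between criteria enter the final project scores.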
PESTICIDE ANALYTICAL METHODS TO SUPPORT DUPLICATE-DIET HUMAN EXPOSURE MEASUREMENTS
Historically, analytical methods for determination of pesticides in foods have been developed in support of regulatory programs and are specific to food items or food groups. Most of the available methods have been developed, tested and validated for relatively few analytes an...
Person-centred web-based support - development through a Swedish multi-case study
2013-01-01
Background Given the widespread use of the internet in modern society and the emerging use of web applications in healthcare, this project captures persons' needs and expectations in order to develop highly usable web resources. The purpose of this paper is to outline a multi-case research project focused on the development and evaluation of person-centred web-based support for people with long-term illness. To support the underlying idea to move beyond the illness, we approach the development of web support from the perspective of the emergent area of person-centred care. The project aims to contribute to the ongoing development of web-based supports in health care and to the emerging field of person-centred care. Methods/Design The research design uses a meta-analytical approach through its focus on synthesizing experiences from four Swedish regional and national cases of design and use of web-based support in long-term illness. The cases include children (bladder dysfunction and urogenital malformation), young adults (living close to persons with mental illness), and two different cases of adults (women with breast cancer and childbearing women with type 1 diabetes). All of the cases are ongoing, though in different stages of design, implementation, and analysis. This, we argue, will lead to a synthesis of results on a meta-level not yet described. Discussion To allow valid comparisons between the four cases we explore and problematize them in relation to four main aspects: 1) the use of people's experiences and needs; 2) the role of theories in the design of person-centred web-based supports; 3) the evaluation of the effects on health outcomes for the informants involved; and 4) the development of a generic person-centred model for learning and social support for people with long-term illness and their significant others.
Person-centred web-based support is a new area, and few studies focus on how web-based interventions can contribute to the development of person-centred care. In summary, the main intention of the project outlined here is to contribute both a synthesis of results at the meta-level from four cases and a substantial contribution to the field of person-centred care. PMID:24139057
Application of Fuzzy Analytic Hierarchy Process to Building Research Teams
NASA Astrophysics Data System (ADS)
Dąbrowski, Karol; Skrzypek, Katarzyna
2016-03-01
Building teams has a fundamental impact on the execution of research and development projects. The teams appointed for the needs of given projects draw on individuals from both inside and outside the organization. Knowledge is not only a product available on the market but also an intangible resource affecting an organization's internal and external processes. Thus it is vitally important for businesses and scientific research facilities to effectively manage knowledge within project teams. The article presents a proposal to use the Fuzzy AHP (Analytic Hierarchy Process) and ANFIS (Adaptive Neuro-Fuzzy Inference System) methods in building working groups for R&D projects on the basis of employees' skills.
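One common way to turn triangular fuzzy pairwise comparisons into crisp selection weights is Buckley's geometric-mean method. The sketch below is illustrative only: the three criteria and every comparison value are hypothetical, not taken from the article.

```python
import numpy as np

# Triangular fuzzy pairwise comparisons (l, m, u) over three hypothetical
# team-selection criteria: modelling, programming, domain knowledge.
# All values are invented for illustration.
M = np.array([
    [[1, 1, 1],       [2, 3, 4],     [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],     [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1], [1, 1, 1]],
])  # shape: (criteria, criteria, 3)

# Buckley's method: fuzzy geometric mean of each row ...
g = M.prod(axis=1) ** (1.0 / M.shape[0])

# ... divided by the column-wise total, with (l, u) flipped for fuzzy division.
w_fuzzy = g / g.sum(axis=0)[::-1]

# Defuzzify by centroid (mean of l, m, u) and renormalize to crisp weights.
w = w_fuzzy.mean(axis=1)
w /= w.sum()
print(np.round(w, 3))
```

The crisp vector `w` then ranks candidates' skill criteria; an ANFIS stage, as in the article, could consume such weights as inputs.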
Using Google Analytics to evaluate the impact of the CyberTraining project.
McGuckin, Conor; Crowley, Niall
2012-11-01
A focus on results and impact should be at the heart of every project's approach to research and dissemination. This article discusses the potential of Google Analytics (GA: http://google.com/analytics ) as an effective resource for measuring the impact of academic research output and understanding the geodemographics of users of specific Web 2.0 content (e.g., intervention and prevention materials, health promotion and advice). This article presents the results of GA analyses as a resource used in measuring the impact of the EU-funded CyberTraining project, which provided a well-grounded, research-based training manual on cyberbullying for trainers through the medium of a Web-based eBook ( www.cybertraining-project.org ). The training manual includes review information on cyberbullying, its nature and extent across Europe, analyses of current projects, and provides resources for trainers working with the target groups of pupils, parents, teachers, and other professionals. Results illustrate the promise of GA as an effective tool for measuring the impact of academic research and project output with real potential for tracking and understanding intra- and intercountry regional variations in the uptake of prevention and intervention materials, thus enabling precision focusing of attention to those regions.
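GA itself is a hosted service, but once a geographic report is exported, the intra- and intercountry comparisons described above reduce to simple aggregation. A minimal sketch with invented figures (none of these numbers come from the CyberTraining project):

```python
import pandas as pd

# Hypothetical records resembling a Google Analytics geographic report;
# countries, regions, and all numbers are invented for illustration.
data = pd.DataFrame({
    "country": ["Ireland", "Ireland", "Germany", "Germany", "Spain"],
    "region": ["Dublin", "Cork", "Bavaria", "Berlin", "Madrid"],
    "sessions": [140, 35, 90, 60, 25],
    "avg_duration_s": [180, 150, 210, 120, 90],
})

# Intra-country regional variation: each region's share of national sessions.
data["regional_share"] = (
    data["sessions"] / data.groupby("country")["sessions"].transform("sum")
)

# Inter-country comparison: total sessions and session-weighted mean duration.
by_country = (
    data.assign(dur_x_sessions=data["sessions"] * data["avg_duration_s"])
    .groupby("country")[["sessions", "dur_x_sessions"]]
    .sum()
)
by_country["weighted_duration_s"] = (
    by_country["dur_x_sessions"] / by_country["sessions"]
)
print(by_country.sort_values("sessions", ascending=False))
```

Regions whose `regional_share` lags their population share would be candidates for the "precision focusing" of dissemination effort the authors describe.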
Raskob, Wolfgang; Schneider, Thierry; Gering, Florian; Charron, Sylvie; Zhelezniak, Mark; Andronopoulos, Spyros; Heriard-Dubreuil, Gilles; Camps, Johan
2015-04-01
The PREPARE project that started in February 2013 and will end at the beginning of 2016 aims to close gaps that have been identified in nuclear and radiological preparedness in Europe following the first evaluation of the Fukushima disaster. Among others, the project will address the review of existing operational procedures for dealing with long-lasting releases and cross-border problems in radiation monitoring and food safety and further develop missing functionalities in decision support systems (DSS) ranging from improved source-term estimation and dispersion modelling to the inclusion of hydrological pathways for European water bodies. In addition, a so-called Analytical Platform will be developed exploring the scientific and operational means to improve information collection, information exchange and the evaluation of such types of disasters. The tools developed within the project will be partly integrated into the two DSS ARGOS and RODOS. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Knowledge Engineering as a Component of the Curriculum for Medical Cybernetists.
Karas, Sergey; Konev, Arthur
2017-01-01
According to a new state educational standard, students who have chosen medical cybernetics as their major must develop a knowledge engineering competency. Previously, in the course "Clinical cybernetics", students practicing project-based learning designed automated workstations for medical personnel using client-server technology. The purpose of the article is to give insight into the project of a new educational module, "Knowledge engineering". Students will acquire expert knowledge by holding interviews and conducting surveys, and then they will formalize it. After that, students will represent declarative expert knowledge in a network model and analyze the knowledge graph. Expert decision-making methods will be applied in software on the basis of a production model of knowledge. Project implementation will result not only in the development of analytical competencies among students, but also in the creation of a practically useful expert system, based on student models, to support medical decisions. The module is currently being tested in the educational process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
EISLER, G. RICHARD
This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Robust Planning for Autonomous Navigation of Mobile Robots In Unstructured, Dynamic Environments (AutoNav)''. The project goal was to develop an algorithmic-driven, multi-spectral approach to point-to-point navigation characterized by: segmented on-board trajectory planning, self-contained operation without human support for mission duration, and the development of appropriate sensors and algorithms to navigate unattended. The project was partially successful in achieving gains in sensing, path planning, navigation, and guidance. One of three experimental platforms, the Minimalist Autonomous Testbed, used a repetitive sense-and-re-plan combination to demonstrate the majority of elements necessary for autonomous navigation. However, a critical goal for overall success in arbitrary terrain, that of developing a sensor that is able to distinguish true obstacles that need to be avoided as a function of vehicle scale, still needs substantial research to bring to fruition.
Waska, Robert T.
1999-01-01
Certain patients, through projective identification and splitting mechanisms, test the boundaries of the analytic situation. These patients are usually experiencing overwhelming paranoid-schizoid anxieties and view the object as ruthless and persecutory. Using a Kleinian perspective, the author advocates greater analytic flexibility with these difficult patients who seem unable to use the standard analytic environment. The concept of self-disclosure is examined, and the author discusses certain technical situations where self-disclosure may be helpful.(The Journal of Psychotherapy Practice and Research 1999; 8:225–233) PMID:10413442
Environmental Assessment of the Hawaii Geothermal Project Well Flow Test Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1976-11-01
The Hawaii Geothermal Project, a coordinated research effort of the University of Hawaii, funded by the County and State of Hawaii, and ERDA, was initiated in 1973 in an effort to identify, generate, and use geothermal energy on the Big Island of Hawaii. A number of stages are involved in developing geothermal power resources: exploration, test drilling, production testing, field development, power plant and powerline construction, and full-scale production. Phase I of the Project, which began in the summer of 1973, involved conducting exploratory surveys, developing analytical models for interpretation of geophysical results, conducting studies on energy recovery from hot brine, and examining the legal and economic implications of developing geothermal resources in the state. Phase II of the Project, initiated in the summer of 1975, centers on drilling an exploratory research well on the Island of Hawaii, but also continues operational support for the geophysical, engineering, and socioeconomic activities delineated above. The project to date is between the test drilling and production testing phases. The purpose of this assessment is to describe the activities and potential impacts associated with extensive well flow testing to be completed during Phase II.
Thomopoulos, N; Grant-Muller, S; Tight, M R
2009-11-01
Interest has re-emerged in the issue of how to incorporate equity considerations in the appraisal of transport projects, and large road infrastructure projects in particular. This paper offers a way forward in addressing some of the theoretical and practical concerns that have presented difficulties to date in incorporating equity concerns in the appraisal of such projects. Initially, an overview of current practice within transport regarding the appraisal of equity considerations in Europe is offered, based on an extensive literature review. Acknowledging the value of a framework approach, research towards introducing a theoretical framework is then presented. The proposed framework is based on the well-established multi-criteria analysis (MCA) technique, the Analytic Hierarchy Process, and is also contrasted with the use of a CBA-based approach. The framework outlined here offers an additional support tool to decision makers, who will be able to differentiate choices based on their views on specific equity principles and equity types. It also holds the potential to become a valuable tool for evaluators as a result of the option to assess predefined equity perspectives of decision makers against both the project objectives and the estimated project impacts. This framework may also be of further value to evaluators outside transport.
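For context, the core computation behind an AHP-based appraisal is the priority vector of a pairwise-comparison matrix together with Saaty's consistency check. A minimal sketch, with a hypothetical matrix over three equity principles:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix over three equity principles
# (say, horizontal, vertical, and spatial equity); values are illustrative.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP priority weights: principal eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Saaty's consistency ratio (random index RI = 0.58 for a 3x3 matrix);
# CR < 0.1 is the conventional acceptability threshold.
n = A.shape[0]
CR = ((eigvals[k].real - n) / (n - 1)) / 0.58
print(np.round(w, 3), round(CR, 4))
```

Decision makers holding different equity views would supply different comparison matrices, and the resulting weight vectors can then be applied to the estimated project impacts.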
Den Hartog, Emiel A; Havenith, George
2010-01-01
For wearers of protective clothing in radiation environments there are no quantitative guidelines available for the effect of a radiative heat load on heat exchange. Under the European Union-funded project ThermProtect, an analytical effort was defined to address the issue of radiative heat load while wearing protective clothing. As much information became available within the ThermProtect project from thermal manikin experiments in thermal radiation environments, these sets of experimental data were used to verify the analytical approach. The analytical approach provided a good prediction of the heat loss in the manikin experiments; 95% of the variance was explained by the model. The model has not yet been validated at high radiative heat loads and neglects some physical properties of the radiation emissivity. Still, the approach is pragmatic and may be useful for practical implementation in protective clothing standards for moderate thermal radiation environments.
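The kind of radiative term such an analytical model must handle can be sketched with the Stefan-Boltzmann law; the emissivity and clothing insulation values below are assumed for illustration and are not ThermProtect parameters.

```python
# Stefan-Boltzmann constant, W m^-2 K^-4.
SIGMA = 5.67e-8

def net_radiative_gain(t_surface_c, t_radiant_c, emissivity=0.9):
    """Net radiative flux (W/m^2) absorbed by the clothing outer surface.

    The emissivity of 0.9 is an assumed, typical textile value.
    """
    ts, tr = t_surface_c + 273.15, t_radiant_c + 273.15
    return emissivity * SIGMA * (tr**4 - ts**4)

def dry_heat_loss(t_skin_c, t_surface_c, insulation_m2k_w=0.155):
    """Dry heat loss (W/m^2) from skin through an assumed 1-clo clothing layer."""
    return (t_skin_c - t_surface_c) / insulation_m2k_w

# Example: 34 C skin, 40 C clothing surface, 80 C mean radiant temperature.
# A negative "loss" means heat flows inward, adding to the wearer's load.
print(round(net_radiative_gain(40, 80), 1), round(dry_heat_loss(34, 40), 1))
```

A full clothing model would couple these fluxes through a surface heat balance rather than evaluate them independently.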
A development of logistics management models for the Space Transportation System
NASA Technical Reports Server (NTRS)
Carrillo, M. J.; Jacobsen, S. E.; Abell, J. B.; Lippiatt, T. F.
1983-01-01
A new analytic queueing approach was described which relates stockage levels, repair level decisions, and the project network schedule of prelaunch operations directly to the probability distribution of the space transportation system launch delay. Finite source population and limited repair capability were additional factors included in this logistics management model developed specifically for STS maintenance requirements. Data presently available to support logistics decisions were based on a comparability study of heavy aircraft components. A two-phase program is recommended by which NASA would implement an integrated data collection system, assemble logistics data from previous STS flights, revise extant logistics planning and resource requirement parameters using Bayes-Lin techniques, and adjust for uncertainty surrounding logistics systems performance parameters. The implementation of these recommendations can be expected to deliver more cost-effective logistics support.
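The finite-source, limited-repair setting described above corresponds to the classic machine-repairman queue. A minimal sketch, with invented failure and repair rates rather than STS data:

```python
# Finite-source repair queue (machine-repairman model): N line-replaceable
# units each failing at rate lam while operating, c repair channels each
# working at rate mu. All parameter values below are invented.
def stationary_distribution(N, c, lam, mu):
    """Birth-death stationary probabilities of n failed units, n = 0..N."""
    p = [1.0]
    for n in range(1, N + 1):
        p.append(p[-1] * (N - n + 1) * (lam / mu) / min(n, c))
    total = sum(p)
    return [v / total for v in p]

def delay_probability(N, c, lam, mu, spares):
    """P(more units are down than spares on hand), a launch-delay proxy."""
    return sum(stationary_distribution(N, c, lam, mu)[spares + 1:])

# More spares -> lower chance that a launch waits on repair.
for s in (0, 2, 4):
    print(s, round(delay_probability(N=10, c=2, lam=0.05, mu=0.5, spares=s), 4))
```

Sweeping the stockage level against a target delay probability is the trade-off such a logistics model supports.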
Granitto, Matthew; DeWitt, Ed H.; Klein, Terry L.
2010-01-01
This database was initiated, designed, and populated to collect and integrate geochemical data from central Colorado in order to facilitate geologic mapping, petrologic studies, mineral resource assessment, definition of geochemical baseline values and statistics, environmental impact assessment, and medical geology. The Microsoft Access database serves as a geochemical data warehouse in support of the Central Colorado Assessment Project (CCAP) and contains data tables describing historical and new quantitative and qualitative geochemical analyses determined by 70 analytical laboratory and field methods for 47,478 rock, sediment, soil, and heavy-mineral concentrate samples. Most samples were collected by U.S. Geological Survey (USGS) personnel and analyzed either in the analytical laboratories of the USGS or by contract with commercial analytical laboratories. These data represent analyses of samples collected as part of various USGS programs and projects. In addition, geochemical data from 7,470 sediment and soil samples collected and analyzed under the Atomic Energy Commission National Uranium Resource Evaluation (NURE) Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) program (henceforth called NURE) have been included in this database. In addition to data from 2,377 samples collected and analyzed under CCAP, this dataset includes archived geochemical data originally entered into the in-house Rock Analysis Storage System (RASS) database (used by the USGS from the mid-1960s through the late 1980s) and the in-house PLUTO database (used by the USGS from the mid-1970s through the mid-1990s). All of these data are maintained in the Oracle-based National Geochemical Database (NGDB). Retrievals from the NGDB and from the NURE database were used to generate most of this dataset. 
Furthermore, USGS data that were previously excluded from the NGDB, either because the data predate the earliest USGS geochemical databases or for programmatic reasons, have been included in the CCAP Geochemical Database and are planned to be added to the NGDB.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, H
Purpose: This work develops a general framework, namely the filtered iterative reconstruction (FIR) method, to incorporate the analytical reconstruction (AR) method into the iterative reconstruction (IR) method, for enhanced CT image quality. Methods: FIR is formulated as a combination of filtered data fidelity and sparsity regularization, and then solved by the proximal forward-backward splitting (PFBS) algorithm. As a result, the image reconstruction decouples data fidelity and image regularization with a two-step iterative scheme, during which an AR-projection step updates the filtered data fidelity term, while a denoising solver updates the sparsity regularization term. During the AR-projection step, the image is projected to the data domain to form the data residual, and then reconstructed by a certain AR to a residual image, which is in turn weighted together with the previous image iterate to form the next image iterate. Since the eigenvalues of the AR-projection operator are close to unity, PFBS-based FIR has fast convergence. Results: The proposed FIR method is validated in the setting of circular cone-beam CT with AR being FDK and total-variation sparsity regularization, and has improved image quality over both AR and IR. For example, FIR has improved visual assessment and quantitative measurement in terms of both contrast and resolution, and reduced axial and half-fan artifacts. Conclusion: FIR is proposed to incorporate AR into IR, with an efficient image reconstruction algorithm based on PFBS. The CBCT results suggest that FIR synergizes AR and IR with improved image quality and reduced axial and half-fan artifacts. The author was partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000), and the Shanghai Pujiang Talent Program (#14PJ1404500).
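A toy version of the two-step PFBS scheme can be written for a generic linear forward operator, with a pseudoinverse standing in for the analytical reconstructor (FDK in the paper) and soft-thresholding standing in for the TV denoising solver; all of these substitutions are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "projection" operator and a sparse ground-truth image.
A = rng.standard_normal((80, 50))
x_true = np.zeros(50)
x_true[[3, 17, 31]] = [1.5, -2.0, 1.0]
y = A @ x_true  # noiseless data

# Pseudoinverse stands in for the analytical reconstructor (FDK in CBCT).
A_pinv = np.linalg.pinv(A)

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1, standing in for the denoising solver."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(50)
for _ in range(50):
    # AR-projection step: form the data residual, reconstruct it,
    # and add it to the previous iterate.
    x = x + A_pinv @ (y - A @ x)
    # Denoising step: sparsity regularization.
    x = soft_threshold(x, 0.05)

print(round(float(np.linalg.norm(x - x_true)), 4))
```

Because the AR-projection operator here has eigenvalues at unity, each outer iteration lands back on the least-squares solution before the denoising step, illustrating the fast convergence the abstract attributes to PFBS.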
IBM's Health Analytics and Clinical Decision Support.
Kohn, M S; Sun, J; Knoop, S; Shabo, A; Carmeli, B; Sow, D; Syed-Mahmood, T; Rapp, W
2014-08-15
This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources themselves do not drive, but support, healthcare transformation.
VISAGE Visualization for Integrated Satellite, Airborne and Ground-Based Data Exploration
NASA Technical Reports Server (NTRS)
Conover, Helen; Berendes, Todd; Naeger, Aaron; Maskey, Manil; Gatlin, Patrick; Wingo, Stephanie; Kulkarni, Ajinkya; Gupta, Shivangi; Nagaraj, Sriraksha; Wolff, David;
2017-01-01
The primary goal of the VISAGE project is to facilitate more efficient Earth Science investigations via a tool that can provide visualization and analytic capabilities for diverse coincident datasets. This proof-of-concept project will be centered around the GPM Ground Validation program, which provides a valuable source of intensive, coincident observations of atmospheric phenomena. The data are from a wide variety of ground-based, airborne and satellite instruments, with a wide diversity in spatial and temporal scales, variables, and formats, which makes these data difficult to use together. VISAGE will focus on "golden cases" where most ground instruments were in operation and multiple research aircraft sampled a significant weather event, ideally while the GPM Core Observatory passed overhead. The resulting tools will support physical process studies as well as satellite and model validation.
Power law of shear viscosity in Einstein-Maxwell-Dilaton-Axion model
NASA Astrophysics Data System (ADS)
Ling, Yi; Xian, Zhuoyu; Zhou, Zhenhua
2017-02-01
We construct charged black hole solutions with hyperscaling violation in the infrared (IR) region in Einstein-Maxwell-Dilaton-Axion theory and investigate the temperature behavior of the ratio of holographic shear viscosity to the entropy density. When translational symmetry breaking is relevant in the IR, the power law of the ratio is verified numerically at low temperature T, namely, η/s ∼ T^κ, where the values of the exponent κ coincide with the analytical results. We also find that the exponent κ is not affected by the irrelevant current, but is reduced by the relevant current. Supported by the National Natural Science Foundation of China (11275208, 11575195), the Opening Project of the Shanghai Key Laboratory of High Temperature Superconductors (14DZ2260700), and the Jiangxi Young Scientists (JingGang Star) Program and 555 Talent Project of Jiangxi Province
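Extracting the exponent κ from numerical η/s data amounts to a log-log linear fit; the sample points below are synthetic, not the paper's numerics.

```python
import numpy as np

# Synthetic low-temperature data obeying eta/s = C * T**kappa exactly,
# with kappa = 1.2 chosen arbitrarily for illustration.
T = np.array([0.01, 0.02, 0.05, 0.10, 0.20])
eta_over_s = 0.08 * T**1.2

# The exponent kappa is the slope of a straight-line fit in log-log space.
kappa_fit, log_c = np.polyfit(np.log(T), np.log(eta_over_s), 1)
print(round(kappa_fit, 3))
```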
GRACKLE: a chemistry and cooling library for astrophysics
NASA Astrophysics Data System (ADS)
Smith, Britton D.; Bryan, Greg L.; Glover, Simon C. O.; Goldbaum, Nathan J.; Turk, Matthew J.; Regan, John; Wise, John H.; Schive, Hsi-Yu; Abel, Tom; Emerick, Andrew; O'Shea, Brian W.; Anninos, Peter; Hummels, Cameron B.; Khochfar, Sadegh
2017-04-01
We present the GRACKLE chemistry and cooling library for astrophysical simulations and models. GRACKLE provides a treatment of non-equilibrium primordial chemistry and cooling for H, D and He species, including H2 formation on dust grains; tabulated primordial and metal cooling; multiple ultraviolet background models; and support for radiation transfer and arbitrary heat sources. The library has an easily implementable interface for simulation codes written in C, C++ and FORTRAN as well as a PYTHON interface with added convenience functions for semi-analytical models. As an open-source project, GRACKLE provides a community resource for accessing and disseminating astrochemical data and numerical methods. We present the full details of the core functionality, the simulation and PYTHON interfaces, testing infrastructure, performance and range of applicability. GRACKLE is a fully open-source project and new contributions are welcome.
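For orientation, the basic quantity such a chemistry-and-cooling library evaluates is a cooling rate Λ(T) interpolated from tables, from which, for example, a cooling time follows. The sketch below uses a made-up table and a one-species toy gas; it does not reproduce GRACKLE's data or its actual API.

```python
import numpy as np

K_B = 1.380649e-16  # Boltzmann constant in erg/K

# Made-up cooling table: log10(T/K) vs log10(Lambda / erg cm^3 s^-1).
LOG_T = np.array([4.0, 5.0, 6.0, 7.0])
LOG_LAMBDA = np.array([-23.0, -21.8, -22.4, -22.8])

def cooling_time(n_h, temperature):
    """Cooling time (s) ~ thermal energy / radiated power, for a toy gas
    where n_h (cm^-3) stands in for all species densities."""
    lam = 10.0 ** np.interp(np.log10(temperature), LOG_T, LOG_LAMBDA)
    thermal_energy = 1.5 * n_h * K_B * temperature  # erg cm^-3
    return thermal_energy / (n_h**2 * lam)

# Example: n_H = 1 cm^-3 gas at 10^6 K.
print(f"{cooling_time(1.0, 1e6):.2e} s")
```

A real library additionally tracks non-equilibrium species abundances, dust, and UV backgrounds, which is what makes a shared, tested implementation valuable.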
Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP)
The Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP) provides guidance for the planning, implementation and assessment phases of projects that require laboratory analysis of radionuclides.
NASA Technical Reports Server (NTRS)
Chaudhary, Aashish; Votava, Petr; Nemani, Ramakrishna R.; Michaelis, Andrew; Kotfila, Chris
2016-01-01
We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging of HPC and cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both high-performance computing (HPC) and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines of both the production process and the data products, and enable sharing results with the community. Our project is developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics or visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD), where we are developing a new QA pipeline for the 25PB system.
Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Votava, P.; Nemani, R. R.; Michaelis, A.; Kotfila, C.
2016-12-01
AmO2 Analysis for Analytical Method Testing and Assessment: Analysis Support for AmO2 Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhn, Kevin John; Bland, Galey Jean; Fulwyler, James Brent
Americium oxide samples will be measured for various analytes to support AmO2 production. The key analytes currently requested by the Am production customer at LANL include total Am content, Am isotopics, Pu assay, Pu isotopics, and trace element content, including 237Np content. Multiple analytical methods will be utilized depending on the sensitivity, accuracy and precision needs of the Am matrix. Traceability to the National Institute of Standards and Technology (NIST) will be achieved, where applicable, by running NIST-traceable quality control materials, given that there are no suitable AmO2 reference materials currently available for the requested analytes. The primary objective is to demonstrate the suitability of actinide analytical chemistry methods to support AmO2 production operations.
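A NIST-traceable quality control check of the kind described typically compares measured QC results against certified values within acceptance limits; the 90-110 % recovery window below is a generic illustration, not a LANL acceptance criterion.

```python
# Illustrative QC acceptance check; the recovery window is a generic
# example, not a LANL or NIST-specified criterion.
def percent_recovery(measured, certified):
    """Measured result as a percentage of the certified value."""
    return 100.0 * measured / certified

def qc_acceptable(measured, certified, low=90.0, high=110.0):
    """True if the QC result falls within the acceptance window."""
    return low <= percent_recovery(measured, certified) <= high

# A hypothetical Pu-assay QC material certified at 1.000 (relative units).
print(qc_acceptable(0.984, 1.000), qc_acceptable(0.800, 1.000))
```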
The Independent Technical Analysis Process Final Report 2006-2007.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duberstein, Corey; Ham, Kenneth; Dauble, Dennis
2007-03-01
The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities. The Independent Technical Analysis Process (ITAP) was created to provide non-routine analysis for fish and wildlife agencies and tribes in particular and the public in general on matters related to juvenile and adult salmon and steelhead passage through the mainstem hydrosystem. The process was designed to maintain the independence of analysts and reviewers from parties requesting analyses, to avoid potential bias in technical products. The objectives identified for this project were to administer a rigorous, transparent process to deliver unbiased technical assistance necessary to coordinate recommendations for storage reservoir and river operations that avoid potential conflicts between anadromous and resident fish. Seven work elements, designated by numbered categories in the Pisces project tracking system, were created to define and accomplish project goals as follows: (1) 118 Coordination - Coordinate technical analysis and review process: (a) Retain expertise for analyst/reviewer roles. (b) Draft research directives. (c) Send directive to the analyst. (d) Coordinate two independent reviews of the draft report. (e) Ensure reviewer comments are addressed within the final report. (2) 162 Analyze/Interpret Data - Implement the independent aspects of the project. (3) 122 Provide Technical Review - Implement the review process for the analysts.
(4) 132 Produce Annual Report - FY06 annual progress report with Pisces. (5) 161 Disseminate Raw/Summary Data and Results - Post technical products on the ITAP web site. (6) 185 Produce Pisces Status Report - Provide periodic status reports to BPA. (7) 119 Manage and Administer Projects - Project/contract administration.
Labour Market Driven Learning Analytics
ERIC Educational Resources Information Center
Kobayashi, Vladimer; Mol, Stefan T.; Kismihók, Gábor
2014-01-01
This paper briefly outlines a project about integrating labour market information in a learning analytics goal-setting application that provides guidance to students in their transition from education to employment.
NASA's Cryogenic Fluid Management Technology Project
NASA Technical Reports Server (NTRS)
Tramel, Terri L.; Motil, Susan M.
2008-01-01
The Cryogenic Fluid Management (CFM) Project's primary objective is to develop storage, transfer, and handling technologies for cryogens that will support the enabling of high-performance cryogenic propulsion systems, lunar surface systems and economical ground operations. Such technologies can significantly reduce propellant launch mass and required on-orbit margins, reduce or even eliminate propellant tank fluid boil-off losses for long-term missions, and simplify vehicle operations. This paper will present the status of the specific technologies that the CFM Project is developing. The two main areas of concentration are analysis model development and CFM hardware development. The project develops analysis tools and models based on thermodynamics, hydrodynamics, and existing flight/test data. These tools assist in the development of pressure/thermal control devices (such as the Thermodynamic Vent System (TVS) and multi-layer insulation), with the ultimate goal of developing a mature set of tools and models that can characterize the performance of the pressure/thermal control devices incorporated in the design of an entire CFM system with minimal cryogen loss. The project conducts hardware development and testing to verify our understanding of the physical principles involved, and to validate the performance of CFM components, subsystems and systems. This database provides information to anchor our analytical models. This paper describes some of the current activities of NASA's Cryogenic Fluid Management Project.
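As a flavor of the boil-off analyses such tools support, dividing a steady heat leak by the cryogen's latent heat of vaporization gives a first-order boil-off rate; the 5 W heat-leak figure below is assumed for illustration, not a CFM Project value.

```python
# First-order boil-off estimate: steady heat leak / latent heat.
LH2_LATENT_HEAT = 4.46e5  # J/kg, latent heat of vaporization of liquid H2

def boiloff_rate_kg_per_day(heat_leak_w, latent_heat=LH2_LATENT_HEAT):
    """Mass of cryogen vaporized per day by a steady heat leak (W)."""
    return heat_leak_w / latent_heat * 86400.0

# An assumed 5 W residual leak through the MLI of an LH2 tank:
print(round(boiloff_rate_kg_per_day(5.0), 3), "kg/day")
```

Real CFM models resolve the heat leak itself (MLI layers, penetrations, supports) and the tank thermodynamics; this one-liner only sets the scale of the loss.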
A New Analytic-Adaptive Model for EGS Assessment, Development and Management Support
DOE Office of Scientific and Technical Information (OSTI.GOV)
Danko, George L
To increase understanding of the energy extraction capacity of Enhanced Geothermal System(s) (EGS), a numerical model development and application project is completed. The general objective of the project is to develop and apply a new, data-coupled Thermal-Hydrological-Mechanical-Chemical (T-H-M-C) model in which the four internal components can be freely selected from existing simulation software without merging and cross-combining a diverse set of computational codes. Eight tasks are completed during the project period. The results are reported in five publications, an MS thesis, twelve quarterly reports, and two annual reports to DOE. Two US patents have also been issued during the project period, with one patent application originated prior to the start of the project. The “Multiphase Physical Transport Modeling Method and Modeling System” (U.S. Patent 8,396,693 B2, 2013), a key element in the GHE sub-model solution, is successfully used for EGS studies. The “Geothermal Energy Extraction System and Method" invention (U.S. Patent 8,430,166 B2, 2013) originates from the time of project performance, describing a new fluid flow control solution. The new, coupled T-H-M-C numerical model will help analyze and design new, efficient EGS systems.
Performance measures for evaluating multi-state projects.
DOT National Transportation Integrated Search
2011-09-01
"Freight transportation projects require an analytic process that considers the impacts of geographic and industry distribution of project : benefits, intermodal impacts, and reliability, as well as the traditional benefits of time savings, safety en...
Stockdale, Susan E; Zuchowski, Jessica; Rubenstein, Lisa V; Sapir, Negar; Yano, Elizabeth M; Altman, Lisa; Fickel, Jacqueline J; McDougall, Skye; Dresselhaus, Timothy; Hamilton, Alison B
Although the patient-centered medical home endorses quality improvement principles, methods for supporting ongoing, systematic primary care quality improvement have not been evaluated. We introduced primary care quality councils at six Veterans Health Administration sites as an organizational intervention with three key design elements: (a) fostering interdisciplinary quality improvement leadership, (b) establishing a structured quality improvement process, and (c) facilitating organizationally aligned frontline quality improvement innovation. Our evaluation objectives were to (a) assess design element implementation, (b) describe implementation barriers and facilitators, and (c) assess successful quality improvement project completion and spread. We analyzed administrative records and conducted interviews with 85 organizational leaders. We developed and applied criteria for assessing design element implementation using hybrid deductive/inductive analytic techniques. All quality councils implemented interdisciplinary leadership and a structured quality improvement process, and all but one completed at least one quality improvement project and a toolkit for spreading improvements. Quality councils were perceived as most effective when service line leaders had well-functioning interdisciplinary communication. Matching positions within leadership hierarchies with appropriate supportive roles facilitated frontline quality improvement efforts. Two key resources were (a) a dedicated internal facilitator with project management, data collection, and presentation skills and (b) support for preparing customized data reports for identifying and addressing practice level quality issues. Overall, quality councils successfully cultivated interdisciplinary, multilevel primary care quality improvement leadership with accountability mechanisms and generated frontline innovations suitable for spread. 
Practice level performance data and quality improvement project management support were critical. In order to successfully facilitate systematic, sustainable primary care quality improvement, regional and executive health care system leaders should engage interdisciplinary practice level leadership in a priority-setting process that encourages frontline innovation and establish local structures such as quality councils to coordinate quality improvement initiatives, ensure accountability, and promote spread of best practices.
ANALYTICAL CHEMISTRY DIVISION ANNUAL PROGRESS REPORT FOR PERIOD ENDING DECEMBER 31, 1961
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1962-02-01
Research and development progress is reported on analytical instrumentation, dissolver-solution analyses, special research problems, reactor projects analyses, x-ray and spectrochemical analyses, mass spectrometry, optical and electron microscopy, radiochemical analyses, nuclear analyses, inorganic preparations, organic preparations, ionic analyses, infrared spectral studies, anodization of sector coils for the Analog II Cyclotron, quality control, process analyses, and the Thermal Breeder Reactor Projects Analytical Chemistry Laboratory. (M.C.G.)
2006-07-27
The goal of this project was to develop analytical and computational tools to make vision a viable sensor for ... sensors. We have proposed the framework of stereoscopic segmentation, where multiple images of the same objects were jointly processed to extract geometry
ERIC Educational Resources Information Center
Garofalo, James; Hindelang, Michael J.
The purpose of the document is to identify ways in which National Crime Survey (NCS) data can be used by criminal justice researchers and programs. The report provides an overview of the Application of Victimization Survey Results Project, describes the analytic reports compiled by the project staff, and cites the kinds of systematic information…
Review: visual analytics of climate networks
NASA Astrophysics Data System (ADS)
Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.
2015-09-01
Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.
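The standard procedure the abstract mentions, extracting network measures from a statistical interrelationship structure, can be sketched as follows. This is an illustrative toy construction on synthetic data, not the CGV/GTX tooling: correlate time series at grid nodes, threshold the correlation matrix into an adjacency matrix, and read off a simple measure (degree). The node count, coupling strength, and threshold are assumptions.

```python
import numpy as np

# Sketch of climate-network construction: correlate node time series,
# threshold the correlation matrix, compute node degree. Synthetic data.
rng = np.random.default_rng(0)

n_nodes, n_steps = 20, 500
# Synthetic "climatological field": all nodes share a common signal.
common = rng.standard_normal(n_steps)
series = 0.6 * common + rng.standard_normal((n_nodes, n_steps))

corr = np.corrcoef(series)               # node-by-node correlation matrix
adj = (np.abs(corr) > 0.25).astype(int)  # threshold -> adjacency matrix
np.fill_diagonal(adj, 0)                 # drop self-loops

degree = adj.sum(axis=1)                 # simplest network measure
print("mean degree:", degree.mean())
```

Visual analytics tools such as those reviewed here would then map such measures back onto the geographic grid for interactive exploration.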
Review: visual analytics of climate networks
NASA Astrophysics Data System (ADS)
Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.
2015-04-01
Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.
Does Tropical Cyclone Modification Make Sense? A Decision-Analytic Assessment
NASA Astrophysics Data System (ADS)
Klima, K.; Morgan, M. G.; Grossmann, I.
2009-12-01
Since the demise of Project Stormfury in 1983, little attention has been devoted to the possibility of intentionally modifying tropical cyclones (TC). However, following Hurricane Katrina and three other Category 5 hurricanes (Emily, Rita, and Wilma), which together resulted in at least 2,280 deaths and over $120 billion in damages (Blake et al., 2007), the U.S. Department of Homeland Security (DHS) has recently begun to support an effort to identify and evaluate hurricane mitigation strategies through Project HURRMIT (http://www.ofcm.noaa.gov/ihc09/Presentations/Session10/s10-01Woodley.ppt). Using a decision-analytic framing and FEMA's HAZUS-MH MR3 damage model (http://www.fema.gov/plan/prevent/hazus/), this paper asks: how sure must one be that an intervention will reduce TC damages before choosing to undertake a program of modification? The analysis is formulated in probabilistic terms and assesses net benefits. In contrast to a much earlier application of decision analysis to TC modification (Howard et al., 1972), this work uses census data on the value of property at risk, and prior distributions on changing storm behavior based on data from hurricanes approaching the east coast of Florida since 1953. Even before considering issues of liability that may arise from the fact that a modified storm is no longer "an act of God", as well as unforeseen environmental consequences, our results suggest that while TC modification techniques will likely alter TC behavior, one will have to be significantly more confident of the predictability and effectiveness of modification methods before their use can be justified. This work is supported by the Climate Decision Making Center through a cooperative agreement between the National Science Foundation (SES-0345798) and Carnegie Mellon University.
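The probabilistic framing described above can be illustrated with a minimal expected-net-benefit calculation: compare baseline damages against damages under an intervention whose multiplicative effect on the storm is uncertain and may even be harmful. All dollar figures and distribution parameters below are illustrative assumptions, not values from the paper or from HAZUS-MH.

```python
import numpy as np

# Toy decision-analytic sketch: expected net benefit of modification
# under an uncertain damage-multiplier. All numbers are illustrative.
rng = np.random.default_rng(1)

baseline_damage = 10e9     # assumed damage of the unmodified storm ($)
program_cost = 0.5e9       # assumed cost of the modification attempt ($)

# Uncertain effect of modification on damages: a multiplier centred
# slightly below 1, clipped so the storm could also be made worse.
effect = rng.normal(loc=0.9, scale=0.2, size=100_000)
effect = np.clip(effect, 0.0, 2.0)

modified_damage = baseline_damage * effect
net_benefit = baseline_damage - (modified_damage.mean() + program_cost)
print(f"expected net benefit: ${net_benefit / 1e9:.2f}B")
```

The paper's point follows directly from this structure: the wider the prior on `effect`, or the closer its centre to 1, the harder it is for the expected net benefit to justify the program cost and liability exposure.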
IBM’s Health Analytics and Clinical Decision Support
Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.
2014-01-01
Summary Objectives This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources, themselves, do not drive, but support healthcare transformation. PMID:25123736
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.
2009-04-14
This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.
NASA Astrophysics Data System (ADS)
Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.
2017-12-01
The increasing resolution of comprehensive Earth System Models is rapidly leading to very large climate simulation outputs that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL, and CMCC), one orchestrator site (PSNC), and one more hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments. 
At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.
Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework.
Khazaei, Hamzeh; McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil
2015-11-18
Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realize health analytics as services in critical care units in particular. To design, implement, evaluate, and deploy an extendable big-data compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios by utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed method, through the Artemis project, that is, a customization of the framework for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). We demonstrated the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot setup into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids' NICU has 36 beds and can classify the patients generally into 5 different types including surgical and premature babies. The arrival rate is estimated as 4.5 patients per day, and the average length of stay was calculated as 16 days. Mean number of medical monitoring algorithms per patient is 9, which renders 311 live algorithms for the whole NICU running on the framework. 
The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated as 8.6 TB per year. There will always be 34.9 patients in SickKids NICU on average. Currently, 46% of patients cannot get admitted to SickKids NICU due to lack of resources. By increasing the capacity to 90 beds, all patients can be accommodated. For such a provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decisions support for tertiary care use. We demonstrate how to size the equipment needed in the cloud for that architecture based on a very realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show the principle of how this could be performed and furthermore that it can be replicated for any critical care setting within a tertiary institution.
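The occupancy and blocking figures quoted above can be cross-checked with a classical Erlang loss (M/M/c/c) sketch using the stated inputs: 36 beds, 4.5 arrivals per day, and a 16-day mean stay. The paper's own analytical model is not specified here, so treating the unit as an Erlang loss system is an assumption for illustration.

```python
# Erlang loss (M/M/c/c) sketch for the stated NICU parameters.
# Assumption: Poisson arrivals, beds as servers, blocked patients lost.
def erlang_b(servers: int, offered_load: float) -> float:
    """Blocking probability via the numerically stable Erlang-B recursion."""
    b = 1.0
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

beds, arrival_rate, mean_stay = 36, 4.5, 16.0
offered = arrival_rate * mean_stay           # 72 erlangs of offered demand
blocked = erlang_b(beds, offered)            # fraction of patients turned away
census = offered * (1.0 - blocked)           # average occupied beds

print(f"blocking: {blocked:.0%}, average census: {census:.1f}")
```

With an offered load of 72 erlangs against 36 beds, roughly half of arrivals are blocked and the unit runs nearly full, which is broadly consistent with the congestion the abstract reports.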
Real-Time and Retrospective Health-Analytics-as-a-Service: A Novel Framework
McGregor, Carolyn; Eklund, J Mikael; El-Khatib, Khalil
2015-01-01
Background Analytics-as-a-service (AaaS) is one of the latest provisions emerging from the cloud services family. Utilizing this paradigm of computing in health informatics will benefit patients, care providers, and governments significantly. This work is a novel approach to realize health analytics as services in critical care units in particular. Objective To design, implement, evaluate, and deploy an extendable big-data compatible framework for health-analytics-as-a-service that offers both real-time and retrospective analysis. Methods We present a novel framework that can realize health data analytics-as-a-service. The framework is flexible and configurable for different scenarios by utilizing the latest technologies and best practices for data acquisition, transformation, storage, analytics, knowledge extraction, and visualization. We have instantiated the proposed method, through the Artemis project, that is, a customization of the framework for live monitoring and retrospective research on premature babies and ill term infants in neonatal intensive care units (NICUs). Results We demonstrated the proposed framework in this paper for monitoring NICUs and refer to it as the Artemis-In-Cloud (Artemis-IC) project. A pilot of Artemis has been deployed in the SickKids hospital NICU. By feeding the output of this pilot setup into an analytical model, we predict important performance measures for the final deployment of Artemis-IC. This process can be carried out for other hospitals following the same steps with minimal effort. SickKids’ NICU has 36 beds and can classify the patients generally into 5 different types including surgical and premature babies. The arrival rate is estimated as 4.5 patients per day, and the average length of stay was calculated as 16 days. Mean number of medical monitoring algorithms per patient is 9, which renders 311 live algorithms for the whole NICU running on the framework. 
The memory and computation power required for Artemis-IC to handle the SickKids NICU will be 32 GB and 16 CPU cores, respectively. The required amount of storage was estimated as 8.6 TB per year. There will always be 34.9 patients in SickKids NICU on average. Currently, 46% of patients cannot get admitted to SickKids NICU due to lack of resources. By increasing the capacity to 90 beds, all patients can be accommodated. For such a provisioning, Artemis-IC will need 16 TB of storage per year, 55 GB of memory, and 28 CPU cores. Conclusions Our contributions in this work relate to a cloud architecture for the analysis of physiological data for clinical decisions support for tertiary care use. We demonstrate how to size the equipment needed in the cloud for that architecture based on a very realistic assessment of the patient characteristics and the associated clinical decision support algorithms that would be required to run for those patients. We show the principle of how this could be performed and furthermore that it can be replicated for any critical care setting within a tertiary institution. PMID:26582268
Setting Learning Analytics in Context: Overcoming the Barriers to Large-Scale Adoption
ERIC Educational Resources Information Center
Ferguson, Rebecca; Macfadyen, Leah P.; Clow, Doug; Tynan, Belinda; Alexander, Shirley; Dawson, Shane
2014-01-01
A core goal for most learning analytic projects is to move from small-scale research towards broader institutional implementation, but this introduces a new set of challenges because institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires explicit and…
External Standards or Standard Addition? Selecting and Validating a Method of Standardization
NASA Astrophysics Data System (ADS)
Harvey, David T.
2002-05-01
A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method, and obtain a practical experience in the difference between performing an external standardization and a standard addition.
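The standard-addition calculation at the heart of the experiment can be shown with a short worked example. Assuming a linear instrument response, signal = k × (C_sample + C_spike), the unknown concentration is recovered from the magnitude of the x-intercept of the calibration line. The sensitivity, concentration, and spike levels below are illustrative, not values from the article.

```python
import numpy as np

# Worked standard-addition sketch: fit signal vs. spike level, then
# recover the sample concentration from the x-intercept. Idealised data.
k_true, c_sample = 2.0, 5.0                        # hypothetical sensitivity, conc.

spikes = np.array([0.0, 2.0, 4.0, 6.0])            # added standard (ppm)
signals = k_true * (c_sample + spikes)             # noise-free measurements

slope, intercept = np.polyfit(spikes, signals, 1)  # least-squares line
c_found = intercept / slope                        # |x-intercept| = C_sample
print(f"recovered concentration: {c_found:.2f} ppm")  # -> 5.00 ppm
```

Because the calibration is performed in the sample matrix itself, this procedure corrects for the matrix effects that an external standardization would miss, which is exactly the contrast the experiment asks students to explore.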
1987-02-20
Bacteriology; 8 years professional experience; served as Project Health and Safety Officer. Duane R. Boline - Ph.D. in Analytical Chemistry; M.S. in ... Chemistry; B.S.E. in Physical Science; 18 years professional experience; served as Project Quality Assurance Officer. Complete biographical data ... University, 1962; M.S., Chemistry, Emporia State University, 1965; Ph.D., Analytical Chemistry, Kansas State University, 1975
The factor structure of the Alcohol Use Disorders Identification Test (AUDIT).
Doyle, Suzanne R; Donovan, Dennis M; Kivlahan, Daniel R
2007-05-01
Past research assessing the factor structure of the Alcohol Use Disorders Identification Test (AUDIT) with various exploratory and confirmatory factor analytic techniques has identified one-, two-, and three-factor solutions. Because different factor analytic procedures may result in dissimilar findings, we examined the factor structure of the AUDIT using the same factor analytic technique on two new large clinical samples and on archival data from six samples studied in previous reports. Responses to the AUDIT were obtained from participants who met Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV), criteria for alcohol dependence in two large randomized clinical trials: the COMBINE (Combining Medications and Behavioral Interventions) Study (N = 1,337; 69% men) and Project MATCH (Matching Alcoholism Treatments to Client Heterogeneity; N = 1,711; 76% men). Supplementary analyses involved six correlation matrices of AUDIT data obtained from five previously published articles. Confirmatory factor analyses based on one-, two-, and three-factor models were conducted on the eight correlation matrices to assess the factor structure of the AUDIT. Across samples, analyses supported a correlated, two-factor solution representing alcohol consumption and alcohol-related consequences. The three-factor solution fit the data equally well, but two factors (alcohol dependence and harmful alcohol use) were highly correlated. The one-factor solution did not provide a good fit to the data. These findings support a two-factor solution for the AUDIT (alcohol consumption and alcohol-related consequences). The results contradict the original three-factor design of the AUDIT and the prevalent use of the AUDIT as a one-factor screening instrument with a single cutoff score.
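The correlated two-factor finding can be illustrated with a small simulation: items loading on the same latent factor correlate more strongly with each other than with items loading on the other, correlated factor. The loadings and factor correlation below are illustrative assumptions, not AUDIT estimates from the study.

```python
import numpy as np

# Simulate two correlated latent factors ("consumption", "consequences")
# with three items each, then compare within- vs cross-factor item
# correlations. All parameters are illustrative.
rng = np.random.default_rng(42)
n = 5000

f_cons = rng.standard_normal(n)
f_conseq = 0.6 * f_cons + np.sqrt(1 - 0.36) * rng.standard_normal(n)

load, err = 0.8, np.sqrt(1 - 0.64)      # item loading and unique error SD
items = np.column_stack(
    [load * f + err * rng.standard_normal(n)
     for f in (f_cons, f_cons, f_cons, f_conseq, f_conseq, f_conseq)]
)

r = np.corrcoef(items, rowvar=False)
within = (r[0, 1] + r[3, 4]) / 2        # same-factor item pairs
cross = r[0, 3]                          # cross-factor item pair
print(f"within-factor r ~ {within:.2f}, cross-factor r ~ {cross:.2f}")
```

This pattern (distinct but correlated item clusters) is what distinguishes the supported two-factor solution from both the one-factor model, which ignores the clustering, and the three-factor model, whose extra factors were nearly collinear.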
Pavement Performance : Approaches Using Predictive Analytics
DOT National Transportation Integrated Search
2018-03-23
Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indicators that include pavement distress,...
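A minimal sketch of the kind of model the project describes is a regression of a pavement condition index on distress indicators. The feature names, coefficients, and synthetic data below are illustrative assumptions, not the project's actual indicators or fitted model.

```python
import numpy as np

# Toy predictive model: ordinary least squares of a condition index on
# assumed distress indicators, fit on synthetic data.
rng = np.random.default_rng(7)
n = 400

cracking = rng.uniform(0, 100, n)   # % area cracked (assumed feature)
rutting = rng.uniform(0, 25, n)     # rut depth, mm (assumed feature)
age = rng.uniform(0, 30, n)         # years since rehab (assumed feature)

# Synthetic ground truth: condition worsens with each distress.
condition = (100 - 0.3 * cracking - 1.2 * rutting - 0.8 * age
             + rng.normal(0, 3, n))

X = np.column_stack([np.ones(n), cracking, rutting, age])
coef, *_ = np.linalg.lstsq(X, condition, rcond=None)
pred = X @ coef
ss_res = ((condition - pred) ** 2).sum()
ss_tot = ((condition - condition.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.2f}")
```

A production model would of course use measured distress surveys and validated condition indices; the sketch only shows the regression structure.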
Project Management in NASA: The system and the men
NASA Technical Reports Server (NTRS)
Pontious, R. H.; Barnes, L. B.
1973-01-01
An analytical description of the NASA project management system is presented with emphasis on the human element. The NASA concept of project management, program managers, and the problems and strengths of the NASA system are discussed.
Technology-assisted psychoanalysis.
Scharff, Jill Savege
2013-06-01
Teleanalysis-remote psychoanalysis by telephone, voice over internet protocol (VoIP), or videoteleconference (VTC)-has been thought of as a distortion of the frame that cannot support authentic analytic process. Yet it can augment continuity, permit optimum frequency of analytic sessions for in-depth analytic work, and enable outreach to analysands in areas far from specialized psychoanalytic centers. Theoretical arguments against teleanalysis are presented and countered and its advantages and disadvantages discussed. Vignettes of analytic process from teleanalytic sessions are presented, and indications, contraindications, and ethical concerns are addressed. The aim is to provide material from which to judge the authenticity of analytic process supported by technology.
Propagation of Airy Gaussian vortex beams in uniaxial crystals
NASA Astrophysics Data System (ADS)
Weihao, Yu; Ruihuang, Zhao; Fu, Deng; Jiayao, Huang; Chidao, Chen; Xiangbo, Yang; Yanping, Zhao; Dongmei, Deng
2016-04-01
The propagation dynamics of the Airy Gaussian vortex beams in uniaxial crystals orthogonal to the optical axis has been investigated analytically and numerically. The propagation expression of the beams has been obtained. The propagation features of the Airy Gaussian vortex beams are shown with changes of the distribution factor and the ratio of the extraordinary refractive index to the ordinary refractive index. The correlations between the ratio and the maximum intensity value during the propagation, and its appearing distance have been investigated. Project supported by the National Natural Science Foundation of China (Grant Nos. 11374108, 11374107, 10904041, and 11547212), the Foundation of Cultivating Outstanding Young Scholars of Guangdong Province, China, the CAS Key Laboratory of Geospace Environment, University of Science and Technology of China, the National Training Program of Innovation and Entrepreneurship for Undergraduates (Grant No. 2015093), and the Science and Technology Projects of Guangdong Province, China (Grant No. 2013B031800011).
Long-Term Ecological Monitoring Field Sampling Plan for 2007
DOE Office of Scientific and Technical Information (OSTI.GOV)
T. Haney
2007-07-31
This field sampling plan describes the field investigations planned for the Long-Term Ecological Monitoring Project at the Idaho National Laboratory Site in 2007. This plan and the Quality Assurance Project Plan for Waste Area Groups 1, 2, 3, 4, 5, 6, 7, 10, and Removal Actions constitute the sampling and analysis plan supporting long-term ecological monitoring sampling in 2007. The data collected under this plan will become part of the long-term ecological monitoring data set that is being collected annually. The data will be used to determine the requirements for subsequent long-term ecological monitoring. This plan guides the 2007 investigations, including sampling, quality assurance, quality control, analytical procedures, and data management. As such, this plan will help to ensure that the resulting monitoring data will be scientifically valid, defensible, and of known and acceptable quality.
Functional expansion representations of artificial neural networks
NASA Technical Reports Server (NTRS)
Gray, W. Steven
1992-01-01
In the past few years, significant interest has developed in using artificial neural networks to model and control nonlinear dynamical systems. While there exist many proposed schemes for accomplishing this and a wealth of supporting empirical results, most approaches to date tend to be ad hoc in nature and rely mainly on heuristic justifications. The purpose of this project was to further develop some analytical tools for representing nonlinear discrete-time input-output systems, which when applied to neural networks would give insight on architecture selection, pruning strategies, and learning algorithms. A long-term goal is to determine in what sense, if any, a neural network can be used as a universal approximator for nonlinear input-output maps with memory (i.e., realized by a dynamical system). This property is well known for the case of static or memoryless input-output maps. The general architecture under consideration in this project was a single-input, single-output recurrent feedforward network.
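The architecture named above can be sketched concretely: a single-input, single-output feedforward network whose previous output is fed back as a second input, so the resulting map has memory. The hidden-layer size and weights below are arbitrary illustrative choices (untrained), intended only to show the input-output structure, not any result of the project.

```python
import numpy as np

# Sketch of a single-input, single-output recurrent feedforward network:
# one hidden tanh layer, with the previous output fed back as an input.
rng = np.random.default_rng(3)

n_hidden = 8
W_in = rng.standard_normal((n_hidden, 2)) * 0.5   # weights on [u_t, y_{t-1}]
b = rng.standard_normal(n_hidden) * 0.1
w_out = rng.standard_normal(n_hidden) * 0.5

def step(u: float, y_prev: float) -> float:
    """One forward pass of the feedforward net with output feedback."""
    h = np.tanh(W_in @ np.array([u, y_prev]) + b)
    return float(w_out @ h)

u_seq = np.sin(np.linspace(0, 4 * np.pi, 50))     # sample input signal
y, ys = 0.0, []
for u in u_seq:
    y = step(u, y)                                # memory via feedback
    ys.append(y)
print("first outputs:", np.round(ys[:3], 3))
```

Because the output at time t depends on the whole input history through the feedback path, this is exactly the kind of map-with-memory whose functional-expansion representation the project set out to analyze.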
Passman, Dina B.
2013-01-01
Objective The objective of this demonstration is to show conference attendees how they can integrate, analyze, and visualize diverse data types from across a variety of systems by leveraging an off-the-shelf enterprise business intelligence (EBI) solution to support decision-making in disasters. Introduction Fusion Analytics is the data integration system developed by the Fusion Cell at the U.S. Department of Health and Human Services (HHS), Office of the Assistant Secretary for Preparedness and Response (ASPR). Fusion Analytics meaningfully augments traditional public and population health surveillance reporting by providing web-based data analysis and visualization tools. Methods Fusion Analytics serves as a one-stop shop for web-based data visualizations of multiple real-time data sources within ASPR. The 24-7 web availability makes it an ideal analytic tool for situational awareness and response, allowing stakeholders to access the portal from any internet-enabled device without installing any software. The Fusion Analytics data integration system was built using off-the-shelf EBI software. Fusion Analytics leverages the full power of statistical analysis software and delivers reports to users in a secure web-based environment. Fusion Analytics provides an example of how public health staff can develop and deploy a robust public health informatics solution using an off-the-shelf product and with limited development funding. It also provides the unique example of a public health information system that combines patient data for traditional disease surveillance with manpower and resource data to provide overall decision support for federal public health and medical disaster response operations. Conclusions We are currently in a unique position within public health. On the one hand, we have been gaining greater and greater access to electronic data of all kinds over the last few years. 
On the other, we are working in a time of reduced government spending to support leveraging this data for decision support with robust analytics and visualizations. Fusion Analytics provides an opportunity for attendees to see how various types of data are integrated into a single application for population health decision support. It also can provide them with ideas of how they can use their own staff to create analyses and reports that support their public health activities.
Current projects in Pre-analytics: where to go?
Sapino, Anna; Annaratone, Laura; Marchiò, Caterina
2015-01-01
The current clinical practice of tissue handling and sample preparation is multifaceted and lacks strict standardisation: this scenario leads to significant variability in the quality of clinical samples. Poor tissue preservation has a detrimental effect, leading to morphological artefacts, hampering the reproducibility of immunocytochemical and molecular diagnostic results (protein expression, DNA gene mutations, RNA gene expression), and affecting research outcomes with irreproducible gene expression and post-transcriptional data. Altogether, this limits the opportunity to share and pool national databases into European common databases. At the European level, standardisation of pre-analytical steps is just at the beginning, and issues regarding bio-specimen collection and management are still debated. A joint (public-private) project on the standardisation of tissue handling in pre-analytical procedures has recently been funded in Italy with the aim of proposing novel approaches to the neglected issue of pre-analytical procedures. In this chapter, we will show how investing in pre-analytics may impact both public health problems and practical innovation in solid tumour processing.
Health Informatics for Neonatal Intensive Care Units: An Analytical Modeling Perspective
Mench-Bressan, Nadja; McGregor, Carolyn; Pugh, James Edward
2015-01-01
The effective use of data within intensive care units (ICUs) has great potential to create new cloud-based health analytics solutions for disease prevention or earlier detection of condition onset. The Artemis project aims to achieve these goals in the area of neonatal ICUs (NICU). In this paper, we propose an analytical model for the Artemis cloud project, which will be deployed at McMaster Children’s Hospital in Hamilton. We collect not only physiological data but also data from the infusion pumps attached to NICU beds. Using the proposed analytical model, we predict the amount of storage, memory, and computation power required for the system. Capacity planning and tradeoff analysis become more accurate and systematic by applying the proposed analytical model. Numerical results are obtained using real inputs acquired from McMaster Children’s Hospital and a pilot deployment of the system at The Hospital for Sick Children (SickKids) in Toronto. PMID:27170907
FINAL REPORT. DOE Grant Award Number DE-SC0004062
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chiesa, Luisa
With the support of the DOE-OFES Early Career Award and Tufts startup funding, the PI has developed experimental and analytical expertise in the electromechanical characterization of Low Temperature Superconductors (LTS) and High Temperature Superconductors (HTS) for high magnetic field applications. These superconducting wires and cables are used in fusion and high-energy physics magnet applications. In a short period of time, the PI has built a laboratory and research group with unique capabilities that include both experimental and numerical modeling efforts to improve the design and performance of superconducting cables and magnets. All the projects in the PI’s laboratory explore the fundamental electromechanical behavior of superconductors, but the types of materials, geometries, and operating conditions are chosen to be directly relevant to real machines, in particular fusion machines like ITER.
Carrying BioMath education in a Leaky Bucket.
Powell, James A; Kohler, Brynja R; Haefner, James W; Bodily, Janice
2012-09-01
In this paper, we describe a project-based mathematical lab implemented in our Applied Mathematics in Biology course. The Leaky Bucket Lab allows students to parameterize and test Torricelli's law and develop and compare their own alternative models to describe the dynamics of water draining from perforated containers. In the context of this lab students build facility in a variety of applied biomathematical tools and gain confidence in applying these tools in data-driven environments. We survey analytic approaches developed by students to illustrate the creativity this encourages as well as prepare other instructors to scaffold the student learning experience. Pedagogical results based on classroom videography support the notion that the Biology-Applied Math Instructional Model, the teaching framework encompassing the lab, is effective in encouraging and maintaining high-level cognition among students. Research-based pedagogical approaches that support the lab are discussed.
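The core model of the lab lends itself to a short worked example. Torricelli's law for a draining container, dh/dt = -k sqrt(h), has the closed-form solution h(t) = (sqrt(h0) - k t / 2)^2 until the container empties, so sqrt(h) is linear in time and k can be recovered by a straight-line fit. The parameter values below are invented for illustration, not the lab's data.

```python
import numpy as np

# Torricelli's law dh/dt = -k*sqrt(h) integrates to
#   h(t) = (sqrt(h0) - 0.5*k*t)**2   (valid while the bucket is draining).
k_true, h0 = 0.08, 25.0            # hypothetical drain constant, initial depth (cm)
t = np.linspace(0.0, 60.0, 30)     # observation times (s)
h = (np.sqrt(h0) - 0.5 * k_true * t) ** 2   # "measured" water depths

# Parameterization step: sqrt(h) is linear in t with slope -k/2,
# so an ordinary least-squares line recovers k.
slope, intercept = np.polyfit(t, np.sqrt(h), 1)
k_est = -2.0 * slope
```

With noisy classroom data the same linearization still applies; students can then compare the fitted Torricelli model against their own alternative drainage models on residuals.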
Hawkeye and AMOS: visualizing and assessing the quality of genome assemblies
Schatz, Michael C.; Phillippy, Adam M.; Sommer, Daniel D.; Delcher, Arthur L.; Puiu, Daniela; Narzisi, Giuseppe; Salzberg, Steven L.; Pop, Mihai
2013-01-01
Since its launch in 2004, the open-source AMOS project has released several innovative DNA sequence analysis applications including: Hawkeye, a visual analytics tool for inspecting the structure of genome assemblies; the Assembly Forensics and FRCurve pipelines for systematically evaluating the quality of a genome assembly; and AMOScmp, the first comparative genome assembler. These applications have been used to assemble and analyze dozens of genomes ranging in complexity from simple microbial species through mammalian genomes. Recent efforts have been focused on enhancing support for new data characteristics brought on by second- and now third-generation sequencing. This review describes the major components of AMOS in light of these challenges, with an emphasis on methods for assessing assembly quality and the visual analytics capabilities of Hawkeye. These interactive graphical aspects are essential for navigating and understanding the complexities of a genome assembly, from the overall genome structure down to individual bases. Hawkeye and AMOS are available open source at http://amos.sourceforge.net. PMID:22199379
NASA Astrophysics Data System (ADS)
Xie, Chuan-Mei; Liu, Yi-Min; Xing, Hang; Zhang, Zhan-Jun
2015-04-01
Quantum correlations in a family of states comprising arbitrary mixtures of a pair of arbitrary bi-qubit product pure states are studied, employing geometric discord [Phys. Rev. Lett. 105 (2010) 190502] as the quantifier. First, the inherent symmetry of the family of states under local unitary transformations is revealed. Then, the analytic expression for the geometric discord of the states is worked out. The resulting geometric discords are discussed and analyzed to expose their distinct features. It is found that the more evenly the two bi-qubit product states are mixed, the larger the geometric discord of the mixed state. Moreover, the monotonic relationships of geometric discord with different parameters are revealed. Supported by the National Natural Science Foundation of China (NNSFC) under Grant Nos. 11375011 and 11372122, the Natural Science Foundation of Anhui Province under Grant No. 1408085MA12, and the 211 Project of Anhui University
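The quantifier the abstract cites has a well-known closed form (Dakić, Vedral, and Brukner, the referenced PRL): for a two-qubit state with local Bloch vector x and correlation matrix T, the geometric discord is D = (||x||^2 + ||T||^2 - k_max)/4, where k_max is the largest eigenvalue of x x^T + T T^T. A sketch of that formula (not the paper's state family) is:

```python
import numpy as np

# Dakic-Vedral-Brukner closed form for two-qubit geometric discord:
#   D(rho) = (||x||^2 + ||T||^2 - k_max) / 4,
# with x_i = Tr[rho (sigma_i x I)], T_ij = Tr[rho (sigma_i x sigma_j)],
# and k_max the largest eigenvalue of x x^T + T T^T.
I2 = np.eye(2)
sig = [np.array([[0, 1], [1, 0]], dtype=complex),
       np.array([[0, -1j], [1j, 0]]),
       np.array([[1, 0], [0, -1]], dtype=complex)]

def geometric_discord(rho):
    x = np.array([np.trace(rho @ np.kron(s, I2)).real for s in sig])
    T = np.array([[np.trace(rho @ np.kron(si, sj)).real for sj in sig]
                  for si in sig])
    K = np.outer(x, x) + T @ T.T
    k_max = np.linalg.eigvalsh(K)[-1]          # eigvalsh returns ascending order
    return (x @ x + np.trace(T @ T.T) - k_max) / 4.0

# A maximally entangled Bell state |Phi+> gives D = 1/2.
phi = np.zeros(4, dtype=complex)
phi[0] = phi[3] = 1.0 / np.sqrt(2.0)
bell = np.outer(phi, phi.conj())
```

Product states give D = 0, consistent with geometric discord vanishing only on classical-quantum states.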
NASA Technical Reports Server (NTRS)
1975-01-01
An introduction to the MAPSEP organization and a detailed analytical description of all models and algorithms are given. These include trajectory and error covariance propagation methods, orbit determination processes, thrust modeling, and trajectory correction (guidance) schemes. Earth orbital MAPSEP contains the capability of analyzing almost any currently projected low thrust mission from low earth orbit to super synchronous altitudes. Furthermore, MAPSEP is sufficiently flexible to incorporate extended dynamic models, alternate mission strategies, and almost any other system requirement imposed by the user. As in the interplanetary version, earth orbital MAPSEP represents a trade-off between precision modeling and computational speed consistent with defining necessary system requirements. It can be used in feasibility studies as well as in flight operational support. Pertinent operational constraints are available both implicitly and explicitly. However, the reader should be warned that because of program complexity, MAPSEP is only as good as the user and will quickly succumb to faulty user inputs.
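One of the building blocks the abstract lists, error covariance propagation, can be sketched generically. MAPSEP's actual models are far richer; the discrete linear step below, with an invented position-velocity transition matrix, only illustrates the kind of computation meant.

```python
import numpy as np

# Generic linear error-covariance propagation over one time step:
#   P' = Phi P Phi^T + Q,
# where Phi is the state transition matrix and Q the process noise.
def propagate_covariance(P, Phi, Q):
    return Phi @ P @ Phi.T + Q

dt = 10.0                                   # hypothetical step (s)
Phi = np.array([[1.0, dt],                  # position-velocity transition
                [0.0, 1.0]])
P0 = np.diag([100.0, 0.01])                 # initial variances (m^2, (m/s)^2)
Q = np.diag([1e-3, 1e-6])                   # unmodeled-acceleration noise
P1 = propagate_covariance(P0, Phi, Q)
```

Velocity uncertainty feeds into position uncertainty through the off-diagonal coupling in Phi, which is why position variance grows even with no position noise.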
Two-dimensional fracture analysis of piezoelectric material based on the scaled boundary node method
NASA Astrophysics Data System (ADS)
Shen-Shen, Chen; Juan, Wang; Qing-Hua, Li
2016-04-01
A scaled boundary node method (SBNM) is developed for two-dimensional fracture analysis of piezoelectric material, which allows the stress and electric displacement intensity factors to be calculated directly and accurately. As a boundary-type meshless method, the SBNM employs the moving Kriging (MK) interpolation technique to approximate the unknown field in the circumferential direction, and therefore only a set of scattered nodes is required to discretize the boundary. As the shape functions satisfy the Kronecker delta property, no special techniques are required to impose the essential boundary conditions. In the radial direction, the SBNM seeks analytical solutions by making use of analytical techniques available for solving ordinary differential equations. Numerical examples are investigated and satisfactory solutions are obtained, which validates the accuracy and simplicity of the proposed approach. Project supported by the National Natural Science Foundation of China (Grant Nos. 11462006 and 21466012), the Foundation of Jiangxi Provincial Educational Committee, China (Grant No. KJLD14041), and the Foundation of East China Jiaotong University, China (Grant No. 09130020).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raboin, P J
1998-01-01
The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.
Comparative analytics of infusion pump data across multiple hospital systems.
Catlin, Ann Christine; Malloy, William X; Arthur, Karen J; Gaston, Cindy; Young, James; Fernando, Sudheera; Fernando, Ruchith
2015-02-15
A Web-based analytics system for conducting inhouse evaluations and cross-facility comparisons of alert data generated by smart infusion pumps is described. The Infusion Pump Informatics (IPI) project, a collaborative effort led by research scientists at Purdue University, was launched in 2009 to provide advanced analytics and tools for workflow analyses to assist hospitals in determining the significance of smart-pump alerts and reducing nuisance alerts. The IPI system allows facility-specific analyses of alert patterns and trends, as well as cross-facility comparisons of alert data uploaded by more than 55 participating institutions using different types of smart pumps. Tools accessible through the IPI portal include (1) charts displaying aggregated or breakout data on the top drugs associated with alerts, numbers of alerts per device or care area, and override-to-alert ratios, (2) investigative reports that can be used to characterize and analyze pump-programming errors in a variety of ways (e.g., by drug, by infusion type, by time of day), and (3) "drill-down" workflow analytics enabling users to evaluate alert patterns—both internally and in relation to patterns at other hospitals—in a quick and efficient stepwise fashion. The formation of the IPI analytics system to support a community of hospitals has been successful in providing sophisticated tools for member facilities to review, investigate, and efficiently analyze smart-pump alert data, not only within a member facility but also across other member facilities, to further enhance smart pump drug library design. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
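The aggregations behind the portal's charts, top drugs by alert count and override-to-alert ratios, are simple to sketch. The miniature alert log below is entirely invented; the IPI system's actual schema and tooling are not described at this level in the abstract.

```python
from collections import Counter

# Hypothetical smart-pump alert log: (drug, was_overridden) pairs.
alerts = [
    ("heparin", True), ("heparin", False), ("insulin", True),
    ("heparin", True), ("insulin", True), ("morphine", False),
]

# Alerts per drug, overrides per drug, and the override-to-alert ratio.
counts = Counter(drug for drug, _ in alerts)
overrides = Counter(drug for drug, overridden in alerts if overridden)
ratio = {drug: overrides[drug] / counts[drug] for drug in counts}
top = counts.most_common(1)[0]    # drug generating the most alerts
```

A high override ratio for a drug is the kind of signal the abstract describes for flagging nuisance alerts and refining drug library limits.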
xQuake: A Modern Approach to Seismic Network Analytics
NASA Astrophysics Data System (ADS)
Johnson, C. E.; Aikin, K. E.
2017-12-01
While seismic networks have expanded over the past few decades and social needs for accurate and timely information have increased dramatically, approaches to the operational needs of both global and regional seismic observatories have been slow to adopt new technologies. This presentation describes the xQuake system, which provides a fresh approach to seismic network analytics based on complexity theory and an adaptive architecture of streaming connected microservices as diverse data (picks, beams, and other data) flow into a final, curated catalog of events. The foundation for xQuake is the xGraph (executable graph) framework, essentially a self-organizing graph database. An xGraph instance provides both the analytics and the data storage capabilities at the same time. Much of the analytics, such as synthetic annealing in the detection process and an evolutionary programming approach for event evolution, draws from the recent GLASS 3.0 seismic associator developed by and for the USGS National Earthquake Information Center (NEIC). In some respects xQuake is reminiscent of the Earthworm system, in that it comprises processes interacting through store-and-forward rings; this is not surprising, as the first author was the lead architect of the original Earthworm project when it was known as "Rings and Things". While Earthworm components can easily be integrated into the xGraph processing framework, the architecture and analytics are more current (e.g. using a Kafka broker for store-and-forward rings). The xQuake system is being released under an unrestricted open source license to encourage and enable the seismic community's support in further developing its capabilities.
Witkiewitz, Katie; Hartzler, Bryan; Donovan, Dennis
2010-08-01
The current study was designed to re-examine the motivation matching hypothesis from Project MATCH using growth mixture modeling, an analytical technique that models variation in individual drinking patterns. Secondary analyses were conducted of data from Project MATCH (n = 1726), a large multi-site alcoholism treatment-matching study. Percentage of drinking days was the primary outcome measure, assessed from 1 month to 12 months following treatment. Treatment assignment, alcohol dependence symptoms, and baseline percentage of drinking days were included as covariates. The results provided support for the motivation matching hypothesis in the out-patient sample and among females in the aftercare sample: the majority of individuals with lower baseline motivation had better outcomes if assigned to motivational enhancement treatment (MET) compared to those assigned to cognitive behavioral treatment (CBT). In the aftercare sample there was a moderating effect of gender and alcohol dependence severity, whereby males with lower baseline motivation and greater alcohol dependence drank more frequently if assigned to MET compared to those assigned to CBT. Results from the current study lend partial support to the motivation-matching hypothesis and also demonstrate the importance of moderating influences on treatment-matching effectiveness. Based upon these findings, individuals with low baseline motivation in out-patient settings, and males with low levels of alcohol dependence or females in aftercare settings, may benefit more from motivational enhancement techniques than from cognitive-behavioral techniques.
Distributed Revisiting: An Analytic for Retention of Coherent Science Learning
ERIC Educational Resources Information Center
Svihla, Vanessa; Wester, Michael J.; Linn, Marcia C.
2015-01-01
Designing learning experiences that support the development of coherent understanding of complex scientific phenomena is challenging. We sought to identify analytics that can also guide such designs to support retention of coherent understanding. Based on prior research that distributing study of material over time supports retention, we explored…
Dynamic mobility applications analytical needs assessment.
DOT National Transportation Integrated Search
2012-07-01
Dynamic Mobility Applications Analytical Needs Assessment was a one-year project (July 2011 to July 2012) to develop a strategy for assessing the potential impact of twenty-eight applications for improved mobility across national transportation syste...
Field Sampling and Selecting On-Site Analytical Methods for Explosives in Soil
The purpose of this issue paper is to provide guidance to Remedial Project Managers regarding field sampling and on-site analytical methods fordetecting and quantifying secondary explosive compounds in soils.
An anisotropic thermal-stress model for through-silicon via
NASA Astrophysics Data System (ADS)
Liu, Song; Shan, Guangbao
2018-02-01
A two-dimensional thermal-stress model of through-silicon vias (TSVs) is proposed that considers the anisotropic elastic property of the silicon substrate. By using the complex variable approach, the distribution of thermal stress in the substrate can be characterized more accurately. TCAD 3-D simulations are used to verify the model accuracy and agree well with the analytical results (within ±5%). The proposed thermal-stress model can be integrated into a stress-driven design flow for 3-D ICs, leading to more accurate timing analysis that accounts for the thermal-stress effect. Project supported by the Aerospace Advanced Manufacturing Technology Research Joint Fund (No. U1537208).
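For context, the standard isotropic baseline that anisotropic models like this refine is the classical 2-D Lamé solution: a via of radius a induces, in the surrounding substrate, a compressive radial stress and an equal-magnitude tensile hoop stress, both decaying as 1/r^2. The material numbers below are typical textbook values chosen for illustration, not the paper's parameters.

```python
import numpy as np

# Isotropic 2-D Lame baseline for TSV thermal stress in the substrate:
#   sigma_rr(r) = -p * (a/r)**2,   sigma_tt(r) = +p * (a/r)**2,
#   p = E * d_alpha * dT / (2 * (1 - nu)).
E, nu = 130e9, 0.28        # illustrative Si modulus (Pa), Poisson ratio
d_alpha = 14.3e-6          # illustrative Cu-Si CTE mismatch (1/K)
dT = 250.0                 # illustrative thermal load (K)
a = 5e-6                   # via radius (m)

p = E * d_alpha * dT / (2.0 * (1.0 - nu))
r = np.linspace(a, 10 * a, 100)
sigma_rr = -p * (a / r) ** 2   # radial stress: compressive, 1/r^2 decay
sigma_tt = +p * (a / r) ** 2   # hoop stress: tensile, equal magnitude
```

In an anisotropic substrate the prefactor becomes direction-dependent, which is the effect the paper's complex-variable model captures.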
SWMM5 Application Programming Interface and PySWMM: A ...
In support of the OpenWaterAnalytics open source initiative, the PySWMM project encompasses the development of a Python interfacing wrapper to SWMM5, with parallel ongoing development of the USEPA Stormwater Management Model (SWMM5) application programming interface (API). ... The purpose of this work is to increase the utility of the SWMM dll by creating a Toolkit API for accessing its functionality. The utility of the Toolkit is further enhanced with a wrapper that allows access from the Python scripting language. This work is being pursued as part of an Open Source development strategy and is being performed by volunteer software developers.
Characteristics of process oils from HTI coal/plastics co-liquefaction runs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robbins, G.A.; Brandes, S.D.; Winschel, R.A.
1995-12-31
The objective of this project is to provide timely analytical support to DOE's liquefaction development effort. Specific objectives of the work reported here are presented. During a few operating periods of Run POC-2, HTI co-liquefied mixed plastics with coal, and tire rubber with coal. Although steady-state operation was not achieved during these brief test periods, the results indicated that a liquefaction plant could operate with these waste materials as feedstocks. CONSOL analyzed 65 process stream samples from the coal-only and coal/waste portions of the run. Some results obtained from the characterization of samples from the Run POC-2 coal/plastics operation are presented.
NASA Astrophysics Data System (ADS)
Zhang, Qing-Kun; Wang, Lin; Li, Wei-Min; Gao, Wei-Wei
2015-12-01
The upgrade project of the Hefei Light Source storage ring is under way. In this paper, the broadband impedances of the resistive wall and the coated ceramic vacuum chamber are calculated using analytic formulas, the wake fields and impedances of the other designed vacuum chambers are simulated with the CST code, and a broadband impedance model is then obtained. Using theoretical formulas, longitudinal and transverse single-bunch instabilities are discussed. With the carefully designed vacuum chamber, we find that the thresholds of the beam instabilities are higher than the beam current goal. Supported by the Natural Science Foundation of China (11175182, 11175180)
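One of the analytic formulas such estimates typically start from is the thick-wall longitudinal resistive-wall impedance, in which the impedance is the wall's surface impedance spread over the chamber circumference. The chamber and material parameters below are invented placeholders, not Hefei Light Source values.

```python
import numpy as np

# Thick-wall longitudinal resistive-wall impedance (standard textbook form):
#   Z_par(w) = (1 + 1j) * (L / (2*pi*b)) * rho / delta(w),
# with skin depth delta(w) = sqrt(2*rho / (mu0 * w)) for w > 0.
mu0 = 4e-7 * np.pi
rho = 1.7e-8        # wall resistivity, copper-like (Ohm*m), illustrative
b = 0.02            # chamber half-aperture (m), illustrative
L = 66.0            # chamber length (m), illustrative

def z_longitudinal(w):
    delta = np.sqrt(2.0 * rho / (mu0 * w))          # skin depth at frequency w
    return (1.0 + 1.0j) * L * rho / (2.0 * np.pi * b * delta)

Z = z_longitudinal(2.0 * np.pi * 1e6)               # impedance at 1 MHz
```

The real and imaginary parts are equal in this regime, and both scale as the square root of frequency, since the impedance is inversely proportional to the skin depth.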
Finding the forest in the trees. The challenge of combining diverse environmental data
NASA Technical Reports Server (NTRS)
1995-01-01
The development of analytical and functional guidelines to help researchers and technicians engaged in interdisciplinary research to better plan and implement their supporting data management activities is addressed. Emphasis is on projects that involve both geophysical and ecological issues. Six case studies were used to identify and understand problems associated with collecting, integrating, and analyzing environmental data from local to global spatial scales and over a range of temporal scales. These case studies were also used to elaborate on the common barriers to interfacing data of disparate sources and types. A number of lessons derived from the case studies are summarized and analyzed.
The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research
ERIC Educational Resources Information Center
Siemens, George
2014-01-01
The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified. The journal is the most significant new initiative of SoLAR.
Dynamic Analytics-Driven Assessment of Vulnerabilities and Exploitation
2016-07-15
...integration with big data technologies such as Hadoop, nor does it natively support exporting of events to external relational databases. OSSIM supports... power of big data analytics to determine correlations and temporal causality among vulnerabilities and cyber events. The vulnerability dependencies... via the SCAPE (formerly known as LLCySA [6]). This is illustrated as a big data cyber analytic system architecture in...
REAL-TIME MONITORING OF DIOXINS AND OTHER ...
This project is part of EPA's EMPACT program, which was begun in 1998 and is jointly administered by EPA's Office of Research and Development, the National Center for Environmental Research and Quality Assurance (NCERQA), and the National Center for Environmental Assessment. The program was developed to provide the public with understandable, timely environmental information on various research initiatives and issues of importance. This particular project involves development of the application of an on-line, real-time, trace organic air toxic monitor, with special emphasis on dioxin-related compounds. Research efforts demonstrate the utility and usefulness of the Resonance Enhanced Multi-Photon Ionization (REMPI) analytical method for trace organics control, monitoring, and compliance assurance. Project objectives are to develop the REMPI instrumental method into a tool that will be used for assessment of potential dioxin sources, control and prevention of dioxin formation in known sources, and communication of facility performance. This will be accomplished through instrument development, laboratory verification, thermokinetic modelling, equilibrium modelling, statistical determinations, field validation, program publication and presentation, regulatory office support, and development of data communication/presentation procedures. For additional information on this EMPACT project, visit the website at http://www.epa.gov/appcdwww/crb/empa
NASA Astrophysics Data System (ADS)
Banerjee, Polash; Ghose, Mrinal Kanti; Pradhan, Ratika
2018-05-01
Spatial analysis of water quality impact assessment of highway projects in mountainous areas remains largely unexplored. A methodology is presented here for Spatial Water Quality Impact Assessment (SWQIA) due to highway-broadening-induced vehicular traffic change in the East district of Sikkim. The pollution load of the highway runoff was estimated using an Average Annual Daily Traffic-based empirical model in combination with a mass balance model to predict pollution in the rivers within the study area. Spatial interpolation and overlay analysis were used for impact mapping. An Analytic Hierarchy Process-based Water Quality Status Index was used to prepare a composite impact map. Model validation criteria, cross-validation criteria, and spatially explicit sensitivity analysis show that the SWQIA model is robust. The study shows that vehicular traffic is a significant contributor to water pollution in the study area. The model caters specifically to impact analysis of the project concerned and can serve as an aid to a decision support system for the project stakeholders. The applicability of the SWQIA model needs to be explored and validated for a larger set of water quality parameters and project scenarios at a greater spatial scale.
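The mass balance step the methodology couples to the traffic-based runoff model can be written in one line: a runoff inflow is diluted into the receiving river as a flow-weighted average of concentrations. The values below are invented for illustration, not the study's data.

```python
# Mass balance (dilution) at a runoff inflow into a river:
#   C_mix = (C_river*Q_river + C_runoff*Q_runoff) / (Q_river + Q_runoff),
# with concentrations in mg/L and flows in m^3/s.
def mixed_concentration(c_river, q_river, c_runoff, q_runoff):
    """Flow-weighted pollutant concentration downstream of the inflow."""
    return (c_river * q_river + c_runoff * q_runoff) / (q_river + q_runoff)

# Hypothetical numbers: a small, heavily loaded highway runoff entering
# a larger, cleaner river.
c_mix = mixed_concentration(c_river=2.0, q_river=9.0,
                            c_runoff=40.0, q_runoff=1.0)
```

Even a small runoff flow can raise downstream concentration substantially when its pollutant load is high, which is why traffic volume enters the assessment directly.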
MIT CSAIL and Lincoln Laboratory Task Force Report
2016-08-01
projects have been very diverse, spanning several areas of CSAIL concentration, including robotics, big data analytics, wireless communications, computing architectures and... machine learning systems and algorithms, such as recommender systems, and "Big Data" analytics. Advanced computing architectures broadly refer to
Karakülah, G.; Dicle, O.; Sökmen, S.; Çelikoğlu, C.C.
2015-01-01
Summary Background The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians’ decision making. Objective The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. Methods The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priority of criteria, with a decision tree formed using these priorities, was updated and applied to retrospectively collected data from 388 patients. A web-based decision support tool named corRECTreatment was then developed. The treatment recommendations of the tool were examined for consistency with expert opinion. Two surgeons were asked to recommend a treatment and an overall survival value for 20 different cases, selected from among the most common and rare treatment options in the patient data set and turned into scenarios. Results In the AHP analyses of the criteria, the matrices generated for both decision steps were found to be consistent (consistency ratio < 0.1). Relative to the experts' decisions, the consistency value for the most frequent cases was 80% for the first decision step and 100% for the second decision step; for rare cases, consistency was 50% for the first decision step and 80% for the second decision step. Conclusions The decision model and corRECTreatment, developed by applying the model to real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and to facilitate projections about treatment options.
PMID:25848413
Suner, A; Karakülah, G; Dicle, O; Sökmen, S; Çelikoğlu, C C
2015-01-01
The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians' decision making. The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priority of criteria, with a decision tree formed using these priorities, was updated and applied to retrospectively collected data from 388 patients. A web-based decision support tool named corRECTreatment was then developed. The treatment recommendations of the tool were examined for consistency with expert opinion. Two surgeons were asked to recommend a treatment and an overall survival value for 20 different cases, selected from among the most common and rare treatment options in the patient data set and turned into scenarios. In the AHP analyses of the criteria, the matrices generated for both decision steps were found to be consistent (consistency ratio < 0.1). Relative to the experts' decisions, the consistency value for the most frequent cases was 80% for the first decision step and 100% for the second decision step; for rare cases, consistency was 50% for the first decision step and 80% for the second decision step. The decision model and corRECTreatment, developed by applying the model to real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and to facilitate projections about treatment options.
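The AHP step both versions of this abstract rely on can be sketched concretely: criterion priorities come from the principal eigenvector of a pairwise-comparison matrix, and the matrix is accepted as consistent when the consistency ratio CR = ((lambda_max - n)/(n - 1)) / RI falls below 0.1. The 3x3 judgment matrix below is a generic textbook example, not the study's.

```python
import numpy as np

# AHP: priorities from the principal eigenvector of a pairwise-comparison
# matrix A; consistency via CI = (lambda_max - n)/(n - 1), CR = CI / RI(n).
RI = {3: 0.58, 4: 0.90, 5: 1.12}   # Saaty's random-consistency indices

def ahp_priorities(A):
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                  # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                              # normalized priority weights
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)
    return w, ci / RI[n]

A = np.array([[1.0,   3.0, 5.0],              # hypothetical judgments:
              [1/3.0, 1.0, 3.0],              # criterion 1 > 2 > 3
              [1/5.0, 1/3.0, 1.0]])
weights, cr = ahp_priorities(A)
```

With CR below 0.1 the derived weights can be used downstream, here as the priorities feeding the decision tree.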
ENergy and Power Evaluation Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-11-01
In the late 1970s, national and international attention began to focus on energy issues. Efforts were initiated to design and test analytical tools that could be used to assist energy planners in evaluating energy systems, particularly in developing countries. In 1984, the United States Department of Energy (DOE) commissioned Argonne National Laboratory's Decision and Information Sciences Division (DIS) to incorporate a set of analytical tools into a personal computer-based package for distribution in developing countries. The package developed by DIS staff, the ENergy and Power Evaluation Program (ENPEP), covers the range of issues that energy planners must face: economic development, energy demand projections, supply-and-demand balancing, energy system expansion, and environmental impact analysis. Following the original DOE-supported development effort, the International Atomic Energy Agency (IAEA), with assistance from the US Department of State (DOS) and the US Department of Energy (DOE), provided ENPEP training, distribution, and technical support to many countries. ENPEP is now in use in over 60 countries and is an international standard for energy planning tools. More than 500 energy experts have been trained in the use of the entire ENPEP package or some of its modules during the international training courses organized by the IAEA in collaboration with Argonne's Decision and Information Sciences (DIS) Division and the Division of Educational Programs (DEP). This report contains the ENPEP program, which can be downloaded from the internet. The report describes the ENPEP program, along with news, forums, online support, and contacts.
ERIC Educational Resources Information Center
Prati, Gabriele; Pietrantoni, Luca
2010-01-01
There are plenty of theories that may support the protective role of social support in the aftermath of potentially traumatic events. This meta-analytic review examined the role of received and perceived social support in promoting mental health among first responders (e.g., firefighters, police officers, and paramedics or emergency medical…
A reference web architecture and patterns for real-time visual analytics on large streaming data
NASA Astrophysics Data System (ADS)
Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer
2013-12-01
Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
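The distinction drawn above between incremental/global and rolling-window analytic scope can be illustrated with a minimal sketch; the class names and toy stream are illustrative, not from the paper.

```python
from collections import deque

class IncrementalMean:
    """Global/incremental scope: folds every arriving value into one running statistic."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
    def update(self, x):
        self.n += 1
        self.mean += (x - self.mean) / self.n  # running mean, O(1) memory
        return self.mean

class RollingMean:
    """Rolling-window scope: only the last `size` values contribute."""
    def __init__(self, size):
        self.window = deque(maxlen=size)
    def update(self, x):
        self.window.append(x)
        return sum(self.window) / len(self.window)

stream = [10, 20, 30, 40, 50]
inc, roll = IncrementalMean(), RollingMean(size=3)
for x in stream:
    g, w = inc.update(x), roll.update(x)
print(g, w)  # prints 30.0 40.0
```

A pluggable system of the kind the paper describes would swap such analytic components behind a common `update` interface while visualizations subscribe to their outputs.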
1985-06-28
1984 to April 1985 included installation of 27 monitor wells and 11 …, collection of … samples from surface soil, shallow… modifying a sampling and analytical program that addresses the requirements of the project. If project requirements necessitate different quality… reagent blank and at least five (5) different concentrations of the analyte. A modification of the method of Hubaux and Vos will be used to determine…
Space Projects and Research by Kids (SPARK): A Web Based Research Journal for Middle School Students
NASA Astrophysics Data System (ADS)
Limaye, S. S.; Pertzborn, R. A.
1999-05-01
Project SPARK is designed to facilitate opportunities for upper elementary and middle school students to develop the necessary skills to conduct investigations that focus on the subjects of astronomy, space exploration, and earth remote sensing. This program actively engages students in conducting their own research project to acquire increased understanding and content knowledge in the space sciences. While the development of scientific inquiry skills and content literacy is the primary focus, students also enhance their critical thinking, analytical, technological and communications skills. As in the professional science community, the web-based SPARK Journal presents an avenue for students to effectively communicate the results of their investigations and work to classmates as well as the "global learning community" via the world wide web. Educational outreach staff at the Space Science and Engineering Center have developed active partnerships with teachers and schools throughout Wisconsin to facilitate the development of standards-based curriculum and research projects focusing on current topics in the space sciences. Student research projects and activities arising from these initiatives were submitted in the Spring and Fall of 1998 for inclusion in SPARK, Volume 1. The second volume of SPARK will be published in Spring, 1999. Support for the development of this journal was provided by the NASA/IDEAS Program.
Exploration of Antarctic Subglacial environments: a challenge for analytical chemistry
NASA Astrophysics Data System (ADS)
Traversi, R.; Becagli, S.; Castellano, E.; Ghedini, C.; Marino, F.; Rugi, F.; Severi, M.; Udisti, R.
2009-12-01
The large number of subglacial lakes detected in the Dome C area in East Antarctica suggests that this region may be a valuable source of paleo-records essential for understanding the evolution of the Antarctic ice cap and climate changes over the last several million years. In the framework of the Project on “Exploration and characterization of Concordia Lake, Antarctica”, supported by the Italian Program for Antarctic Research (PNRA), a glaciological investigation of the Dome C “Lake District” is planned. Indeed, the glacio-chemical characterisation of the ice column over subglacial lakes will make it possible to evaluate the fluxes of major and trace chemical species along the ice column and in the accreted ice and, consequently, the availability of nutrients and oligo-elements for possible biological activity in the lake water and sediments. Melting and freezing at the base of the ice sheet should be able to deliver carbon and salts to the lake, as observed for the Vostok subglacial lake, which are thought to be able to support a low concentration of micro-organisms for extended periods of time. Thus, this investigation represents the first step toward exploring subglacial environments, including sampling and analysis of accreted ice, lake water and sediments. In order to perform reliable analytical measurements, especially of trace chemical species, clean sub-sampling and analytical techniques are required. For this purpose, the techniques already used by the CHIMPAC laboratory (Florence University) in the framework of international Antarctic drilling projects (EPICA - European Project for Ice Coring in Antarctica, TALDICE - TALos Dome ICE core, ANDRILL MIS - ANTarctic DRILLing McMurdo Ice Shelf) were optimised and new techniques were developed to ensure safe sample handling.
The CHIMPAC laboratory has been involved for several years in the study of the Antarctic continent, primarily focused on understanding the bio-geo-chemical cycles of chemical markers and the interpretation of their records in sedimentary archives (ice cores, sediment cores). This activity takes advantage of facilities for storage, decontamination and pre-analysis treatment of ice and sediment strips (a cold room equipped with laminar flow hoods and decontamination devices at different automation levels, a class 10000 clean room, systems for the complete acid digestion of sediment samples, production of ultra-pure acids and granulometric selection of sediments) and for the analytical determination of a wide range of chemical tracers. In particular, the operative instrumental set includes several Ion Chromatographs for the measurement of inorganic and selected organic ions (by classical Ion Chromatography and Fast Ion Chromatography), Atomic Absorption and Emission Spectrometers (F-AAS, GF-AAS, ICP-AES) and Inductively Coupled Plasma - Sector Field Mass Spectrometry (ICP-SFMS) for the analysis of the soluble or “available” inorganic fraction, together with Ion Beam Analysis techniques for elemental composition (PIXE-PIGE, in collaboration with INFN and the Physics Institute of Florence University) and geochemical analysis (SEM-EDS).
NASA Astrophysics Data System (ADS)
Meyer, Hanna; Authmann, Christian; Dreber, Niels; Hess, Bastian; Kellner, Klaus; Morgenthal, Theunis; Nauss, Thomas; Seeger, Bernhard; Tsvuura, Zivanai; Wiegand, Kerstin
2017-04-01
Bush encroachment is a syndrome of land degradation that occurs in many savannas, including those of southern Africa. The increase in density, cover or biomass of woody vegetation often has negative effects on a range of ecosystem functions and services, which are hardly reversible. However, despite its importance, neither the causes of bush encroachment nor the consequences of different resource management strategies to combat or mitigate related shifts in savanna states are fully understood. The project "IDESSA" (An Integrative Decision Support System for Sustainable Rangeland Management in Southern African Savannas) aims to improve the understanding of the complex interplay between land use, climate patterns and vegetation dynamics and to implement an integrative monitoring and decision-support system for the sustainable management of different savanna types. For this purpose, IDESSA follows an innovative approach that integrates local knowledge, botanical surveys, remote-sensing and machine-learning based time-series of atmospheric and land-cover dynamics, spatially explicit simulation modeling and analytical database management. The integration of the heterogeneous data will be implemented in a user-oriented database infrastructure and scientific workflow system. Accessible via web-based interfaces, this database and analysis system will allow scientists to manage and analyze monitoring data and scenario computations, and allow stakeholders (e.g., land users, policy makers) to retrieve current ecosystem information and seasonal outlooks. We present the concept of the project and show preliminary results of the realization steps towards the integrative savanna management and decision-support system.
CALM: Complex Adaptive System (CAS)-Based Decision Support for Enabling Organizational Change
NASA Astrophysics Data System (ADS)
Adler, Richard M.; Koehn, David J.
Guiding organizations through transformational changes such as restructuring or adopting new technologies is a daunting task. Such changes generate workforce uncertainty, fear, and resistance, reducing morale, focus and performance. Conventional project management techniques fail to mitigate these disruptive effects, because social and individual changes are non-mechanistic, organic phenomena. CALM (for Change, Adaptation, Learning Model) is an innovative decision support system for enabling change based on CAS principles. CALM provides a low risk method for validating and refining change strategies that combines scenario planning techniques with "what-if" behavioral simulation. In essence, CALM "test drives" change strategies before rolling them out, allowing organizations to practice and learn from virtual rather than actual mistakes. This paper describes the CALM modeling methodology, including our metrics for measuring organizational readiness to respond to change and other major CALM scenario elements: prospective change strategies; alternate futures; and key situational dynamics. We then describe CALM's simulation engine for projecting scenario outcomes and its associated analytics. CALM's simulator unifies diverse behavioral simulation paradigms including: adaptive agents; system dynamics; Monte Carlo; event- and process-based techniques. CALM's embodiment of CAS dynamics helps organizations reduce risk and improve confidence and consistency in critical strategies for enabling transformations.
NASA Technical Reports Server (NTRS)
Lueck, Dale E.; Captain, Janine E.; Gibson, Tracy L.; Peterson, Barbara V.; Berger, Cristina M.; Levine, Lanfang
2008-01-01
The RESOLVE project requires an analytical system to identify and quantitate the volatiles released from a lunar drill core sample as it is crushed and heated to 150 C. The expected gases and their range of concentrations were used to assess Gas Chromatography (GC) and Mass Spectrometry (MS), along with specific analyzers, for use on this potential lunar lander. The ability of these systems to accurately quantitate water and hydrogen in an unknown matrix led to the selection of a small MEMS commercial process GC for use in this project. The modification, development and testing of this instrument for the specific needs of the project are covered.
Inhibition of viscous fluid fingering: A variational scheme for optimal flow rates
NASA Astrophysics Data System (ADS)
Miranda, Jose; Dias, Eduardo; Alvarez-Lacalle, Enrique; Carvalho, Marcio
2012-11-01
Conventional viscous fingering flow in radial Hele-Shaw cells employs a constant injection rate, resulting in the emergence of branched interfacial shapes. The search for mechanisms to prevent the development of these bifurcated morphologies is relevant to a number of areas in science and technology. A challenging problem is how best to choose the pumping rate in order to restrain growth of interfacial amplitudes. We use an analytical variational scheme to look for the precise functional form of such an optimal flow rate. We find it increases linearly with time in a specific manner so that interface disturbances are minimized. Experiments and nonlinear numerical simulations support the effectiveness of this particularly simple, but not at all obvious, pattern controlling process. J.A.M., E.O.D. and M.S.C. thank CNPq/Brazil for financial support. E.A.L. acknowledges support from Secretaria de Estado de IDI Spain under project FIS2011-28820-C02-01.
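The linear-stability reasoning behind such a control scheme can be sketched. The growth rate below is the standard first-order result for radial Hele-Shaw flow, stated here under assumed notation ($Q$ injection rate, $R$ interface radius, $b$ cell gap, $\sigma$ surface tension, $\mu$ viscosity of the displaced fluid, $n$ azimuthal mode number); it is not a formula quoted from this abstract.

```latex
\lambda(n) \;=\; \frac{Q}{2\pi R^{2}}\,(n-1) \;-\; \frac{\sigma b^{2}}{12\,\mu R^{3}}\, n\,(n^{2}-1)
```

The first term (destabilizing, driven by injection) competes with the second (stabilizing, surface tension). Minimizing the accumulated growth of the dominant mode subject to injecting a fixed fluid volume in a fixed time is the variational problem the authors solve, and it is this optimization that yields the linearly increasing injection rate reported in the abstract.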
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawton, Craig R.
2015-01-01
The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents significant opportunity to Sandia in supporting the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior level decision makers to better understand their enterprise and required future investments.
FETC/EPRI Biomass Cofiring Cooperative Agreement. Quarterly technical report, April 1-June 30, 1997
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, E.; Tillman, D.
1997-12-01
The FETC/EPRI Biomass Cofiring Program has accelerated the pace of cofiring development by increasing the testing activities plus the support activities for interpreting test results. Past tests conducted and analyzed include the Allen Fossil Plant and Seward Generating Station programs. On-going tests include the Colbert Fossil Plant precommercial test program, the Greenidge Station commercialization program, and the Blount St. Station switchgrass program. Tests in the formative stages included the NIPSCO cofiring test at Michigan City Generating Station. Analytical activities included modeling and related support functions required to analyze the cofiring test results, and to place those results into context. Among these activities is the fuel availability study in the Pittsburgh, PA area. This study, conducted for Duquesne Light, supports their initial investigation into reburn technology using wood waste as a fuel. This Quarterly Report, covering the third quarter of the FETC/EPRI Biomass Cofiring Program, highlights the progress made on the 16 projects funded under this cooperative agreement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witwer, Keith S.; Dysland, Eric J.; Garfield, J. S.
2008-02-22
The GeoMelt® In-Container Vitrification™ (ICV™) process was selected by the U.S. Department of Energy (DOE) in 2004 for further evaluation as the supplemental treatment technology for Hanford’s low-activity waste (LAW). Also referred to as “bulk vitrification,” this process combines glass forming minerals, LAW, and chemical amendments; dries the mixture; and then vitrifies the material in a refractory-lined steel container. AMEC Nuclear Ltd. (AMEC) is adapting its GeoMelt ICV™ technology for this application with technical and analytical support from Pacific Northwest National Laboratory (PNNL). The DBVS project is funded by the DOE Office of River Protection and administered by CH2M HILL Hanford Group, Inc. The Demonstration Bulk Vitrification Project (DBVS) was initiated to engineer, construct, and operate a full-scale bulk vitrification pilot-plant to treat up to 750,000 liters of LAW from Waste Tank 241-S-109 at the DOE Hanford Site. Since the beginning of the DBVS project in 2004, testing has used laboratory, crucible-scale, and engineering-scale equipment to help establish process limitations of selected glass formulations and identify operational issues. Full-scale testing has provided critical design verification of the ICV™ process before operating the Hanford pilot-plant. In 2007, the project’s fifth full-scale test, called FS-38D (also known as the Integrated Dryer Melter Test, or IDMT), was performed. This test had three primary objectives: 1) Demonstrate the simultaneous and integrated operation of the ICV™ melter with a 10,000-liter dryer, 2) Demonstrate the effectiveness of a new feed reformulation and change in process methodology towards reducing the production and migration of molten ionic salts (MIS), and 3) Demonstrate that an acceptable glass product is produced under these conditions. Testing was performed from August 8 to 17, 2007.
Process and analytical results demonstrated that the primary test objectives, along with a dozen supporting objectives, were successfully met. Glass performance exceeded all disposal performance criteria. A previous issue with MIS containment was successfully resolved in FS-38D, and the ICV™ melter was integrated with a full-scale, 10,000-liter dryer. This paper describes the rationale for performing the test, the purpose and outcome of scale-up tests preceding it, and the performance and outcome of FS-38D.
PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czuchlewski, Kristina Rodriguez; Hart, William E.
Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data.
The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into overlooked relationships and patterns. The capability is directly relevant to the nation's nonproliferation remote-sensing activities and has broad national security applications for military and intelligence-gathering organizations.
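The idea of discovering activity patterns in spatial-temporal data via a graph, as described above, can be sketched minimally: link observations that are close in both space and time, then treat connected components as candidate activity clusters. The data, thresholds, and component-based clustering below are illustrative assumptions, not PANTHER's actual algorithms.

```python
from itertools import combinations
from math import hypot

# Toy observations: (id, x, y, t) -- synthetic, for illustration only.
obs = [("a", 0.0, 0.0, 0), ("b", 0.1, 0.0, 1), ("c", 5.0, 5.0, 0),
       ("d", 5.1, 5.0, 2), ("e", 9.0, 0.0, 50)]

D_MAX, T_MAX = 0.5, 5  # spatial and temporal proximity thresholds (assumed)

# Build the graph: an edge links observations close in both space and time.
adj = {o[0]: set() for o in obs}
for (i, xi, yi, ti), (j, xj, yj, tj) in combinations(obs, 2):
    if hypot(xi - xj, yi - yj) <= D_MAX and abs(ti - tj) <= T_MAX:
        adj[i].add(j)
        adj[j].add(i)

# Connected components = candidate co-located, co-temporal activity clusters.
seen, clusters = set(), []
for v in adj:
    if v in seen:
        continue
    stack, comp = [v], set()
    while stack:
        u = stack.pop()
        if u in comp:
            continue
        comp.add(u)
        stack.extend(adj[u] - comp)
    seen |= comp
    clusters.append(sorted(comp))
print(clusters)  # [['a', 'b'], ['c', 'd'], ['e']]
```

A semantic graph as used in the project would additionally type the nodes and edges (entities, relations, time intervals); this sketch shows only the spatial-temporal linkage step.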
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bevolo, A.J.; Kjartanson, B.H.; Wonder, J.D.
1996-03-01
The goal of the Ames Expedited Site Characterization (ESC) project is to evaluate and promote both innovative technologies (IT) and state-of-the-practice technologies (SOPT) for site characterization and monitoring. In April and May 1994, the ESC project conducted site characterization, technology comparison, and stakeholder demonstration activities at a former manufactured gas plant (FMGP) owned by Iowa Electric Services (IES) Utilities, Inc., in Marshalltown, Iowa. Three areas of technology were fielded at the Marshalltown FMGP site: geophysical, analytical and data integration. The geophysical technologies are designed to assess the subsurface geological conditions so that the location, fate and transport of the target contaminants may be assessed and forecasted. The analytical technologies/methods are designed to detect and quantify the target contaminants. The data integration technology area consists of hardware and software systems designed to integrate all the site information compiled and collected into a conceptual site model on a daily basis at the site; this conceptual model then becomes the decision-support tool. Simultaneous fielding of different methods within each of the three areas of technology provided data for direct comparison of the technologies fielded, both SOPT and IT. This document reports the results of the site characterization, technology comparison, and ESC demonstration activities associated with the Marshalltown FMGP site. 124 figs., 27 tabs.
Exploration Laboratory Analysis
NASA Technical Reports Server (NTRS)
Krihak, M.; Ronzano, K.; Shaw, T.
2016-01-01
The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) risk to minimize or reduce the risk of adverse health outcomes and decrements in performance due to in-flight medical capabilities on human exploration missions. To mitigate this risk, the availability of in-flight laboratory analysis instrumentation has been identified as an essential capability for manned exploration missions. Since a single, compact space-ready laboratory analysis capability to perform all exploration clinical measurements is not commercially available, the ELA project objective is to demonstrate the feasibility of emerging operational and analytical capability as a biomedical diagnostics precursor to long duration manned exploration missions. The initial step towards ground and flight demonstrations in fiscal year (FY) 2015 was the downselection of platform technologies for demonstrations in the space environment. The technologies selected included two Small Business Innovation Research (SBIR) performers: DNA Medicine Institute's rHEALTH X and Intelligent Optical Systems' lateral flow assays combined with Holomic's smartphone analyzer. The selection of these technologies was based on their compact size, breadth of analytical capability and favorable ability to process fluids in a space environment, among several factors. These two technologies will be advanced to meet ground and flight demonstration success criteria and requirements that will be finalized in FY16. Also, the downselected performers will continue the technology development phase towards meeting prototype deliverables in either late 2016 or 2017.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakafuji, Dora; Gouveia, Lauren
This project supports development of the next-generation, integrated energy management system (EMS) infrastructure, able to incorporate advanced visualization of behind-the-meter distributed resource information and probabilistic renewable energy generation forecasts to inform real-time operational decisions. The project involves end-users and active feedback from a Utility Advisory Team (UAT) to help inform how information can be used to enhance operational functions (e.g. unit commitment, load forecasting, Automatic Generation Control (AGC) reserve monitoring, ramp alerts) within two major EMS platforms. Objectives include: Engaging utility operations personnel to develop user input on displays, set expectations, test and review; Developing ease-of-use and timeliness metrics for measuring enhancements; Developing prototype integrated capabilities within two operational EMS environments; Demonstrating an integrated decision analysis platform with real-time wind and solar forecasting information and timely distributed resource information; Seamlessly integrating new 4-dimensional information into operations without increasing workload and complexity; Developing sufficient analytics to inform and confidently transform and adopt new operating practices and procedures; Disseminating project lessons learned through industry-sponsored workshops and conferences; and Building on collaborative utility-vendor partnerships and industry capabilities.
Influence of consumers' cognitive style on results from projective mapping.
Varela, Paula; Antúnez, Lucía; Berget, Ingunn; Oliveira, Denize; Christensen, Kasper; Vidal, Leticia; Naes, Tormod; Ares, Gastón
2017-09-01
Projective mapping (PM), one of the most holistic product profiling methods in its approach, is increasingly being used to uncover consumers' perception of products and packages. Assessors rely on a process of synthesis when evaluating product information, which determines the relative importance of the perceived characteristics they use for mapping. Individual differences are expected, as participants are not instructed on which characteristics to consider when evaluating the degree of difference among samples, generating different perceptual spaces. Individual differences in cognitive style can affect synthesis processes and thus the perception of similarities and differences among samples. In this study, the influence of cognitive style on the results of PM was explored. Two consumer studies were performed, one aimed at describing intrinsic sensory characteristics of chocolate-flavoured milk and the other looking into extrinsic characteristics (package only) of blueberry yogurts. Consumers completed the wholistic-analytic module of the extended Verbal Imagery Cognitive Styles Test & Extended Cognitive Style Analysis-Wholistic Analytic Test to characterize their cognitive style. Differences between wholistic and analytic consumers in how they evaluated samples using projective mapping were found in both studies. Analytic consumers separated the samples more in the PM perceptual space than wholistic consumers, showing greater discriminating ability. This may come from a deeper analysis of the samples, from both intrinsic and extrinsic points of view. From a sensory perspective (intrinsic), analytic consumers relied on more sensory characteristics, while wholistic consumers mainly discriminated samples according to sweetness and bitterness/chocolate flavour. In the extrinsic study, however, even if analytic consumers discriminated more between packs, they described the products using similar words in the descriptive step.
One important recommendation coming from this study is the need to consider higher dimensions in the interpretation of projective mapping tasks, as the first dimensions could underestimate the complexity of the perceptual space; currently, most applications of PM consider only two dimensions, which may not uncover the perception of specific groups of consumers.
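The recommendation to look beyond the first two dimensions can be sketched with an MFA-style joint analysis of the per-consumer PM coordinate sheets, inspecting the variance carried by each dimension. The simulated coordinates and the normalization choice (dividing each consumer's block by its first singular value, as in Multiple Factor Analysis) are illustrative assumptions, not the exact procedure used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_consumers = 8, 5

# Simulated PM data: each consumer places the 8 samples on a 2-D sheet,
# giving an (8 x 2) coordinate block per consumer (values are synthetic).
blocks = [rng.normal(size=(n_samples, 2)) for _ in range(n_consumers)]

# MFA-style preprocessing: center each block and divide by its first
# singular value so no single consumer dominates the joint analysis.
scaled = []
for B in blocks:
    B = B - B.mean(axis=0)
    s1 = np.linalg.svd(B, compute_uv=False)[0]
    scaled.append(B / s1)
X = np.hstack(scaled)  # (8 x 10) joint table

# Global perceptual space via SVD; inspect variance beyond dimension 2.
U, s, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
explained = s**2 / np.sum(s**2)
print(np.round(explained, 3))  # share of variance per dimension
```

If the third and later entries of `explained` are non-negligible, restricting interpretation to the first two dimensions risks hiding the perceptual structure of some consumer groups, which is precisely the concern raised above.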
Integrated Data & Analysis in Support of Informed and Transparent Decision Making
NASA Astrophysics Data System (ADS)
Guivetchi, K.
2012-12-01
The California Water Plan includes a framework for improving water reliability, environmental stewardship, and economic stability through two initiatives - integrated regional water management, to make better use of local water sources by integrating multiple aspects of managing water and related resources; and maintaining and improving statewide water management systems. The Water Plan promotes a common approach to data standards; to understanding, evaluating, and improving regional and statewide water management systems; and to evaluating and selecting among alternative management strategies and projects. The California Water Plan acknowledges that planning for the future is uncertain and that change will continue to occur. It is not possible to know for certain how population growth, land use decisions, water demand patterns, environmental conditions, the climate, and many other factors that affect water use and supply may change by 2050. To anticipate change, our approach to water management and planning for the future needs to consider and quantify uncertainty, risk, and sustainability. There is a critical need for information sharing and information management to support over-arching and long-term water policy decisions that cross-cut multiple programs across many organizations and provide a common and transparent understanding of water problems and solutions. Achieving integrated water management with multiple benefits requires a transparent description of dynamic linkages between water supply, flood management, water quality, land use, environmental water, and many other factors. Water Plan Update 2013 will include an analytical roadmap for improving data, analytical tools, and decision-support to advance integrated water management at statewide and regional scales.
It will include recommendations for linking collaborative processes with technical enhancements, providing effective analytical tools, and improving and sharing data and information. Specifically, this includes achieving better integration and consistency with other planning activities; obtaining consensus on quantitative deliverables; building a common conceptual understanding of the water management system; developing common schematics of the water management system; establishing modeling protocols and standards; and improving transparency and exchange of Water Plan information.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Witwer, K.S.; Dysland, E.J.; Garfield, J.S.
2008-07-01
The GeoMelt® In-Container Vitrification™ (ICV™) process was selected by the U.S. Department of Energy (DOE) in 2004 for further evaluation as the supplemental treatment technology for Hanford's low-activity waste (LAW). Also referred to as "bulk vitrification," this process combines glass-forming minerals, LAW, and chemical amendments; dries the mixture; and then vitrifies the material in a refractory-lined steel container. AMEC Nuclear Ltd. (AMEC) is adapting its GeoMelt ICV™ technology for this application with technical and analytical support from Pacific Northwest National Laboratory (PNNL). The Demonstration Bulk Vitrification Project (DBVS) was initiated to engineer, construct, and operate a full-scale bulk vitrification pilot plant to treat up to 750,000 liters of LAW from Waste Tank 241-S-109 at the DOE Hanford Site. The DBVS project is funded by the DOE Office of River Protection and administered by CH2M HILL Hanford Group, Inc. Since the beginning of the DBVS project in 2004, testing has used laboratory, crucible-scale, and engineering-scale equipment to help establish the process limitations of selected glass formulations and to identify operational issues. Full-scale testing has provided critical design verification of the ICV™ process before operation of the Hanford pilot plant. In 2007, the project's fifth full-scale test, FS-38D (also known as the Integrated Dryer Melter Test, or IDMT), was performed. This test had three primary objectives: (1) demonstrate the simultaneous, integrated operation of the ICV™ melter with a 10,000-liter dryer; (2) demonstrate the effectiveness of a new feed reformulation and a change in process methodology for reducing the production and migration of molten ionic salts (MIS); and (3) demonstrate that an acceptable glass product is produced under these conditions. Testing was performed from August 8 to 17, 2007.
Process and analytical results demonstrated that the primary test objectives, along with a dozen supporting objectives, were successfully met. Glass performance exceeded all disposal performance criteria. A previous issue with MIS containment was successfully resolved in FS-38D, and the ICV™ melter was integrated with a full-scale, 10,000-liter dryer. This paper describes the rationale for performing the test, the purpose and outcome of scale-up tests preceding it, and the performance and outcome of FS-38D. (authors)
Desselle, Mathilde R.
2017-01-01
During a week-long celebration of science, run under the federally supported National Science Week umbrella, the Catch a Rising Star: women in Queensland research (CaRS) programme flew scientists who identify as women to nine regional and remote communities in the Australian State of Queensland. The aim of the project was twofold: first, to bring science to remote and regional communities in a large, economically diverse state; and second, to determine whether media and public engagement provides career advancement opportunities for women scientists. This paper focuses on the latter goal. The data show: (i) a substantial majority (greater than 80%) of researchers thought the training and experience provided by the programme would help develop their careers as research scientists in the future, (ii) the majority (65%) thought the programme would help relate their research to end users, industry partners or stakeholders in the future, and (iii) analytics can help create a compelling narrative around engagement metrics and help to quantify influence. During the week-long project, scientists reached 600 000 impressions on one social media platform (Twitter) using a program hashtag. The breadth and depth of the project outcomes indicate funding bodies and employers could use similar data as an informative source of metrics to support hiring and promotion decisions. Although this project focused on researchers who identify as women, the lessons learned are applicable to researchers representing a diverse range of backgrounds. Future surveys will help determine whether the CaRS programme provided long-term career advantages to participating scientists and communities. PMID:29134069
Perspectives on bioanalytical mass spectrometry and automation in drug discovery.
Janiszewski, John S; Liston, Theodore E; Cole, Mark J
2008-11-01
The use of high speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process from early stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree. This includes both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skillsets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.
RUPTURES IN THE ANALYTIC SETTING AND DISTURBANCES IN THE TRANSFORMATIONAL FIELD OF DREAMS.
Brown, Lawrence J
2015-10-01
This paper explores some implications of Bleger's (1967, 2013) concept of the analytic situation, which he views as comprising the analytic setting and the analytic process. The author discusses Bleger's idea of the analytic setting as the depositary for projected painful aspects of either the analyst or the patient or both; these affects are then rendered as nonprocess. In contrast, the contents of the analytic process are subject to an incessant process of transformation (Green 2005). The author goes on to enumerate various components of the analytic setting: the nonhuman, the object-relational, and the analyst's "person" (including mental functioning). An extended clinical vignette is offered as an illustration. © 2015 The Psychoanalytic Quarterly, Inc.
LOVE CANAL MONITORING PROGRAM. VOLUME 1
This report summarizes the prime contractor activities during the monitoring phase of the Love Canal project. Since GCA Corporation was only responsible for data collection, no analytical results appear in this report. The program involved a multifaceted sampling and analytical e...
Knowledge Transfer among Projects Using a Learn-Forget Model
ERIC Educational Resources Information Center
Tukel, Oya I.; Rom, Walter O.; Kremic, Tibor
2008-01-01
Purpose: The purpose of this paper is to analyze the impact of learning in a project-driven organization and demonstrate analytically how the learning, which takes place during the execution of successive projects, and the forgetting that takes place during the dormant time between the project executions, can impact performance and productivity in…
NASA Technical Reports Server (NTRS)
Holman, Gordon
2010-01-01
Accelerated electrons play an important role in the energetics of solar flares. Understanding the process or processes that accelerate these electrons to high, nonthermal energies also depends on understanding the evolution of these electrons between the acceleration region and the region where they are observed through their hard X-ray or radio emission. Energy losses in the co-spatial electric field that drives the current-neutralizing return current can flatten the electron distribution toward low energies. This in turn flattens the corresponding bremsstrahlung hard X-ray spectrum toward low energies. The lost electron beam energy also enhances heating in the coronal part of the flare loop. Extending earlier work by Knight & Sturrock (1977), Emslie (1980), Diakonov & Somov (1988), and Litvinenko & Somov (1991), I have derived analytical and semi-analytical results for the nonthermal electron distribution function and the self-consistent electric field strength in the presence of a steady-state return current. I review these results, presented previously at the 2009 SPD Meeting in Boulder, CO, and compare them, along with computed X-ray spectra, with numerical results obtained by Zharkova & Gordovskii (2005, 2006). The physical significance of similarities and differences in the results will be emphasized. This work is supported by NASA's Heliophysics Guest Investigator Program and the RHESSI Project.
Determinants of project success
NASA Technical Reports Server (NTRS)
Murphy, D. C.; Baker, B. N.; Fisher, D.
1974-01-01
The interactions of numerous project characteristics, with particular reference to project performance, were studied. Determinants of success are identified along with the accompanying implications for client organization, parent organization, project organization, and future research. Variables are selected which are found to have the greatest impact on project outcome, and the methodology and analytic techniques to be employed in identification of those variables are discussed.
Space Launch System Vibration Analysis Support
NASA Technical Reports Server (NTRS)
Johnson, Katie
2016-01-01
The ultimate goal of my efforts during this internship was to help prepare for the Space Launch System (SLS) integrated modal test (IMT) with Rodney Rocha. In 2018, the Structural Engineering Loads and Dynamics Team will have 10 days to perform the IMT on the SLS Integrated Launch Vehicle. After that 10-day period, we will have about two months to analyze the test data and determine whether the integrated vehicle modes/frequencies are adequate for launching the vehicle. Because of the time constraints, NASA must have the newly developed post-test analysis methods well proven, with technical confidence, before testing. NASA civil servants, with help from rotational interns, are working with novel techniques developed and applied externally to Johnson Space Center (JSC) to uncover issues in applying these techniques at much larger scales than ever before. We intend to use modal decoupling methods to separate the entangled vibrations coming from the SLS and its support structure during the IMT. This new approach is still under development. The primary goal of my internship was to learn the basics of structural dynamics and physical vibrations. I was able to accomplish this by working on two experimental test setups, the Simple Beam and TAURUS-T, and by doing some light analytical and post-processing work. Within the Simple Beam project, my role involved changing the data acquisition system, reconfiguring the test setup, calibrating transducers, collecting data, recovering data files, and post-processing analysis. Within the TAURUS-T project, my duties included cataloging and removing the 30+ triaxial accelerometers; coordinating the removal of the structure from the current rolling cart to a sturdy billet for further testing; preparing the accelerometers for remounting; accurately calibrating, mounting, and mapping all accelerometer channels; and some testing. Hammer and shaker tests will be performed to easily visualize mode shapes at low frequencies.
Short analytical projects using MATLAB were also assigned to aid in research efforts. These included integration of acceleration data for comparison to measured displacement data. Laplace and Fourier transforms were also investigated to determine their viability as a method of modal decoupling. In addition to these projects, I was also able to contribute work that would benefit future interns and the division as a whole. I gave a short presentation and answered questions to aid in the recruitment of subsequent interns and co-ops for the division. I also assisted in revisions and additions to the Intern/Co-Op Handbook to provide incoming employees with background information on the organization they are about to work for. I further developed a tutorial on the Pulse software, which was used for data acquisition in both experiments and will be helpful to interns and engineers who may be unfamiliar with the software. I gained a diverse range of experience throughout my internship. I was introduced to advanced dynamics and analytical techniques through new experience with both hands-on experimentation and analytical post-processing methods. I was exposed to the benefits of interdepartmental collaboration and developed stronger time-management skills by coordinating two different tests at once. This internship provided an excellent opportunity to see how engineering theories apply to real-life scenarios, and an introduction to how NASA/JSC solves technical problems.
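The acceleration-integration exercise described above can be sketched in code. The following is a minimal illustration only (the signal, step size, and function names are invented, not taken from the internship work): acceleration samples are integrated twice with the trapezoidal rule and compared against a known displacement.

```python
import math

def cumtrapz(y, dt, y0=0.0):
    """Cumulative trapezoidal integral of evenly sampled values y with step dt."""
    out = [y0]
    for i in range(1, len(y)):
        out.append(out[-1] + 0.5 * (y[i - 1] + y[i]) * dt)
    return out

# Synthetic test signal: x(t) = A*sin(w*t), so a(t) = -A*w^2*sin(w*t).
A, w, dt = 1.0, 2.0 * math.pi, 0.001
t = [i * dt for i in range(1001)]
accel = [-A * w * w * math.sin(w * ti) for ti in t]

vel = cumtrapz(accel, dt, y0=A * w)   # known initial velocity A*w
disp = cumtrapz(vel, dt, y0=0.0)      # integrated displacement

measured = [A * math.sin(w * ti) for ti in t]
max_err = max(abs(d - m) for d, m in zip(disp, measured))
print(f"max |integrated - measured| = {max_err:.2e}")
```

In practice, measured acceleration contains bias and noise, so double integration drifts; detrending or high-pass filtering is usually needed before the comparison is meaningful.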
Sample Preparation of Corn Seed Tissue to Prevent Analyte Relocations for Mass Spectrometry Imaging
NASA Astrophysics Data System (ADS)
Kim, Shin Hye; Kim, Jeongkwon; Lee, Young Jin; Lee, Tae Geol; Yoon, Sohee
2017-08-01
Corn seed tissue sections were prepared by the tape-support method using an adhesive tape, and mass spectrometry imaging (MSI) was performed. The effect of heat generated during sample preparation was investigated by time-of-flight secondary ion mass spectrometry (TOF-SIMS) imaging of corn seed tissue prepared by the tape-support and thaw-mounted methods. Unlike thaw-mounted sample preparation, the tape-support method involves no heat and therefore avoids image distortion caused by migration of analytes on the sample. By applying the tape-support method, the corn seed tissue was prepared without structural damage, and MSI with accurate spatial information of the analytes was successfully performed.
100-B/C Target Analyte List Development for Soil
DOE Office of Scientific and Technical Information (OSTI.GOV)
R.W. Ovink
2010-03-18
This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.
Code of Federal Regulations, 2010 CFR
2010-07-01
Section 102-80.120: What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life Safety), should be used.
Code of Federal Regulations, 2011 CFR
2011-01-01
Section 102-80.120: What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life Safety), should be used.
Student use of Web 2.0 tools to support argumentation in a high school science classroom
NASA Astrophysics Data System (ADS)
Weible, Jennifer L.
This ethnographic study is an investigation into how two classes of chemistry students (n=35) from a low-income high school with a one-to-one laptop initiative used Web 2.0 tools to support participation in the science practice of argumentation (i.e., sensemaking, articulating understandings, and persuading an audience) during a unit on alternative energy. The science curriculum utilized the Technology-Enhanced Inquiry Tools for Science Education as a pedagogical framework (Kim, Hannafin, & Bryan, 2007). Video recordings of the classroom work, small group discussions, and focus group interviews, documents, screen shots, wiki evidence, and student produced multi-media artifacts were the data analyzed for this study. Open and focused coding techniques, counts of social tags and wiki moves, and interpretive analyses were used to find patterns in the data. The study found that the tools of social bookmarking, wiki, and persuasive multimedia artifacts supported participation in argumentation. In addition, students utilized the affordances of the technologies in multiple ways to communicate, collaborate, manage the work of others, and efficiently complete their science project. This study also found that technologically enhanced science curriculum can bridge students' everyday and scientific understandings of making meaning, articulating understandings, and persuading others of their point of view. As a result, implications from this work include a set of design principles for science inquiry learning that utilize technology. This study suggests new consideration of analytical methodology that blends wiki data analytics and video data. It also suggests that utilizing technology as a bridging strategy serves two roles within classrooms: (a) deepening students' understanding of alternative energy science content and (b) supporting students as they learn to participate in the practices of argumentation.
USGS Blind Sample Project: monitoring and evaluating laboratory analytical quality
Ludtke, Amy S.; Woodworth, Mark T.
1997-01-01
The U.S. Geological Survey (USGS) collects and disseminates information about the Nation's water resources. Surface- and ground-water samples are collected and sent to USGS laboratories for chemical analyses. The laboratories identify and quantify the constituents in the water samples. Random and systematic errors occur during sample handling, chemical analysis, and data processing. Although all errors cannot be eliminated from measurements, the magnitude of their uncertainty can be estimated and tracked over time. Since 1981, the USGS has operated an independent, external, quality-assurance project called the Blind Sample Project (BSP). The purpose of the BSP is to monitor and evaluate the quality of laboratory analytical results through the use of double-blind quality-control (QC) samples. The information provided by the BSP assists the laboratories in detecting and correcting problems in the analytical procedures. The information also can aid laboratory users in estimating the extent that laboratory errors contribute to the overall errors in their environmental data.
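The double-blind QC idea described above can be illustrated with a short sketch. This is a hypothetical example, not the USGS BSP procedure: reported concentrations for a blind reference sample are converted to z-scores against the sample's known most-probable value (MPV) and flagged when they fall outside a ±2-sigma control limit; all numbers are made up.

```python
def evaluate_blind_results(results, mpv, sigma):
    """Return (z_scores, flags) for reported concentrations of a blind sample.

    results -- concentrations reported by the laboratory
    mpv     -- most-probable value assigned to the reference sample
    sigma   -- expected interlaboratory standard deviation
    """
    z = [(r - mpv) / sigma for r in results]
    flags = [abs(zi) > 2.0 for zi in z]  # outside the +/-2-sigma control limit
    return z, flags

# Invented example: chloride results (mg/L) for a blind sample with
# MPV = 25.0 mg/L and sigma = 1.2 mg/L.
reported = [24.6, 25.9, 25.1, 28.3, 24.2]
z, flags = evaluate_blind_results(reported, mpv=25.0, sigma=1.2)
print(sum(flags), "of", len(flags), "results outside control limits")
```

Tracking such z-scores over time (for example, on a control chart) is one common way a program like the BSP could surface systematic drift as well as isolated blunders.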
NASA Astrophysics Data System (ADS)
Zakharova, Alexandra A.; Kolegova, Olga A.; Nekrasova, Maria E.
2016-04-01
The paper deals with issues in program management for engineering innovative products. Existing project management tools were analyzed. The aim is to develop a decision support system that takes into account the features of program management for high-tech products: research intensity, a high level of technical risk, unpredictable results due to the impact of various external factors, and the involvement of several implementing agencies. The need for involving experts and for using intelligent information-processing techniques is demonstrated. A conceptual model of a common information space to support communication among members of the collaboration on high-tech programs has been developed. The structure and objectives of the information analysis system “Geokhod” were formulated to implement the conceptual model of the common information space in the program “Development and production of a new class of mining equipment - ‘Geokhod’”.
A Multidimensional Data Warehouse for Community Health Centers
Kunjan, Kislaya; Toscos, Tammy; Turkcan, Ayten; Doebbeling, Brad N.
2015-01-01
Community health centers (CHCs) play a pivotal role in healthcare delivery to vulnerable populations, but have not yet benefited from a data warehouse that can support improvements in clinical and financial outcomes across the practice. We have developed a multidimensional clinic data warehouse (CDW) by working with 7 CHCs across the state of Indiana and integrating their operational, financial and electronic patient records to support ongoing delivery of care. We describe in detail the rationale for the project, the data architecture employed, the content of the data warehouse, along with a description of the challenges experienced and strategies used in the development of this repository that may help other researchers, managers and leaders in health informatics. The resulting multidimensional data warehouse is highly practical and is designed to provide a foundation for wide-ranging healthcare data analytics over time and across the community health research enterprise. PMID:26958297
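A "multidimensional" warehouse of the kind described is conventionally organized as a star schema: a fact table of events joined to dimension tables that supply the axes of analysis. The toy sketch below (table and column names are invented, not from the paper) uses SQLite to show the pattern with an encounter fact table and clinic/date dimensions.

```python
import sqlite3

# Build a minimal star schema in an in-memory database.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_clinic (clinic_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fact_encounter (
        clinic_id INTEGER REFERENCES dim_clinic(clinic_id),
        date_id   INTEGER REFERENCES dim_date(date_id),
        charge    REAL
    );
""")
con.executemany("INSERT INTO dim_clinic VALUES (?, ?)",
                [(1, "Clinic A"), (2, "Clinic B")])
con.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(20140101, 2014), (20150101, 2015)])
con.executemany("INSERT INTO fact_encounter VALUES (?, ?, ?)",
                [(1, 20140101, 120.0), (1, 20150101, 80.0),
                 (2, 20150101, 200.0)])

# A typical slice across dimensions: total charges per clinic per year.
rows = con.execute("""
    SELECT c.name, d.year, SUM(f.charge)
    FROM fact_encounter f
    JOIN dim_clinic c USING (clinic_id)
    JOIN dim_date d USING (date_id)
    GROUP BY c.name, d.year
    ORDER BY c.name, d.year
""").fetchall()
print(rows)
```

The value of the dimensional layout is that the same fact table supports many such group-by "slices" (by clinic, payer, diagnosis, period) without restructuring the data.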
Eggert, Corinne; Moselle, Kenneth; Protti, Denis; Sanders, Dale
2017-01-01
Closed Loop Analytics© is receiving growing interest in healthcare as a term referring to information technology, local data, and clinical analytics working together to generate evidence for improvement. The Closed Loop Analytics model consists of three loops corresponding to the decision-making levels of an organization and the associated data within each loop: Patients, Protocols, and Populations. The authors propose that each of these levels should utilize the same ecosystem of electronic health record (EHR) and enterprise data warehouse (EDW) enabled data, in a closed-loop fashion, with that data being repackaged and delivered to suit the analytic and decision support needs of each level, in support of better outcomes.
Diagnosis, referral, and rehabilitation within the Fairfax Alcohol Safety Action Project, 1974.
DOT National Transportation Integrated Search
1975-01-01
This report is a combination of Analytic Study #5 (Diagnosis and Referral) and Analytic Study #6 (Rehabilitation). Data concerning these countermeasures are presented together because of their very close relationship within the Fairfax ASAP. Both the...
Project Summary. ANALYTICAL ELEMENT MODELING OF COASTAL AQUIFERS
Four topics were studied concerning the modeling of groundwater flow in coastal aquifers with analytic elements: (1) practical experience was obtained by constructing a groundwater model of the shallow aquifers below the Delmarva Peninsula USA using the commercial program MVAEM; ...
MEETING DATA QUALITY OBJECTIVES WITH INTERVAL INFORMATION
Immunoassay test kits are promising technologies for measuring analytes under field conditions. Frequently, these field-test kits report the analyte concentrations as falling in an interval between minimum and maximum values. Many project managers use field-test kits only for scr...
NASA Technical Reports Server (NTRS)
Edwards, Daryl A.
2008-01-01
Preparing NASA's Plum Brook Station's Spacecraft Propulsion Research Facility (B-2) to support NASA's new generation of launch vehicles has raised many challenges for B-2's support staff. The facility provides a unique capability to test chemical propulsion systems/vehicles while simulating space thermal and vacuum environments. Designed and constructed in the early 1960s to support upper stage cryogenic engine/vehicle system development, the Plum Brook Station B-2 facility will require modifications to support the larger, more powerful, and more advanced engine systems for the next generation of vehicles leaving earth's orbit. Engine design improvements over the years have included large area expansion ratio nozzles, greater combustion chamber pressures, and advanced materials. Consequently, it has become necessary to determine what facility changes are required and how the facility can be adapted to support varying customers and their specific test needs. Exhaust system performance, including understanding the present facility capabilities, is the primary focus of this work. A variety of approaches and analytical tools are being employed to gain this understanding. This presentation discusses some of the challenges in applying these tools to this project and expected facility configuration to support the varying customer needs.
NASA Technical Reports Server (NTRS)
Edwards, Daryl A.
2007-01-01
Preparing NASA's Plum Brook Station's Spacecraft Propulsion Research Facility (B-2) to support NASA's new generation of launch vehicles has raised many challenges for B-2's support staff. The facility provides a unique capability to test chemical propulsion systems/vehicles while simulating space thermal and vacuum environments. Designed and constructed 4 decades ago to support upper stage cryogenic engine/vehicle system development, the Plum Brook Station B-2 facility will require modifications to support the larger, more powerful, and more advanced engine systems for the next generation of vehicles leaving earth's orbit. Engine design improvements over the years have included large area expansion ratio nozzles, greater combustion chamber pressures, and advanced materials. Consequently, it has become necessary to determine what facility changes are required and how the facility can be adapted to support varying customers and their specific test needs. Instrumental in this task is understanding the present facility capabilities and identifying what reasonable changes can be implemented. A variety of approaches and analytical tools are being employed to gain this understanding. This paper discusses some of the challenges in applying these tools to this project and the expected facility configuration to support the varying customer needs.
Richter, Janine; Fettig, Ina; Philipp, Rosemarie; Jakubowski, Norbert
2015-07-01
Tributyltin is listed as one of the priority substances in the European Water Framework Directive (WFD). Despite its decreasing input in the environment, it is still present and has to be monitored. In the European Metrology Research Programme project ENV08, a sensitive and reliable analytical method according to the WFD was developed to quantify this environmental pollutant at a very low limit of quantification. With the development of such a primary reference method for tributyltin, the project helped to improve the quality and comparability of monitoring data. An overview of project aims and potential analytical tools is given.
International multi-site survey on the use of online support groups in bipolar disorder.
Bauer, Rita; Conell, Jörn; Glenn, Tasha; Alda, Martin; Ardau, Raffaella; Baune, Bernhard T; Berk, Michael; Bersudsky, Yuly; Bilderbeck, Amy; Bocchetta, Alberto; Bossini, Letizia; Castro, Angela M Paredes; Cheung, Eric Y W; Chillotti, Caterina; Choppin, Sabine; Zompo, Maria Del; Dias, Rodrigo; Dodd, Seetal; Duffy, Anne; Etain, Bruno; Fagiolini, Andrea; Hernandez, Miryam Fernández; Garnham, Julie; Geddes, John; Gildebro, Jonas; Gonzalez-Pinto, Ana; Goodwin, Guy M; Grof, Paul; Harima, Hirohiko; Hassel, Stefanie; Henry, Chantal; Hidalgo-Mazzei, Diego; Kapur, Vaisnvy; Kunigiri, Girish; Lafer, Beny; Larsen, Erik R; Lewitzka, Ute; Licht, Rasmus W; Hvenegaard Lund, Anne; Misiak, Blazej; Piotrowski, Patryk; Monteith, Scott; Munoz, Rodrigo; Nakanotani, Takako; Nielsen, René E; O'donovan, Claire; Okamura, Yasushi; Osher, Yamima; Reif, Andreas; Ritter, Philipp; Rybakowski, Janusz K; Sagduyu, Kemal; Sawchuk, Brett; Schwartz, Elon; Scippa, Ângela M; Slaney, Claire; Sulaiman, Ahmad H; Suominen, Kirsi; Suwalska, Aleksandra; Tam, Peter; Tatebayashi, Yoshitaka; Tondo, Leonardo; Vieta, Eduard; Vinberg, Maj; Viswanath, Biju; Volkert, Julia; Zetin, Mark; Whybrow, Peter C; Bauer, Michael
2017-08-01
Peer support is an established component of recovery from bipolar disorder, and online support groups may offer opportunities to expand the use of peer support at the patient's convenience. Prior research in bipolar disorder has reported value from online support groups. This study aimed to understand the use of online support groups by patients with bipolar disorder, as part of a larger project about information seeking. The results are based on a one-time, paper-based anonymous survey about information seeking by patients with bipolar disorder, which was translated into 12 languages. The survey was completed between March 2014 and January 2016 and included questions on the use of online support groups. All patients were diagnosed by a psychiatrist. Analysis included descriptive statistics and generalized estimating equations to account for correlated data. The survey was completed by 1222 patients in 17 countries. The patients used the Internet at a percentage similar to that of the general public. Of the Internet users who looked online for information about bipolar disorder, only 21.0% read or participated in support groups, chats, or forums for bipolar disorder (12.8% of the total sample). Given the benefits reported in prior research, clarification of the role of online support groups in bipolar disorder is needed. With only a minority of patients using online support groups, there are analytical challenges for future studies.
Simulated 'On-Line' Wear Metal Analysis of Lubricating Oils by X-Ray Fluorescence Spectroscopy
NASA Technical Reports Server (NTRS)
Kelliher, Warren C.; Partos, Richard D.; Nelson, Irina
1996-01-01
The objective of this project was to assess the sensitivity of X-ray Fluorescence Spectroscopy (XFS) for quantitative evaluation of metal particle content in engine oil suspensions and the feasibility of real-time, dynamic wear metal analysis. The study was focused on iron as the majority wear metal component. Variable parameters were particle size, particle concentration, and oil velocity. A commercial XFS spectrometer equipped with interchangeable static/dynamic (flow cell) sample chambers was used. XFS spectra were recorded for solutions of an Fe-organometallic standard and for a series of DTE oil suspensions of high-purity spherical iron particles of 2 µm, 4 µm, and 8 µm diameter, at concentrations from 5 ppm to 5,000 ppm. Real contaminated oil samples from Langley Air Force Base aircraft engines and NASA Langley Research Center wind tunnels were also analyzed. The experimental data confirm the reliability of XFS as the analytical method of choice for this project. Intrinsic inadequacies of the instrument for precise analytic work at low metal concentrations were identified as being related to the particular x-ray beam definition, system geometry, and flow-cell materials selection. This work supports a proposal for the design, construction, and testing of a conceptually new, miniature XFS spectrometer with superior performance, dedicated to on-line, real-time monitoring of lubricating oils in operating engines. Innovative design solutions include focalization of the incident x-ray beam, a non-metal sample chamber, and miniaturization of the overall assembly. The instrument would contribute to the prevention of catastrophic engine failures. A proposal for two-year funding has been presented to NASA Langley Research Center Internal Operation Group (IOG) Management, to continue the effort begun by this summer's project.
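Quantitation in a study like this one typically rests on a linear calibration of fluorescence intensity against standards of known concentration, which is then inverted for unknowns. The sketch below is illustrative only (the counts and concentrations are invented, not data from the project): an ordinary least-squares line is fitted to calibration standards and used to estimate an unknown Fe concentration.

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Made-up calibration standards: Fe concentration (ppm) vs. count rate.
conc = [5.0, 50.0, 500.0, 5000.0]
counts = [12.0, 105.0, 1010.0, 10020.0]
a, b = fit_line(conc, counts)

# Invert the calibration for an unknown sample's count rate.
unknown_counts = 2000.0
est_ppm = (unknown_counts - b) / a
print(f"estimated Fe: {est_ppm:.0f} ppm")
```

A real XRF calibration would also handle matrix effects and the low-concentration nonlinearity the abstract notes; a single straight line is only the simplest case.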
Cretini, Kari F.; Visser, Jenneke M.; Krauss, Ken W.; Steyer, Gregory D.
2011-01-01
This document identifies the main objectives of the Coastwide Reference Monitoring System (CRMS) vegetation analytical team, which are to provide (1) collection and development methods for vegetation response variables and (2) the ways in which these response variables will be used to evaluate restoration project effectiveness. The vegetation parameters (that is, response variables) collected in CRMS and other coastal restoration projects funded under the Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA) are identified, and the field collection methods for these parameters are summarized. Existing knowledge on community and plant responses to changes in environmental drivers (for example, flooding and salinity) from published literature and from the CRMS and CWPPRA monitoring dataset is used to develop a suite of indices to assess wetland condition in coastal Louisiana. Two indices, the floristic quality index (FQI) and a productivity index, are described for herbaceous and forested vegetation. The FQI for herbaceous vegetation is tested with a long-term dataset from a CWPPRA marsh creation project. Example graphics for this index are provided and discussed. The other indices, an FQI for forest vegetation (that is, trees and shrubs) and productivity indices for herbaceous and forest vegetation, are proposed but not tested. New response variables may be added or current response variables removed as data become available and as our understanding of restoration success indicators develops. Once indices are fully developed, each will be used by the vegetation analytical team to assess and evaluate CRMS/CWPPRA project and program effectiveness. The vegetation analytical team plans to summarize its results in the form of written reports and/or graphics and present these items to CRMS Federal and State sponsors, restoration project managers, landowners, and other data users for their input.
NASA Astrophysics Data System (ADS)
Ryan, J. G.
2012-12-01
Bringing the use of cutting-edge research tools into student classroom experiences has long been a popular educational strategy in the geosciences and other STEM disciplines. The NSF CCLI and TUES programs have funded a large number of projects that placed research-grade instrumentation at educational institutions for instructional use and use in supporting undergraduate research activities. While student and faculty response to these activities has largely been positive, a range of challenges exist related to their educational effectiveness. Many of the obstacles these approaches have faced relate to "scaling up" of research mentoring experiences (e.g., providing training and time for use for an entire classroom of students, as opposed to one or two), and to time tradeoffs associated with providing technical training for effective instrument use versus course content coverage. The biggest challenge has often been simple logistics: a single instrument, housed in a different space, is difficult to integrate effectively into instructional activities. My CCLI-funded project sought primarily to knock down the logistical obstacles to research instrument use by taking advantage of remote instrument operation technologies, which allow the in-classroom use of networked analytical tools. Remote use of electron microprobe and SEM instruments of the Florida Center for Analytical Electron Microscopy (FCAEM) in Miami, FL was integrated into two geoscience courses at USF in Tampa, FL. Remote operation permitted the development of whole-class laboratory exercises to familiarize students with the tools, their function, and their capabilities; and it allowed students to collect high-quality chemical and image data on their own prepared samples in the classroom during laboratory periods. 
These activities improve student engagement in the course, appear to improve learning of key concepts in mineralogy and petrology, and have led to students pursuing independent research projects, as well as requesting additional Geology elective courses offering similar kinds of experiences. I have sustained these activities post-project via student lab fees to pay for in-class microprobe time.
Brack, Werner; Altenburger, Rolf; Schüürmann, Gerrit; Krauss, Martin; López Herráez, David; van Gils, Jos; Slobodnik, Jaroslav; Munthe, John; Gawlik, Bernd Manfred; van Wezel, Annemarie; Schriks, Merijn; Hollender, Juliane; Tollefsen, Knut Erik; Mekenyan, Ovanes; Dimitrov, Saby; Bunke, Dirk; Cousins, Ian; Posthuma, Leo; van den Brink, Paul J; López de Alda, Miren; Barceló, Damià; Faust, Michael; Kortenkamp, Andreas; Scrimshaw, Mark; Ignatova, Svetlana; Engelen, Guy; Massmann, Gudrun; Lemkine, Gregory; Teodorovic, Ivana; Walz, Karl-Heinz; Dulio, Valeria; Jonker, Michiel T O; Jäger, Felix; Chipman, Kevin; Falciani, Francesco; Liska, Igor; Rooke, David; Zhang, Xiaowei; Hollert, Henner; Vrana, Branislav; Hilscherova, Klara; Kramer, Kees; Neumann, Steffen; Hammerbacher, Ruth; Backhaus, Thomas; Mack, Juliane; Segner, Helmut; Escher, Beate; de Aragão Umbuzeiro, Gisela
2015-01-15
SOLUTIONS (2013 to 2018) is a European Union Seventh Framework Programme Project (EU-FP7). The project aims to deliver a conceptual framework to support the evidence-based development of environmental policies with regard to water quality. SOLUTIONS will develop the tools for the identification, prioritisation and assessment of those water contaminants that may pose a risk to ecosystems and human health. To this end, a new generation of chemical and effect-based monitoring tools is developed and integrated with a full set of exposure, effect and risk assessment models. SOLUTIONS attempts to address legacy, present and future contamination by integrating monitoring- and modelling-based approaches with scenarios on future developments in society, economy and technology, and thus in contamination. The project follows a solutions-oriented approach by addressing major problems of water and chemicals management and by assessing abatement options. SOLUTIONS takes advantage of access to the infrastructure necessary to investigate the large basins of the Danube and Rhine as well as relevant Mediterranean basins as case studies, and puts major effort into stakeholder dialogue and support. In particular, the EU Water Framework Directive (WFD) Common Implementation Strategy (CIS) working groups, International River Commissions, and water works associations are directly supported with consistent guidance for the early detection, identification, prioritisation, and abatement of chemicals in the water cycle. SOLUTIONS will place specific emphasis on concepts and tools for the impact and risk assessment of complex mixtures of emerging pollutants, their metabolites and transformation products. Analytical and effect-based screening tools will be applied together with ecological assessment tools for the identification of toxicants and their impacts.
The SOLUTIONS approach is expected to provide transparent and evidence-based candidates for River Basin Specific Pollutants in the case study basins and to assist future review of priority pollutants under the WFD as well as potential abatement options. Copyright © 2014 Elsevier B.V. All rights reserved.
Analysis and testing of a bridge deck reinforced with GFRP rebars : final report, April 3, 2007.
DOT National Transportation Integrated Search
2007-04-03
The present project had two main objectives: to experimentally and analytically investigate a bridge deck reinforced with glass fiber reinforced polymer rebars, and to perform durability tests on four rebar types. An analytical investigation was ...
Testik, Özlem Müge; Shaygan, Amir; Dasdemir, Erdi; Soydan, Guray
It is often vital to identify, prioritize, and select quality improvement projects in a hospital, yet a methodology that incorporates the opinions of experts with different points of view is needed for better decision making. The proposed methodology uses the cause-and-effect diagram to identify improvement projects and construct a project hierarchy for a problem. The right improvement projects are then prioritized and selected using an analytic hierarchy process weighting scheme that aggregates experts' opinions. An approach for collecting data from experts and a graphical display for summarizing the obtained information are also provided. The methodology is implemented for improving a hospital appointment system. The two top-ranked major project categories for improvement were system- and accessibility-related causes (45%) and capacity-related causes (28%). For each major project category, subprojects were then ranked to select the improvement needs. The methodology is useful in cases where an aggregate decision based on experts' opinions is expected. Some suggestions for practical implementation are provided.
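The analytic hierarchy process weighting step described above can be sketched as follows. The pairwise comparison values are hypothetical, and the row geometric-mean approximation stands in for the full principal-eigenvector method:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority vector from a pairwise comparison matrix, using the
    row geometric-mean approximation common in AHP practice."""
    A = np.asarray(pairwise, dtype=float)
    gm = A.prod(axis=1) ** (1.0 / A.shape[0])  # row geometric means
    return gm / gm.sum()                        # normalize to sum to 1

# Hypothetical aggregated expert judgments comparing three project
# categories (system/accessibility, capacity, other causes); entry
# A[i][j] > 1 means category i is judged more important than j.
A = [[1.0,   2.0, 3.0],
     [1/2.0, 1.0, 2.0],
     [1/3.0, 1/2.0, 1.0]]
w = ahp_weights(A)  # first category receives the largest weight
```

For real use one would also compute Saaty's consistency ratio to check that the aggregated judgments are not self-contradictory before trusting the ranking.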
HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.
2015-05-01
This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.
Fleet management performance monitoring.
DOT National Transportation Integrated Search
2013-05-01
The principal goal of this project was to enhance and expand the analytical modeling methodology previously developed as part of the Fleet Management Criteria: Disposal Points and Utilization Rates project completed in 2010. The enhanced and ex...
Tennessee long-range transportation plan : project evaluation system
DOT National Transportation Integrated Search
2005-12-01
The Project Evaluation System (PES) Report is an analytical methodology to aid programming efforts and prioritize multimodal investments. The methodology consists of both quantitative and qualitative evaluation criteria built upon the Guiding Princip...
Scattering of massless fermions by Schwarzschild and Reissner-Nordström black holes
NASA Astrophysics Data System (ADS)
Sporea, Ciprian A.
2017-12-01
We study the scattering of massless Dirac fermions by Schwarzschild and Reissner-Nordström black holes. This is done by applying partial wave analysis to the scattering modes obtained after solving the massless Dirac equation in the asymptotic regions of the two black hole geometries. We successfully obtain analytic phase shifts, with the help of which the scattering cross section is computed. The glory and spiral scattering phenomena are shown to be present, as in the case of massive fermion scattering by black holes. Supported by a grant of the Ministry of National Education and Scientific Research, RDI Programme for Space Technology and Advanced Research - STAR, project number 181/20.07.2017
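The partial-wave machinery used above can be illustrated with the standard spinless cross-section formula built from phase shifts; the fermionic black-hole case in the paper involves spin-weighted mode sums, so this is only a structural sketch with made-up phase shifts, not the paper's result:

```python
import numpy as np

def total_cross_section(k, phase_shifts):
    """Total elastic cross section from partial-wave phase shifts:
    sigma = (4*pi/k^2) * sum_l (2l+1) sin^2(delta_l),
    for partial waves l = 0, 1, 2, ... (spinless form)."""
    l = np.arange(len(phase_shifts))
    return (4.0 * np.pi / k**2) * np.sum((2 * l + 1) * np.sin(phase_shifts) ** 2)

# Toy phase shifts that decay with l (illustrative, not from the paper).
sigma = total_cross_section(0.5, np.array([0.8, 0.4, 0.1, 0.02]))
```

The series converges because high-l phase shifts fall off rapidly; glory and spiral scattering show up as interference structure in the differential, rather than total, cross section.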
The Da Vinci European BioBank: A Metabolomics-Driven Infrastructure
Carotenuto, Dario; Luchinat, Claudio; Marcon, Giordana; Rosato, Antonio; Turano, Paola
2015-01-01
We present here the organization of the recently constituted da Vinci European BioBank (daVEB, https://www.davincieuropeanbiobank.org/it). The biobank was created as an infrastructure to support the activities of the Fiorgen Foundation (http://www.fiorgen.net/), a nonprofit organization that promotes research in the field of pharmacogenomics and personalized medicine. The operating procedures concerning samples and data at daVEB largely stem from the strong metabolomics connotation of Fiorgen and from the involvement of the scientific collaborators of the foundation in international/European projects aimed at tackling the standardization of pre-analytical procedures and promoting data standards in metabolomics. PMID:25913579
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quirk, W.J.; Canada, J.; de Vore, L.
This monthly report of research activities at Lawrence Livermore Laboratory highlights three different research programs. First, the Forensic Science Center supports a broad range of analytical techniques that focus on detecting and analyzing chemical, biological, and nuclear species. Analyses are useful in the areas of nonproliferation, counterterrorism, and law enforcement. Second, starting in 1977, the laboratory initiated a series of studies to understand a high incidence of melanoma among employees. Continued study shows that mortality from this disease has decreased from the levels seen in the 1980s. Third, to help coordinate the laboratory's diverse research projects that can provide better healthcare tools to the public, the lab is creating the new Center for Healthcare Technologies.
Planetary and Primitive Object Strength Measurement and Sampling Apparatus
NASA Technical Reports Server (NTRS)
Ahrens, Thomas J.
1995-01-01
Support is requested for continuation of a program of dynamic impact (harpoon) coring of planetary, comet, or asteroid surface materials. We have previously demonstrated that good-quality cores are obtainable for planetary materials with compressive strengths less than 200 MPa. Since the dynamics of penetration are observable on a Discovery-class spacecraft, which images the sampling operation, these data can be used with a model developed under this project to measure the in-situ strength and frictional strength of the crust of the object. During the last year we developed a detailed analytic model of penetrator mechanics. Progress is reported for the solid penetrator experiments, the CIT penetrator model, and the impact spall sampling apparatus.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ragan, Eric D; Goodall, John R
2014-01-01
Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
Fast analytical scatter estimation using graphics processing units.
Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris
2015-01-01
To develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first-order scatter in cone-beam image reconstruction improves the contrast-to-noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and, with further acceleration and a method to account for multiple scatter, may be useful for practical scatter correction schemes.
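A scaled root-mean-square difference metric of the kind used for the quantitative comparison might look like the sketch below. The exact normalization the authors used is not given here, so scaling by the mean of the reference image is an assumption:

```python
import numpy as np

def scaled_rmsd(estimate, reference):
    """Root-mean-square difference between two scatter estimates,
    scaled by the mean of the reference image (one common
    normalization; the paper's exact scaling may differ)."""
    e = np.asarray(estimate, dtype=float)
    r = np.asarray(reference, dtype=float)
    return np.sqrt(np.mean((e - r) ** 2)) / r.mean()

# A uniform +0.5 offset on a reference with mean 5.0 gives 0.5 / 5.0.
mc = np.array([[4.0, 5.0], [6.0, 5.0]])   # stand-in Monte Carlo image
d = scaled_rmsd(mc + 0.5, mc)
```

Identical images give zero, and the scaling makes the metric comparable across phantoms with different overall scatter magnitudes.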
Enhanced spot preparation for liquid extractive sampling and analysis
Van Berkel, Gary J.; King, Richard C.
2015-09-22
A method for performing surface sampling of an analyte, includes the step of placing the analyte on a stage with a material in molar excess to the analyte, such that analyte-analyte interactions are prevented and the analyte can be solubilized for further analysis. The material can be a matrix material that is mixed with the analyte. The material can be provided on a sample support. The analyte can then be contacted with a solvent to extract the analyte for further processing, such as by electrospray mass spectrometry.
Get to Know Your Neighborhood Pest: An Interdisciplinary Project for Middle School Students.
ERIC Educational Resources Information Center
Zipko, Stephen J.
1982-01-01
Describes an interdisciplinary, month-long minicourse project focusing on the gypsy moth. The project provided students with opportunities to develop analytical and problem-solving skills while studying content from entomology, botany, chemistry, toxicology, ecology, math, art, law, political science, history, English, consumer studies, and…
Intelligent Vehicle Mobility M&S Capability Development (FY13 innovation Project) (Briefing Charts)
2014-05-19
Intelligent Vehicle Mobility M&S Capability Development (FY13 Innovation Project). Briefing charts by P. Jayakumar and J. Raymond, Analytics, 19 May 2014.
NASA Astrophysics Data System (ADS)
Foufoula-Georgiou, E.; Tessler, Z. D.; Brondizio, E.; Overeem, I.; Renaud, F.; Sebesvari, Z.; Nicholls, R. J.; Anthony, E.
2016-12-01
Deltas are highly dynamic and productive environments: they are food baskets of the world, home to biodiverse and rich ecosystems, and they play a central role in food and water security. However, they are becoming increasingly vulnerable to risks arising from human activities, land subsidence, regional water management, global sea-level rise, and climate extremes. Our Belmont Forum DELTAS project (BF-DELTAS: Catalyzing actions towards delta sustainability) encompasses an international network of interdisciplinary research collaborators with focal areas in the Mekong, Ganges-Brahmaputra, and Amazon deltas. The project is organized around five main modules: (1) developing an analytical framework for assessing delta vulnerability and scenarios of change (Delta-SRES), (2) developing an open-access, science-based integrative modeling framework for risk assessment and decision support (Delta-RADS), (3) developing tools to support quantitative mapping of the bio-physical and socio-economic environments of deltas and consolidating bio-physical and social data within shared data repositories (Delta-DAT), (4) developing Global Delta Vulnerability Indices (Delta-GDVI) that capture current and projected scenarios for major deltas around the world, and (5) collaborating with regional stakeholders to put the science, modeling, and data into action (Delta-ACT). In this talk, a research summary will be presented on three research domains around which significant collaborative work was developed: advancing biophysical classification of deltas, understanding deltas as coupled socio-ecological systems, and analyzing and informing social and environmental vulnerabilities in delta regions.
Development of a second generation biofiltration system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleinheinz, G.T.; McGinnis, G.D.; Niemi, B.A.
1999-07-01
Biofiltration utilizes microbial processes which are immobilized on a solid support to biodegrade contaminants in air. Biofilters traditionally have been utilized in applications where there is a high volume of air containing low levels of compounds. There are several operational problems biofilters are currently encountering. Some of these problems include systems which are very large, microbial breakdown of the solid support, cycling of compounds onto the biofilters (uneven amounts of compounds in the air), and very short residence times in the biofiltration units. This project was undertaken to determine the feasibility of using physical/chemical methods to adsorb and then desorb analytes to convert a dilute, high volume air stream to a more concentrated low volume air stream. The chemical/physical (adsorption/desorption) system will also serve to provide a relatively consistent air stream to the biofiltration units in order to alleviate the perturbations to the system as a result of uneven analyte concentrations. The ability to concentrate a dilute air stream and provide a constant stream of VOCs to the biofiltration unit will allow for smaller, more efficient, and more economical biofilters. Two years of laboratory studies and initial pilot-scale trials on these coupled systems have shown that they are indeed able to efficiently concentrate dilute streams, and the coupled biofilters are able to remove 90+% of the VOCs from the adsorption/desorption unit.
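The concentrating effect of the adsorption/desorption stage follows from a simple steady-state mass balance: mass captured from the dilute, high-volume stream is released into a much smaller outlet stream. This sketch uses invented flow rates and efficiencies, not figures from the project:

```python
def biofilter_inlet_concentration(q_in, c_in, capture_eff, q_out):
    """Steady-state mass balance for the adsorption/desorption stage:
    all captured VOC mass from a dilute, high-volume stream (flow q_in,
    concentration c_in) is assumed to desorb into a low-volume stream
    (flow q_out) feeding the biofilter. Parameter names are ours,
    not the report's."""
    captured_mass_rate = q_in * c_in * capture_eff  # e.g. mg per minute
    return captured_mass_rate / q_out               # outlet concentration

# 100 m^3/min at 10 mg/m^3 with 95% capture, desorbed into 5 m^3/min:
c_out = biofilter_inlet_concentration(100.0, 10.0, 0.95, 5.0)
```

The concentration gain scales with the flow ratio q_in/q_out, which is why the coupled system permits much smaller biofilters for the same mass loading.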
Analytical Sociology: A Bungean Appreciation
ERIC Educational Resources Information Center
Wan, Poe Yu-ze
2012-01-01
Analytical sociology, an intellectual project that has garnered considerable attention across a variety of disciplines in recent years, aims to explain complex social processes by dissecting them, accentuating their most important constituent parts, and constructing appropriate models to understand the emergence of what is observed. To achieve…
Merging Old and New: An Instrumentation-Based Introductory Analytical Laboratory
ERIC Educational Resources Information Center
Jensen, Mark B.
2015-01-01
An instrumentation-based laboratory curriculum combining traditional unknown analyses with student-designed projects has been developed for an introductory analytical chemistry course. In the first half of the course, students develop laboratory skills and instrumental proficiency by rotating through six different instruments performing…
Review and assessment of the HOST turbine heat transfer program
NASA Technical Reports Server (NTRS)
Gladden, Herbert J.
1988-01-01
The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena occurring in high-performance gas turbine engines and to assess and improve the analytical methods used to predict the fluid dynamics and heat transfer phenomena. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. Therefore, a building-block approach was utilized, with research ranging from the study of fundamental phenomena and analytical modeling to experiments in simulated real-engine environments. Experimental research accounted for 75 percent of the project, and analytical efforts accounted for approximately 25 percent. Extensive experimental datasets were created depicting the three-dimensional flow field, high free-stream turbulence, boundary-layer transition, blade tip region heat transfer, film cooling effects in a simulated engine environment, rough-wall cooling enhancement in a rotating passage, and rotor-stator interaction effects. In addition, analytical modeling of these phenomena was initiated using boundary-layer assumptions as well as Navier-Stokes solutions.
ALCF Data Science Program: Productive Data-centric Supercomputing
NASA Astrophysics Data System (ADS)
Romero, Nichols; Vishwanath, Venkatram
The ALCF Data Science Program (ADSP) is targeted at big data science problems that require leadership computing resources. The goal of the program is to explore and improve a variety of computational methods that will enable data-driven discoveries across all scientific disciplines. The projects will focus on data science techniques covering a wide area of discovery including but not limited to uncertainty quantification, statistics, machine learning, deep learning, databases, pattern recognition, image processing, graph analytics, data mining, real-time data analysis, and complex and interactive workflows. Project teams will be among the first to access Theta, ALCF's forthcoming 8.5-petaflops Intel/Cray system. The program will transition to the 200-petaflops Aurora supercomputing system when it becomes available. In 2016, four projects were selected to kick off the ADSP. The selected projects span experimental and computational sciences and range from modeling the brain to discovering new materials for solar-powered windows to simulating collision events at the Large Hadron Collider (LHC). The program will have a regular call for proposals, with the next call expected in Spring 2017. See http://www.alcf.anl.gov/alcf-data-science-program for details. This research used resources of the ALCF, which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357.
Selected Analytical Methods for Environmental Remediation ...
The US Environmental Protection Agency’s Office of Research and Development (ORD) conducts cutting-edge research that provides the underpinning of science and technology for public health and environmental policies and decisions made by federal, state and other governmental organizations. ORD’s six research programs identify the pressing research needs with input from EPA offices and stakeholders. Research is conducted by ORD’s 3 labs, 4 centers, and 2 offices located in 14 facilities. The EPA booth at APHL will have several resources available to attendees, mostly in the form of print materials, that showcase our research labs, case studies of research activities, and descriptions of specific research projects. The Selected Analytical Methods for Environmental Remediation and Recovery (SAM), a library of selected methods that are helping to increase the nation's laboratory capacity to support large-scale emergency response operations, will be demonstrated by EPA scientists at the APHL Experience booth in the Exhibit Hall on Tuesday during the morning break. Please come to EPA booth #309 for more information!
Quantum Quench Dynamics in the Transverse Field Ising Model at Non-zero Temperatures
NASA Astrophysics Data System (ADS)
Abeling, Nils; Kehrein, Stefan
The recently discovered Dynamical Phase Transition denotes non-analytic behavior in the real-time evolution of quantum systems in the thermodynamic limit and has been shown to occur in different systems at zero temperature [Heyl et al., Phys. Rev. Lett. 110, 135704 (2013)]. In this talk we present the extension of the analysis to non-zero temperature by studying a generalized form of the Loschmidt echo, the work distribution function of a quantum quench in the transverse field Ising model. Although the quantitative behavior at non-zero temperatures still displays features derived from the zero-temperature non-analyticities, it is shown that in this model dynamical phase transitions do not exist if T > 0. This is a consequence of the system being initialized in a thermal state. Moreover, we elucidate how the Tasaki-Crooks-Jarzynski relation can be exploited as a symmetry relation for a global quench or to obtain the change of the equilibrium free energy density. This work was supported through CRC SFB 1073 (Project B03) of the Deutsche Forschungsgemeinschaft (DFG).
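For reference, the zero-temperature objects being generalized here are conventionally defined as follows (standard textbook definitions, not necessarily the talk's exact notation):

```latex
\mathcal{L}(t) = \left| \langle \psi_0 | e^{-iHt} | \psi_0 \rangle \right|^2,
\qquad
l(t) = -\lim_{N\to\infty} \frac{1}{N} \ln \mathcal{L}(t),
```

with dynamical phase transitions signaled by non-analyticities of the rate function $l(t)$ at critical times. The Tasaki-Crooks fluctuation relation mentioned above is usually written as

```latex
\frac{P_F(W)}{P_B(-W)} = e^{\beta \left( W - \Delta F \right)},
```

which links the forward and backward work distributions of the quench to the equilibrium free-energy change $\Delta F$, explaining how the work distribution can serve both as a symmetry relation and as a route to $\Delta F$.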
Kockmann, Tobias; Trachsel, Christian; Panse, Christian; Wahlander, Asa; Selevsek, Nathalie; Grossmann, Jonas; Wolski, Witold E; Schlapbach, Ralph
2016-08-01
Quantitative mass spectrometry is a rapidly evolving methodology applied in a large number of omics-type research projects. During the past years, new designs of mass spectrometers have been developed and launched as commercial systems, while in parallel new data acquisition schemes and data analysis paradigms have been introduced. Core facilities provide access to such technologies, but also actively support researchers in finding and applying the best-suited analytical approach. In order to implement a solid fundament for this decision-making process, core facilities need to constantly compare and benchmark the various approaches. In this article we compare the quantitative accuracy and precision of the current state-of-the-art targeted proteomics approaches single reaction monitoring (SRM), parallel reaction monitoring (PRM) and data-independent acquisition (DIA) across multiple liquid chromatography mass spectrometry (LC-MS) platforms, using a readily available commercial standard sample. All workflows are able to reproducibly generate accurate quantitative data. However, SRM and PRM workflows show higher accuracy and precision compared to DIA approaches, especially when analyzing analytes at low concentrations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Conversion of an atomic to a molecular argon ion and low pressure argon relaxation
NASA Astrophysics Data System (ADS)
Stankov, M. N.; Jovanović, A. P.; Marković, V. Lj.; Stamenković, S. N.
2016-01-01
The dominant process in the relaxation of a DC glow discharge between two plane-parallel electrodes in argon at a pressure of 200 Pa is analyzed by measuring the breakdown time delay and by analytical and numerical models. By using the approximate analytical model it is found that the relaxation in the range from 20 to 60 ms in the afterglow is dominated by ions produced by atomic-to-molecular conversion of Ar+ ions in the first several milliseconds after the cessation of the discharge. This conversion is confirmed by the presence of a double-Gaussian distribution for the formative time delay, as well as conversion maxima in a set of memory curves measured in different conditions. Finally, a numerical one-dimensional (1D) model for determining the number densities of dominant particles in the stationary DC glow discharge and a two-dimensional (2D) model for the relaxation are used to confirm the previous assumptions and to determine the corresponding collision and transport coefficients of the dominant species and processes. Project supported by the Ministry of Education, Science and Technological Development of the Republic of Serbia (Grant No. ON171025).
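The double-Gaussian distribution cited as evidence for two seeding ion species can be modeled as a simple two-component function, each peak corresponding to one species. The parameter values below are illustrative only, not fitted values from the paper:

```python
import numpy as np

def double_gaussian(t, a1, mu1, s1, a2, mu2, s2):
    """Two-component Gaussian model for a bimodal formative time-delay
    distribution: amplitude a, center mu, and width s for each peak."""
    g = lambda x, a, mu, s: a * np.exp(-0.5 * ((x - mu) / s) ** 2)
    return g(t, a1, mu1, s1) + g(t, a2, mu2, s2)

# Evaluate the model on a time grid with two well-separated peaks
# (hypothetical parameters, e.g. in milliseconds).
t = np.linspace(0.0, 10.0, 501)
y = double_gaussian(t, 1.0, 3.0, 0.5, 0.6, 7.0, 0.8)
```

In practice the six parameters would be fitted to the measured time-delay histogram (for example with `scipy.optimize.curve_fit`), and a clearly resolved second peak supports the two-species interpretation.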
Manufacturing data analytics using a virtual factory representation.
Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun
2017-01-01
Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach within the frameworks of Design Science Research Methodology and prototyping to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing via reduction of the development effort. The use of manufacturing simulation models is presented as data analytics applications themselves and for supporting other data analytics applications by serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. The virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities, thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.
Hervatis, Vasilis; Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil
2015-10-06
Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve the quality of education. Analytics, the use of data to generate insights and support decisions, has been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research on academic and learning analytics has mainly focused on technical issues; the focus of this study is its practical implementation in the setting of health care education. The aim of this study is to create a conceptual model for a deeper understanding of the process of synthesizing and transforming data into information to support educators' decision making. A deductive case study approach was applied to develop the conceptual model. The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach.
Loe, Alan; Barman, Linda; O'Donoghue, John; Zary, Nabil
2015-01-01
Background Preparing the future health care professional workforce in a changing world is a significant undertaking. Educators and other decision makers look to evidence-based knowledge to improve the quality of education. Analytics, the use of data to generate insights and support decisions, has been applied successfully across numerous application domains. Health care professional education is one area where great potential is yet to be realized. Previous research on academic and learning analytics has mainly focused on technical issues; the focus of this study is its practical implementation in the setting of health care education. Objective The aim of this study is to create a conceptual model for a deeper understanding of the process of synthesizing and transforming data into information to support educators’ decision making. Methods A deductive case study approach was applied to develop the conceptual model. Results The analytics loop works both in theory and in practice. The conceptual model encompasses the underlying data, the quality indicators, and decision support for educators. Conclusions The model illustrates how a theory can be applied to a traditional data-driven analytics approach, and alongside the context- or need-driven analytics approach. PMID:27731840
Towards an Analytic Foundation for Network Architecture
2010-12-31
In this project, we develop the analytic tools of stochastic optimization for wireless network design and apply them… Representative publication: …and Mung Chiang, “DaVinci: Dynamically Adaptive Virtual Networks for a Customized Internet,” in Proc. ACM SIGCOMM CoNEXT Conference, December 2008.
University Macro Analytic Simulation Model.
ERIC Educational Resources Information Center
Baron, Robert; Gulko, Warren
The University Macro Analytic Simulation System (UMASS) has been designed as a forecasting tool to help university administrators make budgeting decisions. Alternative budgeting strategies can be tested on a computer model, and an operational alternative can then be selected on the basis of the most desirable projected outcome. UMASS uses readily…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Covered are: analytical laboratory operations (ALO) sample receipt and control; ALO data report/package preparation, review, and control; the single-shell tank (SST) project sample tracking system; sample receiving; analytical balances; duties and responsibilities of the sample custodian; sample refrigerator temperature monitoring; security; assignment of staff responsibilities; sample storage; data reporting; and general requirements for glassware.
Applications of Earth Observations for Fisheries Management: An analysis of socioeconomic benefits
NASA Astrophysics Data System (ADS)
Friedl, L.; Kiefer, D. A.; Turner, W.
2013-12-01
This paper will discuss the socioeconomic impacts of a project applying Earth observations and models to support management and conservation of tuna and other marine resources in the eastern Pacific Ocean. A project team created a software package that produces statistical analyses and dynamic maps of habitat for pelagic ocean biota. The tool integrates sea surface temperature and chlorophyll imagery from MODIS, ocean circulation models, and other data products. The project worked with the Inter-American Tropical Tuna Commission, which issues fishery management information, such as stock assessments, for the eastern Pacific region. The Commission uses the tool and broader habitat information to produce better estimates of stock and thus improve their ability to identify species that could be at risk of overfishing. The socioeconomic analysis quantified the relative value that Earth observations contributed to accurate stock size assessments through improvements in calculating population size. The analysis team calculated the first-order economic costs of a fishery collapse (or shutdown), and they calculated the benefits of improved estimates that reduce the uncertainty of stock size and thus reduce the risk of fishery collapse. The team estimated that the project reduced the probability of collapse of different fisheries, and the analysis generated net present values of risk mitigation. USC led the project with sponsorship from the NASA Earth Science Division's Applied Sciences Program, which conducted the socioeconomic impact analysis. The paper will discuss the project and focus primarily on the analytic methods, impact metrics, and the results of the socioeconomic benefits analysis.
Developing accreditation for community based surgery: the Irish experience.
Ní Riain, Ailís; Collins, Claire; O'Sullivan, Tony
2018-02-05
Purpose Carrying out minor surgery procedures in the primary care setting is popular with patients, is cost-effective and delivers at least as good outcomes as those performed in the hospital setting. This paper aims to describe the central role of clinical leadership in developing an accreditation system for general practitioners (GPs) undertaking community-based surgery in the Irish national setting, where no mandatory accreditation process currently exists. Design/methodology/approach In all, 24 GPs were recruited to the GP network. Ten pilot standards were developed, addressing GPs' experience and training, clinical activity, and supporting practice infrastructure, and were tested using information and document review, prospective collection of clinical data, and a practice inspection visit. Two additional components were incorporated into the project (a patient satisfaction survey and self-audit). A multi-modal evaluation was undertaken. A majority of GPs were included at all stages of the project, in line with the principles of action learning. The steering group had a majority of GPs with relevant expertise and representation of all other actors in the minor surgery arena. The GP research network contributed to each stage of the project. The project lead was a GP with minor surgery experience. Quantitative data were analysed using Predictive Analytics SoftWare. Krueger's framework analysis approach was used to analyse the qualitative data. Findings A total of 9 GPs achieved all standards at initial review, 14 successfully completed corrective actions and 1 GP did not achieve the required standard. Standards were then amended to reflect findings, and a supporting framework was developed. Originality/value The flexibility of the action-learning approach and the clinical leadership design allowed for the development of robust quality standards in a short timeframe.
Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter
2005-02-15
A novel analytical system, AWACSS (automated water analyser computer-supported system), based on immunochemical technology has been developed that can measure several organic pollutants at the low nanogram-per-litre level in a single few-minutes analysis without any prior sample pre-concentration or pre-treatment steps. With the actual needs of water-sector managers related to the implementation of the Drinking Water Directive (DWD) (98/83/EC, 1998) and the Water Framework Directive (WFD) (2000/60/EC, 2000) in mind, drinking, ground, surface, and waste waters were the major media used for the evaluation of system performance. The instrument was equipped with remote control and surveillance facilities. The system's software allows for internet-based networking between the measurement and control stations, global management, trend analysis, and early-warning applications. The experience of water laboratories was utilised in the design of the instrument's hardware and software in order to make the system rugged and user-friendly. Several market surveys were conducted during the project to assess the applicability of the final system. A web-based AWACSS database was created for automated evaluation and storage of the obtained data in a format compatible with major databases of environmental organic pollutants in Europe. This first of two articles gives the reader an overview of the aims and scope of the AWACSS project as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second article reports on the system performance, first real-sample measurements, and an international collaborative trial (inter-laboratory tests) comparing the biosensor with conventional analytical methods.
Large capacity oblique all-wing transport aircraft
NASA Technical Reports Server (NTRS)
Galloway, Thomas L.; Phillips, James A.; Kennelly, Robert A., Jr.; Waters, Mark H.
1996-01-01
Dr. R. T. Jones first developed the theory for oblique-wing aircraft in 1952, and in subsequent years numerous analytical and experimental projects conducted at NASA Ames and elsewhere have established that Jones' oblique-wing theory is correct. Until the late 1980s, all proposed oblique-wing configurations were wing/body aircraft with the wing mounted on a pivot. With the emerging requirement for commercial transports with very large payloads, 450-800 passengers, Jones proposed a supersonic oblique flying wing in 1988. For such an aircraft, all payload, fuel, and systems are carried within the wing, and the wing is designed with variable sweep to maintain a fixed subsonic normal Mach number. Engines and vertical tails are mounted on pivots supported from the primary structure of the wing. The oblique flying wing transport has come to be known as the Oblique All-Wing (OAW) transport. This presentation gives the highlights of the OAW project, which was to study the total concept of the OAW as a commercial transport.
Flat-plate solar array project. Volume 7: Module encapsulation
NASA Astrophysics Data System (ADS)
Cuddihy, E.; Coulbert, C.; Gupta, A.; Liang, R.
1986-10-01
The objective of the Encapsulation Task was to develop, demonstrate, and qualify photovoltaic (PV) module encapsulation systems that would provide 20 year (later decreased to 30 year) life expectancies in terrestrial environments, and which would be compatible with the cost and performance goals of the Flat-Plate Solar Array (FSA) Project. The scope of the Encapsulation Task included the identification, development, and evaluation of material systems and configurations required to support and protect the optically and electrically active solar cell circuit components in the PV module operating environment. Encapsulation material technologies summarized include the development of low cost ultraviolet protection techniques, stable low cost pottants, soiling resistant coatings, electrical isolation criteria, processes for optimum interface bonding, and analytical and experimental tools for evaluating the long term durability and structural adequacy of encapsulated modules. Field testing, accelerated stress testing, and design studies have demonstrated that encapsulation materials, processes, and configurations are available that meet the FSA cost and performance goals.
Flat-plate solar array project. Volume 7: Module encapsulation
NASA Technical Reports Server (NTRS)
Cuddihy, E.; Coulbert, C.; Gupta, A.; Liang, R.
1986-01-01
The objective of the Encapsulation Task was to develop, demonstrate, and qualify photovoltaic (PV) module encapsulation systems that would provide 20 year (later decreased to 30 year) life expectancies in terrestrial environments, and which would be compatible with the cost and performance goals of the Flat-Plate Solar Array (FSA) Project. The scope of the Encapsulation Task included the identification, development, and evaluation of material systems and configurations required to support and protect the optically and electrically active solar cell circuit components in the PV module operating environment. Encapsulation material technologies summarized include the development of low cost ultraviolet protection techniques, stable low cost pottants, soiling resistant coatings, electrical isolation criteria, processes for optimum interface bonding, and analytical and experimental tools for evaluating the long term durability and structural adequacy of encapsulated modules. Field testing, accelerated stress testing, and design studies have demonstrated that encapsulation materials, processes, and configurations are available that meet the FSA cost and performance goals.
Velocity Profiles of Slow Blood Flow in a Narrow Tube
NASA Astrophysics Data System (ADS)
Chen, Jinyu; Huang, Zuqia; Zhuang, Fengyuan; Zhang, Hui
1998-04-01
A fractal model is introduced for slow blood motion. When blood flows slowly in a narrow tube, red cell aggregation results in the formation of an approximately cylindrical core of red cells. By introducing the fractal model and using the power-law relation between the area fraction φ and the distance ρ from the tube axis, rigorous velocity profiles of the fluid in and outside the aggregated core, and of the core itself, are obtained analytically for different fractal dimensions. The result shows a blunted velocity distribution for a relatively large fractal dimension (D ˜ 2), which can be observed in normal blood; a pathological velocity profile for moderate dimension (D = 1), which is similar to the Segre-Silberberg effect; and a parabolic profile for negligible red cell concentration (D = 0), as in Poiseuille flow. The project was supported by the National Basic Research Project "Nonlinear Science", the National Natural Science Foundation of China, and the State Education Commission through the Foundation of Doctoral Training.
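The qualitative trend described above (parabolic flow for negligible aggregation, progressively blunted plug-like flow as the aggregated core grows) can be illustrated with a generic power-law profile. The model form u(r) = U(1 - (r/R)^k) and its parameter names are illustrative assumptions, not the paper's exact solutions:

```python
def velocity_profile(r, R=1.0, U=1.0, k=2.0):
    """Generic power-law tube velocity profile (illustrative only).

    k = 2 recovers the parabolic Poiseuille profile; larger k gives the
    blunted, plug-like core associated with red-cell aggregation.
    """
    return U * (1.0 - (r / R) ** k)

# At mid-radius the blunted profile stays much closer to the centreline speed:
parabolic = velocity_profile(0.5, k=2.0)  # 0.75
blunted = velocity_profile(0.5, k=8.0)    # ~0.996
```

Both profiles vanish at the wall (r = R); only the flatness of the core changes with the exponent.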
Incompressible flow simulations on regularized moving meshfree grids
NASA Astrophysics Data System (ADS)
Vasyliv, Yaroslav; Alexeev, Alexander
2017-11-01
A moving grid meshfree solver for incompressible flows is presented. To solve for the flow field, a semi-implicit approximate projection method is directly discretized on meshfree grids using General Finite Differences (GFD) with sharp interface stencil modifications. To maintain a regular grid, an explicit shift is used to relax compressed pseudosprings connecting a star node to its cloud of neighbors. The following test cases are used for validation: the Taylor-Green vortex decay, the analytic and modified lid-driven cavities, and an oscillating cylinder enclosed in a container for a range of Reynolds number values. We demonstrate that 1) the grid regularization does not impede the second order spatial convergence rate, 2) the Courant condition can be used for time marching but the projection splitting error reduces the convergence rate to first order, and 3) moving boundaries and arbitrary grid distortions can readily be handled. Financial support provided by the National Science Foundation (NSF) Graduate Research Fellowship, Grant No. DGE-1148903.
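The Taylor-Green vortex used for validation above has a closed-form decaying solution, which is what makes it a convenient analytic benchmark. A minimal check, assuming the standard 2D form of the solution (the function names here are not from the paper):

```python
import math

def taylor_green(x, y, t, nu=0.1):
    """Analytic 2D Taylor-Green vortex velocity field (standard form assumed)."""
    decay = math.exp(-2.0 * nu * t)
    u = math.sin(x) * math.cos(y) * decay
    v = -math.cos(x) * math.sin(y) * decay
    return u, v

def divergence(x, y, t, nu=0.1, h=1e-6):
    """Central-difference estimate of du/dx + dv/dy."""
    du_dx = (taylor_green(x + h, y, t, nu)[0] - taylor_green(x - h, y, t, nu)[0]) / (2 * h)
    dv_dy = (taylor_green(x, y + h, t, nu)[1] - taylor_green(x, y - h, t, nu)[1]) / (2 * h)
    return du_dx + dv_dy

# The field is divergence-free everywhere and velocities decay as
# exp(-2*nu*t), so a solver's spatial and temporal errors can be
# measured directly against these exact values.
```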
AMTD - Advanced Mirror Technology Development in Mechanical Stability
NASA Technical Reports Server (NTRS)
Knight, J. Brent
2015-01-01
Analytical tools and processes are being developed at NASA Marshall Space Flight Center in support of the Advanced Mirror Technology Development (AMTD) project. One facet of optical performance is mechanical stability with respect to structural dynamics. Pertinent parameters are: (1) the spacecraft structural design, (2) the mechanical disturbances on board the spacecraft (sources of vibratory/transient motion such as reaction wheels), (3) the vibration isolation systems (invariably required to meet future science needs), and (4) the dynamic characteristics of the optical system itself. With stability requirements of future large-aperture space telescopes being in the low picometer regime, it is paramount that all sources of mechanical excitation be considered in both feasibility studies and detailed analyses. The primary objective of this paper is to lay out a path for performing feasibility studies of future large-aperture space telescope projects that require extreme stability. To that end, a high-level overview of a structural dynamic analysis process to assess an integrated spacecraft and optical system is included.
NASA Fundamental Remote Sensing Science Research Program
NASA Technical Reports Server (NTRS)
1984-01-01
The NASA Fundamental Remote Sensing Research Program is described. The program provides a dynamic scientific base which is continually broadened and from which future applied research and development can draw support. In particular, the overall objectives and current studies of the scene radiation and atmospheric effect characterization (SRAEC) project are reviewed. The SRAEC research can be generically structured into four types of activities including observation of phenomena, empirical characterization, analytical modeling, and scene radiation analysis and synthesis. The first three activities are the means by which the goal of scene radiation analysis and synthesis is achieved, and thus are considered priority activities during the early phases of the current project. Scene radiation analysis refers to the extraction of information describing the biogeophysical attributes of the scene from the spectral, spatial, and temporal radiance characteristics of the scene including the atmosphere. Scene radiation synthesis is the generation of realistic spectral, spatial, and temporal radiance values for a scene with a given set of biogeophysical attributes and atmospheric conditions.
Visualizing nD Point Clouds as Topological Landscape Profiles to Guide Local Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oesterling, Patrick; Heine, Christian; Weber, Gunther H.
2012-05-04
Analyzing high-dimensional point clouds is a classical challenge in visual analytics. Traditional techniques, such as projections or axis-based techniques, suffer from projection artifacts, occlusion, and visual complexity. We propose to split data analysis into two parts to address these shortcomings. First, a structural overview phase abstracts data by its density distribution. This phase performs topological analysis to support accurate and non-overlapping presentation of the high-dimensional cluster structure as a topological landscape profile. Utilizing a landscape metaphor, it presents clusters and their nesting as hills whose height, width, and shape reflect cluster coherence, size, and stability, respectively. A second local analysis phase utilizes this global structural knowledge to select individual clusters or point sets for further, localized data analysis. Focusing on structural entities significantly reduces visual clutter in established geometric visualizations and permits a clearer, more thorough data analysis. In conclusion, this analysis complements the global topological perspective and enables the user to study subspaces or geometric properties, such as shape.
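The structural overview phase rests on a topological analysis of the density function: clusters are born at density maxima, merge at saddles, and their persistence measures stability. A minimal one-dimensional sketch of that idea using union-find (the function name and the restriction to 1D samples are simplifications, not the authors' nD implementation):

```python
def density_clusters(density):
    """0-dimensional persistence of a sampled 1D density function.

    Samples are processed from the highest density downward; a cluster
    is born at a local maximum and dies when it merges into a cluster
    with a higher peak. Each returned (birth, death) pair is one hill
    of the landscape; birth - death is its persistence (stability).
    """
    order = sorted(range(len(density)), key=lambda i: -density[i])
    parent = {}

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    peak = {}    # cluster root -> density of its highest sample
    pairs = []
    for i in order:
        parent[i] = i
        peak[i] = density[i]
        for j in (i - 1, i + 1):   # 1D neighbours only
            if j in parent:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                young, old = sorted((ri, rj), key=lambda r: peak[r])
                if peak[young] > density[i]:
                    pairs.append((peak[young], density[i]))
                parent[young] = old
                peak[old] = max(peak[old], peak[young])
    return pairs

# Two hills of heights 5 and 3 separated by a valley of height 1:
print(density_clusters([0, 5, 1, 3, 0]))  # [(3, 1)]
```

The lower-peaked hill (height 3) merges into the taller one at the valley (height 1), giving it persistence 2; the tallest hill survives as the global cluster.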
NASA Astrophysics Data System (ADS)
Oluz, Zehra; Nayab, Sana; Kursun, Talya Tugana; Caykara, Tuncer; Yameen, Basit; Duran, Hatice
Azo-initiator-modified surfaces of silica nanoparticles were coated via reversible addition-fragmentation chain-transfer (RAFT) polymerization of methacrylic acid and ethylene glycol dimethacrylate, using 2-phenylprop-2-yl dithiobenzoate as the chain transfer agent. Using L-phenylalanine anilide as a template during polymerization led to molecularly imprinted nanoparticles. RAFT polymerization offers efficient control of the grafting process, while molecularly imprinted polymers show enhanced capacity as sensors. The L-phenylalanine anilide imprinted silica particles were characterized by X-ray photoelectron spectroscopy (XPS) and atomic force microscopy (AFM). Performance of the particles was followed by surface plasmon resonance spectroscopy (SPR), after coating the final product on a gold-deposited glass substrate, against four analogues of the analyte molecule: D-phenylalanine anilide, L-tyrosine, L-tryptophan and L-phenylalanine. The characterizations indicated that the polymer-coated silica particles do contain binding sites for L-phenylalanine anilide and are highly selective for the molecule of interest. This project was supported by TUBITAK (Project No: 112M804).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean
A new field of research, visual analytics, has recently been introduced. This has been defined as “the science of analytical reasoning facilitated by visual interfaces.” Visual analytic environments, therefore, support analytical reasoning using visual representations and interactions, with data representations and transformation capabilities, to support production, presentation and dissemination. As researchers begin to develop visual analytic environments, it will be advantageous to develop metrics and methodologies to help researchers measure the progress of their work and understand the impact their work will have on the users who will work in such environments. This paper presents five areas or aspects of visual analytic environments that should be considered as metrics and methodologies for evaluation are developed. Evaluation aspects need to include usability, but it is necessary to go beyond basic usability. The areas of situation awareness, collaboration, interaction, creativity, and utility are proposed as areas for initial consideration. The steps that need to be undertaken to develop systematic evaluation methodologies and metrics for visual analytic environments are outlined.
World Energy Projection System Plus Model Documentation: Coal Module
2011-01-01
This report documents the objectives, analytical approach, and development of the World Energy Projection System Plus (WEPS+) Coal Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Transportation Module
2017-01-01
This report documents the objectives, analytical approach, and development of the World Energy Projection System Plus (WEPS+) International Transportation Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Residential Module
2016-01-01
This report documents the objectives, analytical approach, and development of the World Energy Projection System Plus (WEPS+) Residential Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Refinery Module
2016-01-01
This report documents the objectives, analytical approach, and development of the World Energy Projection System Plus (WEPS+) Refinery Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Main Module
2016-01-01
This report documents the objectives, analytical approach, and development of the World Energy Projection System Plus (WEPS+) Main Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Electricity Module
2017-01-01
This report documents the objectives, analytical approach, and development of the World Energy Projection System Plus (WEPS+) World Electricity Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
USDA-ARS?s Scientific Manuscript database
Current methods for generating malting quality metrics have been developed largely to support commercial malting and brewing operations, providing accurate, reproducible analytical data to guide malting and brewing production. Infrastructure to support these analytical operations often involves sub...
ERIC Educational Resources Information Center
Mavroudi, Anna; Giannakos, Michail; Krogstie, John
2018-01-01
Learning Analytics (LA) and adaptive learning are inextricably linked since they both foster technology-supported learner-centred education. This study identifies developments focusing on their interplay and emphasises insufficiently investigated directions which display a higher innovation potential. Twenty-one peer-reviewed studies are…
Understanding, Evaluating, and Supporting Self-Regulated Learning Using Learning Analytics
ERIC Educational Resources Information Center
Roll, Ido; Winne, Philip H.
2015-01-01
Self-regulated learning is an ongoing process rather than a single snapshot in time. Naturally, the field of learning analytics, focusing on interactions and learning trajectories, offers exciting opportunities for analyzing and supporting self-regulated learning. This special section highlights the current state of research at the intersection of…
Human Capital Analytics to Manage the Army Officer Population
2017-06-09
…employees from spending time and energy on a career path projected to be obsolete. Instead, managers are able to use data to show employees where they… A thesis presented to the Faculty of the U.S. Army Command and… Dates covered: AUG 2016 – JUNE 2017.
An Analytical Method for Measuring Competence in Project Management
ERIC Educational Resources Information Center
González-Marcos, Ana; Alba-Elías, Fernando; Ordieres-Meré, Joaquín
2016-01-01
The goal of this paper is to present a competence assessment method in project management that is based on participants' performance and value creation. It seeks to close an existing gap in competence assessment in higher education. The proposed method relies on information and communication technology (ICT) tools and combines Project Management…
Sex Differences in Objective and Projective Dependency Tests: A Meta-Analytic Review.
ERIC Educational Resources Information Center
Bornstein, Robert F.
1995-01-01
A meta-analysis of 97 studies published since 1950 that assessed sex differences in scores on objective and projective dependency tests indicated that women consistently obtained higher dependency scores on objective tests, and men obtained higher scores on projective tests. Findings are discussed in terms of sex role socialization. (SLD)
Chilled to the bone: embodied countertransference and unspoken traumatic memories.
Zoppi, Luisa
2017-11-01
Starting from a deeply challenging experience of early embodied countertransference in a first encounter with a new patient, the author explores the issues it raised. Such moments highlight projective identification as well as what Stone (2006) has described as 'embodied resonance in the countertransference'. In these powerful experiences linear time and subject boundaries are altered, and this leads to central questions about analytic work. As well as discussing the uncanny experience at the very beginning of an analytic encounter and its challenges for the analytic field, the author considers 'the time horizon of analytic process' (Hogenson), the relationship between 'moments of complexity and analytic boundaries' (Cambray) and the role of mirror neurons in intersubjective experience. © 2017, The Society of Analytical Psychology.
Big Data Analytics with Datalog Queries on Spark.
Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo
2016-01-01
There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
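The recursion that BigDatalog makes efficient boils down to classic semi-naive fixpoint evaluation: each round joins only the newly derived facts against the base relation. A minimal single-machine sketch of that strategy for transitive closure (plain Python standing in for Spark's distributed joins; the function name is illustrative, not from the paper):

```python
def transitive_closure(edges):
    """Semi-naive evaluation of the Datalog program
        tc(X, Y) :- edge(X, Y).
        tc(X, Z) :- tc(X, Y), edge(Y, Z).
    Only the newly derived facts (the 'delta') are joined against
    edge() each round -- the optimization that recursive Datalog
    engines implement on top of engines like Spark.
    """
    tc = set(edges)
    delta = set(edges)
    while delta:
        # Join the delta against the base relation on the middle variable.
        new = {(x, z) for (x, y) in delta for (y2, z) in edges if y == y2}
        delta = new - tc          # keep only genuinely new facts
        tc |= delta
    return tc

edges = {(1, 2), (2, 3), (3, 4)}
print(sorted(transitive_closure(edges)))
# [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```

Because the delta eventually becomes empty even on cyclic graphs, the loop always terminates at the least fixpoint.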
Big Data Analytics with Datalog Queries on Spark
Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo
2017-01-01
There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics. PMID:28626296
Exploration Laboratory Analysis FY13
NASA Technical Reports Server (NTRS)
Krihak, Michael; Perusek, Gail P.; Fung, Paul P.; Shaw, Tianna L.
2013-01-01
The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) risk, which is stated as the Risk of Inability to Adequately Treat an Ill or Injured Crew Member, and ExMC Gap 4.05: Lack of minimally invasive in-flight laboratory capabilities with limited consumables required for diagnosing identified Exploration Medical Conditions. To mitigate this risk, the availability of in-flight laboratory analysis instrumentation has been identified as an essential capability in future exploration missions. Mission architecture poses constraints on equipment and procedures that will be available to treat evidence-based medical conditions according to the Space Medicine Exploration Medical Conditions List (SMEMCL), and to perform human research studies on the International Space Station (ISS) that are supported by the Human Health and Countermeasures (HHC) element. Since there are significant similarities in the research and medical operational requirements, ELA hardware development has emerged as a joint effort between ExMC and HHC. In 2012, four significant accomplishments were achieved towards the development of exploration laboratory analysis for medical diagnostics. These achievements included (i) the development of high-priority analytes for research and medical operations, (ii) the development of Level 1 functional requirements and concept of operations documentation, (iii) the selection and head-to-head competition of in-flight laboratory analysis instrumentation, and (iv) the phase one completion of the Small Business Innovation Research (SBIR) projects under the topic Smart Phone Driven Blood-Based Diagnostics. To utilize resources efficiently, the associated documentation and advanced technologies were integrated into a single ELA plan that encompasses ExMC and HHC development efforts. The requirements and high-priority analytes were used in the selection of the four in-flight laboratory analysis performers.
Based upon the competition results, a down select process will be performed in the upcoming year. Looking ahead, this unified effort has positioned each element for an in-flight lab analysis demonstration of select diagnostics measurements in the 2015 timeframe.
Dwyer, Johanna T.; Picciano, Mary Frances; Betz, Joseph M.; Fisher, Kenneth D.; Saldanha, Leila G.; Yetley, Elizabeth A.; Coates, Paul M.; Milner, John A.; Whitted, Jackie; Burt, Vicki; Radimer, Kathy; Wilger, Jaimie; Sharpless, Katherine E.; Holden, Joanne M.; Andrews, Karen; Roseland, Janet; Zhao, Cuiwei; Schweitzer, Amy; Harnly, James; Wolf, Wayne R.; Perry, Charles R.
2013-01-01
Although an estimated 50% of adults in the United States consume dietary supplements, analytically substantiated data on their bioactive constituents are sparse. Several programs funded by the Office of Dietary Supplements (ODS) at the National Institutes of Health enhance dietary supplement database development and help to better describe the quantitative and qualitative contributions of dietary supplements to total dietary intakes. ODS, in collaboration with the United States Department of Agriculture, is developing a Dietary Supplement Ingredient Database (DSID) verified by chemical analysis. The products chosen initially for analytical verification are adult multivitamin-mineral supplements (MVMs). These products are widely used, analytical methods are available for determining key constituents, and a certified reference material is in development. Also MVMs have no standard scientific, regulatory, or marketplace definitions and have widely varying compositions, characteristics, and bioavailability. Furthermore, the extent to which actual amounts of vitamins and minerals in a product deviate from label values is not known. Ultimately, DSID will prove useful to professionals in permitting more accurate estimation of the contribution of dietary supplements to total dietary intakes of nutrients and better evaluation of the role of dietary supplements in promoting health and well-being. ODS is also collaborating with the National Center for Health Statistics to enhance the National Health and Nutrition Examination Survey dietary supplement label database. The newest ODS effort explores the feasibility and practicality of developing a database of all dietary supplement labels marketed in the US. This article describes these and supporting projects. PMID:25346570
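The core DSID comparison, how far an analyzed amount deviates from the label claim, can be sketched with a small hypothetical helper (the ingredient and values below are illustrative placeholders, not DSID data):

```python
def percent_deviation(label_value: float, measured_value: float) -> float:
    """Percent deviation of the analytically measured amount from the label claim."""
    return 100.0 * (measured_value - label_value) / label_value

# Hypothetical MVM ingredient: label claims 60 mg vitamin C, analysis finds 69 mg.
dev = percent_deviation(60.0, 69.0)
print(f"deviation from label: {dev:+.1f}%")  # +15.0%
```

Aggregating such deviations across many products is what would let a database like DSID report typical label overages or shortfalls per nutrient.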
Mixed Initiative Visual Analytics Using Task-Driven Recommendations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Kristin A.; Cramer, Nicholas O.; Israel, David
2015-12-07
Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.
Weykamp, Cas; Secchiero, Sandra; Plebani, Mario; Thelen, Marc; Cobbaert, Christa; Thomas, Annette; Jassam, Nuthar; Barth, Julian H; Perich, Carmen; Ricós, Carmen; Faria, Ana Paula
2017-02-01
Optimum patient care in relation to laboratory medicine is achieved when results of laboratory tests are equivalent, irrespective of the analytical platform used or the country where the laboratory is located. Standardization and harmonization minimize differences and the success of efforts to achieve this can be monitored with international category 1 external quality assessment (EQA) programs. An EQA project with commutable samples, targeted with reference measurement procedures (RMPs) was organized by EQA institutes in Italy, the Netherlands, Portugal, UK, and Spain. Results of 17 general chemistry analytes were evaluated across countries and across manufacturers according to performance specifications derived from biological variation (BV). For K, uric acid, glucose, cholesterol and high-density lipoprotein (HDL) cholesterol, the minimum performance specification was met in all countries and by all manufacturers. For Na, Cl, and Ca, the minimum performance specifications were met by none of the countries and manufacturers. For enzymes, the situation was complicated, as standardization of results of enzymes toward RMPs was still not achieved in 20% of the laboratories and questionable in the remaining 80%. The overall performance of the measurement of 17 general chemistry analytes in European medical laboratories met the minimum performance specifications. In this general picture, there were no significant differences per country and no significant differences per manufacturer. There were major differences between the analytes. There were six analytes for which the minimum quality specifications were not met and manufacturers should improve their performance for these analytes. Standardization of results of enzymes requires ongoing efforts.
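Performance specifications derived from biological variation are conventionally built from the within-subject and between-subject CVs; a minimal sketch of the usual relations, with illustrative CV values rather than the study's data (the tier multipliers follow the common desirable/minimum/optimum convention):

```python
import math

def bv_specs(cv_i: float, cv_g: float, tier: float = 1.0):
    """Analytical performance specifications from biological variation.

    cv_i: within-subject biological CV (%); cv_g: between-subject biological CV (%).
    tier: 1.0 = desirable, 1.5 = minimum, 0.5 = optimum specification tier.
    Returns (allowable imprecision, allowable bias, allowable total error), all in %.
    """
    imprecision = tier * 0.5 * cv_i                     # allowable analytical CV
    bias = tier * 0.25 * math.sqrt(cv_i**2 + cv_g**2)   # allowable systematic bias
    total_error = bias + 1.65 * imprecision             # allowable total error (95% limit)
    return imprecision, bias, total_error

# Illustrative tightly regulated analyte (electrolyte-like): CVi = 0.6%, CVg = 0.7%
cv_a, bias, tea = bv_specs(0.6, 0.7, tier=1.5)  # "minimum" specification tier
```

Analytes with very small biological variation, such as Na, Cl, and Ca, yield correspondingly tight specifications, which is why they are the hardest for laboratories and manufacturers to meet.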
Predictive Analytics to Support Real-Time Management in Pathology Facilities.
Lessard, Lysanne; Michalowski, Wojtek; Chen Li, Wei; Amyot, Daniel; Halwani, Fawaz; Banerjee, Diponkar
2016-01-01
Predictive analytics can provide valuable support to the effective management of pathology facilities. The introduction of new tests and technologies in anatomical pathology will increase the volume of specimens to be processed, as well as the complexity of pathology processes. In order for predictive analytics to address managerial challenges associated with the volume and complexity increases, it is important to pinpoint the areas where pathology managers would most benefit from predictive capabilities. We illustrate common issues in managing pathology facilities with an analysis of the surgical specimen process at the Department of Pathology and Laboratory Medicine (DPLM) at The Ottawa Hospital, which processes all surgical specimens for the Eastern Ontario Regional Laboratory Association. We then show how predictive analytics could be used to support management. Our proposed approach can be generalized beyond the DPLM, contributing to a more effective management of pathology facilities and in turn to quicker clinical diagnoses. PMID:28269873
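As a toy illustration of the kind of workload forecasting described above, a facility could fit a simple least-squares trend to daily specimen counts to anticipate the next day's volume (the counts and the plain linear model below are invented for illustration, not the DPLM's data or method; a real deployment would use richer features):

```python
def fit_trend(counts):
    """Ordinary least-squares line through (day_index, count) pairs."""
    n = len(counts)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(counts) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, counts))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical week of surgical-specimen accessions
daily = [410, 425, 431, 440, 452]
slope, intercept = fit_trend(daily)
tomorrow = slope * len(daily) + intercept  # forecast for the next day
```

Even this crude trend gives a manager a quantitative basis for staffing decisions; the article's point is that pinpointing where such predictions help most matters as much as the model itself.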
A Fuzzy-Based Decision Support Model for Selecting the Best Dialyser Flux in Haemodialysis.
Oztürk, Necla; Tozan, Hakan
2015-01-01
Decision making is an important procedure for every organization. The procedure is particularly challenging for complicated multi-criteria problems. Selection of dialyser flux is one of the decisions routinely made for haemodialysis treatment provided for chronic kidney failure patients. This study provides a decision support model for selecting the best dialyser flux between high-flux and low-flux dialyser alternatives. The preferences of decision makers were collected via a questionnaire. A total of 45 questionnaires filled by dialysis physicians and nephrologists were assessed. A hybrid fuzzy-based decision support software that enables the use of Analytic Hierarchy Process (AHP), Fuzzy Analytic Hierarchy Process (FAHP), Analytic Network Process (ANP), and Fuzzy Analytic Network Process (FANP) was used to evaluate the flux selection model. In conclusion, the results showed that a high-flux dialyser is the best option for haemodialysis treatment.
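The AHP step at the core of such models derives criterion weights from a pairwise comparison matrix via its principal eigenvector, then checks judgment consistency; a minimal sketch with an invented 3-criterion Saaty-scale matrix (not the questionnaire data from the study):

```python
# Hypothetical pairwise comparison matrix: A[i][j] is how strongly
# criterion i is preferred over criterion j on Saaty's 1-9 scale.
A = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]

def priority_weights(matrix, iters=100):
    """Principal eigenvector of a positive matrix via power iteration."""
    n = len(matrix)
    w = [1.0 / n] * n
    lam = 0.0
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        lam = sum(v)            # approximates lambda_max, since w sums to 1
        w = [x / lam for x in v]
    return w, lam

weights, lambda_max = priority_weights(A)
ci = (lambda_max - 3) / (3 - 1)   # Saaty consistency index for n = 3
cr = ci / 0.58                    # RI = 0.58 for 3x3; CR < 0.10 is acceptable
```

The fuzzy variants (FAHP, FANP) replace the crisp 1-9 judgments with fuzzy numbers, but the weight-extraction and consistency logic follows the same pattern.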
Ferranti, Jeffrey M; Langman, Matthew K; Tanaka, David; McCall, Jonathan; Ahmad, Asif
2010-01-01
Healthcare is increasingly dependent upon information technology (IT), but the accumulation of data has outpaced our capacity to use it to improve operating efficiency, clinical quality, and financial effectiveness. Moreover, hospitals have lagged in adopting thoughtful analytic approaches that would allow operational leaders and providers to capitalize upon existing data stores. In this manuscript, we propose a fundamental re-evaluation of strategic IT investments in healthcare, with the goal of increasing efficiency, reducing costs, and improving outcomes through the targeted application of health analytics. We also present three case studies that illustrate the use of health analytics to leverage pre-existing data resources to support improvements in patient safety and quality of care, to increase the accuracy of billing and collection, and support emerging health issues. We believe that such active investment in health analytics will prove essential to realizing the full promise of investments in electronic clinical systems. PMID:20190055
A Field Study Program in Analytical Chemistry for College Seniors.
ERIC Educational Resources Information Center
Langhus, D. L.; Flinchbaugh, D. A.
1986-01-01
Describes an elective field study program at Moravian College (Pennsylvania) in which seniors in analytical chemistry obtain first-hand experience at Bethlehem Steel Corporation. Discusses the program's planning phase, some method development projects done by students, experiences received in laboratory operations, and the evaluation of student…
Reimagining Khan Analytics for Student Coaches
ERIC Educational Resources Information Center
Cunningham, Jim
2015-01-01
In this paper, I describe preliminary work on a new research project in learning analytics at Arizona State University. In conjunction with an innovative remedial mathematics course using Khan Academy and student coaches, this study seeks to measure the effectiveness of visualized data in assisting student coaches as they help remedial math…
Exploratory Analysis in Learning Analytics
ERIC Educational Resources Information Center
Gibson, David; de Freitas, Sara
2016-01-01
This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…
Lyons, Sean
2013-12-10
Report documentation page; the recoverable content references the Intelligence Community, Information Operations, Central Intelligence Agency funding of the Recorded Future Company, and the Defense Advanced Research Projects Agency XDATA project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
D.K. Morton
2011-09-01
Following the defunding of the Yucca Mountain Project, it is reasonable to assume that commercial used fuel will remain in storage for the foreseeable future. This report proposes supplementing the ongoing research and development work related to potential degradation of used fuel, baskets, poisons, and storage canisters during an extended period of storage with a parallel path. This parallel path can assure criticality safety during transportation by implementing a concept that achieves moderator exclusion (no in-leakage of moderator into the used fuel cavity). Using updated risk assessment insights for additional technical justification and relying upon a component inside of the transportation cask that provides a watertight function, a strong argument can be made that moderator intrusion is not credible and should not be a required assumption for criticality evaluations during normal conditions of transportation. A demonstration testing program supporting a detailed analytical effort, together with updated risk assessment insights, can provide the basis for moderator exclusion during hypothetical accident conditions. This report also discusses how this engineered concept can support the goal of standardized transportation.
Transforming an EPA QA/R-2 quality management plan into an ISO 9002 quality management system.
Kell, R A; Hedin, C M; Kassakhian, G H; Reynolds, E S
2001-01-01
The Environmental Protection Agency's (EPA) Office of Emergency and Remedial Response (OERR) requires environmental data of known quality to support Superfund hazardous waste site projects. The Quality Assurance Technical Support (QATS) Program is operated by Shaw Environmental and Infrastructure, Inc. to provide EPA's Analytical Operations Center (AOC) with performance evaluation samples, reference materials, on-site laboratory auditing capabilities, data audits (including electronic media data audits), methods development, and other support services. The new QATS contract awarded in November 2000 required that the QATS Program become ISO 9000 certified. In a first for an EPA contractor, the QATS staff and management successfully transformed EPA's QA/R-2 type Quality Management Plan into a Quality Management System (QMS) that complies with the requirements of the internationally recognized ISO 9002 standard and achieved certification in the United States, Canada, and throughout Europe. The presentation describes how quality system elements of ISO 9002 were implemented on an already existing quality system. The psychological and organizational challenges of the culture change in QATS' day-to-day operations will be discussed for the benefit of other ISO 9000 aspirants.
Evaluation of support loss in micro-beam resonators: A revisit
NASA Astrophysics Data System (ADS)
Chen, S. Y.; Liu, J. Z.; Guo, F. L.
2017-12-01
This paper presents an analytical study on evaluation of support loss in micromechanical resonators undergoing in-plane flexural vibrations. Two-dimensional elastic wave theory is used to determine the energy transmission from the vibrating resonator to the support. Fourier transform and Green's function techniques are adopted to solve the problem of wave motions on the surface of the support excited by the forces transmitted by the resonator onto the support. Analytical expressions of support loss in terms of quality factor have been derived, taking into account the distributed normal stress and shear stress in the attachment region, the coupling between the normal stress and shear stress, and the material disparity between the support and the resonator. Effects of the geometry of micro-beam resonators and of material dissimilarity between support and resonator on support loss are examined. Numerical results show that a 'harder resonator' and 'softer support' combination leads to larger support loss. In addition, the Perfectly Matched Layer (PML) numerical simulation technique is employed for validation of the proposed analytical model. Compared with the quality factor results obtained by the PML technique, the present model agrees well, whereas the pure-shear model overestimates support loss noticeably, especially for resonators with small aspect ratio and large material dissimilarity between the support and resonator.
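For reference, the quality factor used above as the measure of support (anchor) loss follows the standard definition relating stored vibrational energy to the energy radiated into the substrate per cycle, and individual loss channels combine reciprocally:

```latex
Q_{\mathrm{support}} \;=\; 2\pi\,\frac{W_{\mathrm{stored}}}{\Delta W_{\mathrm{support}}},
\qquad
\frac{1}{Q_{\mathrm{total}}} \;=\; \sum_i \frac{1}{Q_i},
```

where \(W_{\mathrm{stored}}\) is the vibrational energy stored in the resonator and \(\Delta W_{\mathrm{support}}\) is the energy transmitted into the support per vibration cycle; the second relation combines support loss with any other dissipation channels (thermoelastic damping, air damping) that are present.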
Biosphere 2: a prototype project for a permanent and evolving life system for Mars base.
Nelson, M; Allen, J P; Dempster, W F
1992-01-01
As part of the ground-based preparation for creating long-term life systems needed for space habitation and settlement, Space Biospheres Ventures (SBV) is undertaking the Biosphere 2 project near Oracle, Arizona. Biosphere 2, currently under construction, is scheduled to commence its operations in 1991 with a two-year closure period with a crew of eight people. Biosphere 2 is a facility which will be essentially materially-closed to exchange with the outside environment. It is open to information and energy flow. Biosphere 2 is designed to achieve a complex life-support system by the integration of seven areas or "biomes"--rainforest, savannah, desert, marsh, ocean, intensive agriculture and human habitat. Unique bioregenerative technologies, such as soil bed reactors for air purification, aquatic waste processing systems, real-time analytic systems and complex computer monitoring and control systems are being developed for the Biosphere 2 project. Its operation should afford valuable insight into the functioning of complex life systems necessary for long-term habitation in space. It will serve as an experimental ground-based prototype and testbed for the stable, permanent life systems needed for human exploration of Mars.
Hatch, Joseph R.; Bullock, John H.; Finkelman, Robert B.
2006-01-01
In 1999, the USGS initiated the National Coal Quality Inventory (NaCQI) project to address a need for quality information on coals that will be mined during the next 20-30 years. At the time this project was initiated, the publicly available USGS coal quality data was based on samples primarily collected and analyzed between 1973 and 1985. The primary objective of NaCQI was to create a database containing comprehensive, accurate and accessible chemical information on the quality of mined and prepared United States coals and their combustion byproducts. This objective was to be accomplished through maintaining the existing publicly available coal quality database, expanding the database through the acquisition of new samples from priority areas, and analysis of the samples using updated coal analytical chemistry procedures. Priorities for sampling include those areas where future sources of compliance coal are federally owned. This project was a cooperative effort between the U.S. Geological Survey (USGS), State geological surveys, universities, coal burning utilities, and the coal mining industry. Funding support came from the Electric Power Research Institute (EPRI) and the U.S. Department of Energy (DOE).
YouTube as a platform for publishing clinical skills training videos.
Topps, David; Helmer, Joyce; Ellaway, Rachel
2013-02-01
The means to share educational materials have grown considerably over the years, especially with the multitude of Internet channels available to educators. This article describes an innovative use of YouTube as a publishing platform for clinical educational materials. The authors posted online a series of short videos for teaching clinical procedures, anticipating that they would be widely used. The project Web site attracted little traffic, so alternatives were considered and YouTube was selected for exploration as a publication channel. YouTube's analytics tools were used to assess uptake, and viewer comments were reviewed for specific feedback in support of evaluating and improving the materials posted. The uptake was much increased, with 1.75 million views logged in the first 33 months. Viewer feedback, although limited, proved useful. Although this approach improved uptake, it also relinquishes control over how materials are presented and how the analytics are generated. Open and anonymous access also limits relationships with end users. In summary, YouTube was found to provide many advantages over self-publication, particularly in terms of technical simplification, increased audience, discoverability, and analytics. In contrast to the transitory interest seen in most YouTube content, the channel has seen sustained popularity. YouTube's broadcast model diffused aspects of the relationship between educators and their learners, thereby limiting its use for more focused activities, such as continuing medical education.
World Energy Projection System Plus Model Documentation: Greenhouse Gases Module
2011-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Greenhouse Gases Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Natural Gas Module
2011-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Natural Gas Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
Hybrid FSAE Vehicle Realization
DOT National Transportation Integrated Search
2010-12-01
The goal of this multi-year project is to create a fully functional University of Idaho entry in the hybrid FSAE competition. Vehicle integration is underway as part of a variety of 2010-11 senior design projects. This leverages a variety of analytic...
World Energy Projection System Plus Model Documentation: District Heat Module
2017-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) District Heat Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
World Energy Projection System Plus Model Documentation: Industrial Module
2016-01-01
This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Industrial Model (WIM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.
Topological view of quantum tunneling coherent destruction
NASA Astrophysics Data System (ADS)
Bernardini, Alex E.; Chinaglia, Mariana
2017-08-01
Quantum tunneling of the ground and first excited states in a quantum superposition driven by a novel analytical configuration of a double-well (DW) potential is investigated. Symmetric and asymmetric potentials are considered so as to support analytical solutions for the quantum mechanical zero mode and first excited state. Reporting a symmetry breaking that supports the quantum conversion of a zero-mode stable vacuum into an unstable tachyonic quantum state, two inequivalent topological scenarios are supposed to drive stable tunneling and coherent destruction of tunneling, respectively. A complete prospect of the Wigner function dynamics, vector field fluxes and the time dependence of stagnation points is obtained for the analytical potentials that support stable and tachyonic modes.
Analytical Chemistry Developmental Work Using a 243Am Solution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Khalil J.; Stanley, Floyd E.; Porterfield, Donivan R.
2015-02-24
This project seeks to reestablish our analytical capability to characterize Am bulk material and develop a reference material suitable for characterizing the purity and assay of 241Am oxide for industrial use. The tasks associated with this phase of the project included conducting initial separations experiments, developing thermal ionization mass spectrometry capability using the 243Am isotope as an isotope dilution spike, optimizing the spike for the determination of 241Pu-241Am radiochemistry, and, additionally, developing and testing a methodology which can detect trace to ultra-trace levels of Pu (both assay and isotopics) in bulk Am samples.
Code of Federal Regulations, 2014 CFR
2014-01-01
...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...
Code of Federal Regulations, 2013 CFR
2013-07-01
...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...
Code of Federal Regulations, 2012 CFR
2012-01-01
...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...
Analytical Chemistry Division. Annual progress report for period ending December 31, 1980
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyon, W.S.
1981-05-01
This report is divided into: analytical methodology; mass and emission spectrometry; technical support; bio/organic analysis; nuclear and radiochemical analysis; quality assurance, safety, and tabulation of analyses; supplementary activities; and presentation of research results. Separate abstracts were prepared for the technical support, bio/organic analysis, and nuclear and radiochemical analysis. (DLC)
NASA Astrophysics Data System (ADS)
Hickey-Vargas, R.; Holbik, S. P.; Ryan, J. G.; MacDonald, J. H., Jr.; Beck, M.
2015-12-01
Geoscience faculty at the University of South Florida (USF), Florida Gulf Coast University (FGCU), Valencia College (VC) and Florida International University (FIU) have teamed to construct, test and disseminate geoscience curricula in which microbeam analytical instruments are operated by undergraduates, with data gathered in the classroom in real-time over the internet. Activities have been developed for courses Physical Geology, Oceanography, Earth Materials, Mineralogy/Petrology and Stratigraphy using the Scanning Electron Microscope (SEM) and Electron Probe Microanalyzer (EPMA) housed in the Florida Center for Analytical Electron Microscopy (FCAEM; https://fcaem.fiu.edu) at FIU. Students and faculty send research materials such as polished rock sections and microfossil mounts to FCAEM to be examined during their scheduled class and lab periods. Student control of both decision-making and selection of analytical targets is encouraged. The objective of these activities is to move students from passive learning to active, self-directed inquiry at an early stage in their undergraduate career, while providing access to advanced instruments that are not available at USF, FGCU and VC. These strategies strongly facilitate student interest in undergraduate research making use of these instruments, and one positive outcome to date is an increased number of students undertaking independent research projects. Prior research by USF PI Jeff Ryan indicated that various barriers related to instrument access and use hindered interested geoscience faculty in making use of these tools and strategies. In the current project, post-doctoral researcher Dr. Sven Holbik acts as a facilitator, working directly with faculty from other institutions one-on-one to provide initial training and support, including on-site visits to field-check classroom technology when needed. Several new educators and institutions will initiate classroom activities using FCAEM instrumentation this Fall.
Trnka, Radek; Lačev, Alek; Balcar, Karel; Kuška, Martin; Tavel, Peter
2016-01-01
The widely accepted two-dimensional circumplex model of emotions posits that most instances of human emotional experience can be understood within the two general dimensions of valence and activation. Currently, this model is facing some criticism, because complex emotions in particular are hard to define within only these two general dimensions. The present theory-driven study introduces an innovative analytical approach working in a way other than the conventional, two-dimensional paradigm. The main goal was to map and project semantic emotion space in terms of mutual positions of various emotion prototypical categories. Participants (N = 187; 54.5% females) judged 16 discrete emotions in terms of valence, intensity, controllability and utility. The results revealed that these four dimensional input measures were uncorrelated. This implies that valence, intensity, controllability and utility represented clearly different qualities of discrete emotions in the judgments of the participants. Based on this data, we constructed a 3D hypercube-projection and compared it with various two-dimensional projections. This contrasting enabled us to detect several sources of bias when working with the traditional, two-dimensional analytical approach. Contrasting two-dimensional and three-dimensional projections revealed that the 2D models provided biased insights about how emotions are conceptually related to one another along multiple dimensions. The results of the present study point out the reductionist nature of the two-dimensional paradigm in the psychological theory of emotions and challenge the widely accepted circumplex model. PMID:27148130
Second derivatives for approximate spin projection methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, Lee M.; Hratchian, Hrant P., E-mail: hhratchian@ucmerced.edu
2015-02-07
The use of broken-symmetry electronic structure methods is required in order to obtain correct behavior of electronically strained open-shell systems, such as transition states, biradicals, and transition metals. This approach often has issues with spin contamination, which can lead to significant errors in predicted energies, geometries, and properties. Approximate projection schemes are able to correct for spin contamination and can often yield improved results. To fully make use of these methods and to carry out exploration of the potential energy surface, it is desirable to develop an efficient second energy derivative theory. In this paper, we formulate the analytical second derivatives for the Yamaguchi approximate projection scheme, building on recent work that has yielded an efficient implementation of the analytical first derivatives.
The Multi-SAG project: filling the MultiDark simulations with semi-analytic galaxies
NASA Astrophysics Data System (ADS)
Vega-Martínez, C. A.; Cora, S. A.; Padilla, N. D.; Muñoz Arancibia, A. M.; Orsi, A. A.; Ruiz, A. N.
2016-08-01
The semi-analytical model sag is a galaxy formation and evolution code which is applied to halo catalogs and merger trees extracted from cosmological N-body simulations of dark matter. This contribution describes the project of constructing a catalog of simulated galaxies by adapting and applying the sag model to two publicly available dark matter simulations of the Spanish MultiDark Project. Those simulations are run in boxes with sizes of 1000 Mpc and 400 Mpc, respectively, with Planck cosmological parameters. They cover a large range of masses, and their halo mass resolutions are high enough that each simulation is able to produce more than 150 million simulated galaxies. A detailed description of the method is given, and the first statistical results are shown.
DOT National Transportation Integrated Search
2010-10-01
The Volvo-Ford-UMTRI project: Safety Impact Methodology (SIM) for Lane Departure Warning is part of the U.S. Department of Transportation's Advanced Crash Avoidance Technologies (ACAT) program. The project developed a basic analytical framework for e...
An Introduction to Project PRIME and CAMPUS MINNESOTA. Project PRIME Report, Number 2.
ERIC Educational Resources Information Center
Cordes, David C.
PRIME is an acronym for Planning Resources in Minnesota Education. The project's primary objective is to test the implementation of CAMPUS (Comprehensive Analytical Methods for Planning University Systems) in one State College, one Junior College, and in one school at the University of Minnesota. The CAMPUS model was developed by the Institute for…
The Challenge of Separating Effects of Simultaneous Education Projects on Student Achievement
ERIC Educational Resources Information Center
Ma, Xin; Ma, Lingling
2009-01-01
When multiple education projects operate in an overlapping or rear-ended manner, it is always a challenge to separate unique project effects on schooling outcomes. Our analysis represents a first attempt to address this challenge. A three-level hierarchical linear model (HLM) was presented as a general analytical framework to separate program…
Alcohol safety action projects evaluation of operations : data, table of results, and formulation
DOT National Transportation Integrated Search
1979-06-01
This volume contains the data used in the evaluation of 35 Alcohol Safety Action Projects implemented throughout the country. Historical background, discussion of analytic results and factors affecting impact detection are contained in the document ti...
A WPS Based Architecture for Climate Data Analytic Services (CDAS) at NASA
NASA Astrophysics Data System (ADS)
Maxwell, T. P.; McInerney, M.; Duffy, D.; Carriere, L.; Potter, G. L.; Doutriaux, C.
2015-12-01
Faced with unprecedented growth in the Big Data domain of climate science, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute trusted and tested analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using trusted climate data analysis tools (ESMF, CDAT, NCO, etc.). The framework is structured as a set of interacting modules allowing maximal flexibility in deployment choices. The current set of module managers includes: Staging Manager: Runs the computation locally on the WPS server or remotely using tools such as celery or SLURM. Compute Engine Manager: Runs the computation serially or distributed over nodes using a parallelization framework such as celery or spark. Decomposition Manager: Manages strategies for distributing the data over nodes. Data Manager: Handles the import of domain data from long term storage and manages the in-memory and disk-based caching architectures. Kernel Manager: A kernel is an encapsulated computational unit which executes a processor's compute task. Each kernel is implemented in python exploiting existing analysis packages (e.g. CDAT) and is compatible with all CDAS compute engines and decompositions. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be executed using either direct web service calls, a python script or application, or a javascript-based web application. Client packages in python or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits.
It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service permits decision makers to investigate climate changes around the globe, inspect model trends, compare multiple reanalysis datasets, and assess variability.
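The WPS-style access pattern described above can be illustrated with a small request-builder sketch. The operation identifier, dataset URI, and input field names below are hypothetical placeholders for illustration only, not the actual CDAS or ESGF Compute API.

```python
import json

def build_wps_request(operation, variable, domain):
    """Assemble a WPS-style Execute request payload.

    Field names under "datainputs" are illustrative assumptions; a real
    client would follow the service's published input schema.
    """
    return {
        "service": "WPS",
        "request": "Execute",
        "identifier": operation,  # hypothetical server-side kernel name
        "datainputs": json.dumps({
            "variable": variable,  # which dataset variable to process
            "domain": domain,      # spatiotemporal subset to operate on
        }),
    }

# Hypothetical example: average a temperature variable over a time range.
req = build_wps_request(
    operation="timeseries.average",
    variable={"uri": "collection://reanalysis-example", "name": "tas"},
    domain={"lat": [-90, 90], "lon": [0, 360], "time": ["1980-01", "2015-12"]},
)
print(req["request"])  # Execute
```

The point of the sketch is the shape of the interaction: the client ships a small declarative description of the computation to the server, and the computation runs next to the data rather than after a bulk download.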
NASA Technical Reports Server (NTRS)
Mcnulty, J. F.
1974-01-01
An analysis of the history and background of the Mars Project Viking is presented. The organization and functions of the engineering group responsible for the project are defined. The design and configuration of the proposed space vehicle are examined. Illustrations and tables of data are provided to complete the coverage of the project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
DUFTY J W
This is the final report for the project 'Correlations in Confined Quantum Plasmas', NSF-DOE Partnership Grant DE FG02 07ER54946, 8/1/2007 - 7/30/2010. The research was performed in collaboration with a group at Christian Albrechts University (CAU), Kiel, Germany. That collaboration, almost 15 years old, was formalized during the past four years under this NSF-DOE Partnership Grant to support graduate students at the two institutions and to facilitate frequent exchange visits. The research was focused on exploring the frontiers of charged particle physics evolving from new experimental access to unusual states associated with confinement. Particular attention was paid to combined effects of quantum mechanics and confinement. A suite of analytical and numerical tools tailored to the specific inquiry has been developed and employed.
Unified semiclassical approach to electronic transport from diffusive to ballistic regimes
NASA Astrophysics Data System (ADS)
Geng, Hao; Deng, Wei-Yin; Ren, Yue-Jiao; Sheng, Li; Xing, Ding-Yu
2016-09-01
We show that by integrating out the electric field and incorporating proper boundary conditions, a Boltzmann equation can describe electron transport properties, continuously from the diffusive to ballistic regimes. General analytical formulas of the conductance in D = 1,2,3 dimensions are obtained, which recover the Boltzmann-Drude formula and Landauer-Büttiker formula in the diffusive and ballistic limits, respectively. This intuitive and efficient approach can be applied to investigate the interplay of system size and impurity scattering in various charge and spin transport phenomena, when the quantum interference effect is not important. Project supported by the National Basic Research Program of China (Grant Nos. 2015CB921202 and 2014CB921103) and the National Natural Science Foundation of China (Grant No. 11225420).
Gamma Rays from Classical Novae
NASA Technical Reports Server (NTRS)
1997-01-01
NASA provided support, at the University of Chicago, for a program of theoretical research into the nature of the thermonuclear outbursts of classical novae and their implications for gamma-ray astronomy. In particular, problems which have been addressed include the role of convection in the earliest stages of the nova runaway, the influence of opacity on the characteristics of novae, and the nucleosynthesis expected to accompany nova outbursts on massive Oxygen-Neon-Magnesium (ONeMg) white dwarfs. In the following report, I will identify several critical projects on which considerable progress has been achieved and provide brief summaries of the results obtained: (1) two-dimensional simulation of the nova runaway; (2) nucleosynthesis modeling of novae; and (3) a quasi-analytic study of nucleosynthesis in ONeMg novae.
Homogenization theory for designing graded viscoelastic sonic crystals
NASA Astrophysics Data System (ADS)
Qu, Zhao-Liang; Ren, Chun-Yu; Pei, Yong-Mao; Fang, Dai-Ning
2015-02-01
In this paper, we propose a homogenization theory for designing graded viscoelastic sonic crystals (VSCs) which consist of periodic arrays of elastic scatterers embedded in a viscoelastic host material. We extend an elastic homogenization theory to VSC by using the elastic-viscoelastic correspondence principle and propose an analytical effective loss factor of VSC. The results of VSC and the equivalent structure calculated by using the finite element method are in good agreement. According to the relation of the effective loss factor to the filling fraction, a graded VSC plate is easily and quickly designed. Then, the graded VSC may have potential applications in the vibration absorption and noise reduction fields. Project supported by the National Basic Research Program of China (Grant No. 2011CB610301).
Addressing Climate Change in Long-Term Water Planning Using Robust Decisionmaking
NASA Astrophysics Data System (ADS)
Groves, D. G.; Lempert, R.
2008-12-01
Addressing climate change in long-term natural resource planning is difficult because future management conditions are deeply uncertain and the range of possible adaptation options is so extensive. These conditions pose challenges to standard optimization decision-support techniques. This talk will describe a methodology called Robust Decisionmaking (RDM) that can complement more traditional analytic approaches by utilizing screening-level water management models to evaluate large numbers of strategies against a wide range of plausible future scenarios. The presentation will describe a recent application of the methodology to evaluate climate adaptation strategies for the Inland Empire Utilities Agency in Southern California. This project found that RDM provides a useful way to address climate change uncertainty and to identify robust adaptation strategies.
Implementing the correlated fermi gas nuclear model for quasielastic neutrino-nucleus scattering
NASA Astrophysics Data System (ADS)
Tockstein, Jameson
2017-09-01
When studying neutrino oscillations, an understanding of charged current quasielastic (CCQE) neutrino-nucleus scattering is imperative. This interaction depends on a nuclear model as well as knowledge of form factors. Neutrino experiments, such as MiniBooNE, often use the Relativistic Fermi Gas (RFG) nuclear model. Recently, the Correlated Fermi Gas (CFG) nuclear model was suggested, based on inclusive and exclusive scattering experiments at JLab. We implement the CFG model for CCQE scattering. In particular, we provide analytic expressions for this implementation that can be used to analyze current and future neutrino CCQE data. This project was supported through the Wayne State University REU program under NSF Grant PHY-1460853 and by DOE Grant DE-SC0007983.
Analysis of renewable energy projects' implementation in Russia
NASA Astrophysics Data System (ADS)
Ratner, S. V.; Nizhegorodtsev, R. M.
2017-06-01
With the enactment in 2013 of a renewable energy scheme by contracting qualified power generation facilities working on renewable energy sources (RES), the process of construction and connection of such facilities to the Federal Grid Company has intensified in Russia. In 2013-2015, 93 projects of solar, wind, and small hydropower energy were selected on the basis of competitive bidding in the country with the purpose of subsequent support. Despite some technical and organizational problems and a time delay of some RES projects, in 2014-2015 five solar generating facilities with total capacity of 50 MW were commissioned, including 30 MW in Orenburg oblast. However, the proportion of successful projects is low and amounts to approximately 30% of the total number of announced projects. The purpose of this paper is to analyze the experience of implementation of renewable energy projects that passed through a competitive selection and gained the right to get a partial compensation for the construction and commissioning costs of RES generating facilities in the electric power wholesale market zone. The informational background for the study is corporate reports of project promoters, analytical and information materials of the Association NP Market Council, and legal documents for the development of renewable energy. The methodological base of the study is a theory of learning curves that assumes that cost savings in the production of high-tech products depends on the production growth rate (economy of scale) and gaining manufacturing experience (learning by doing). The study has identified factors that have a positive and a negative impact on the implementation of RES projects. Improvement of promotion measures in the renewable energy development in Russia corresponding to the current socio-economic situation is proposed.
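The learning-curve theory underpinning the study's methodology (each doubling of cumulative output multiplies unit cost by a fixed progress ratio, often called Wright's law) can be sketched numerically. The constants below are illustrative assumptions, not estimates from the Russian RES data.

```python
import math

def unit_cost(first_cost, cumulative_units, progress_ratio):
    """Wright's learning curve: each doubling of cumulative output
    multiplies unit cost by progress_ratio (e.g. 0.8 = a 20% reduction).

    cost(n) = cost(1) * n**(-b), with b = -log2(progress_ratio).
    """
    b = -math.log2(progress_ratio)  # learning exponent
    return first_cost * cumulative_units ** (-b)

# Illustrative numbers: at an 80% progress ratio, the 4th unit has seen
# two doublings of cumulative output, so cost falls by 0.8**2 = 0.64.
print(round(unit_cost(100.0, 4, 0.8), 2))  # 64.0
```

Fitting such a curve to observed project costs is one way to separate economy-of-scale effects from learning-by-doing effects, which is the distinction the study's methodological base draws.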
ERIC Educational Resources Information Center
Gao, Ruomei
2015-01-01
In a typical chemistry instrumentation laboratory, students learn analytical techniques through a well-developed procedure. Such an approach, however, does not engage students in a creative endeavor. To foster the intrinsic motivation of students' desire to learn, improve their confidence in self-directed learning activities and enhance their…
ERIC Educational Resources Information Center
Tomasik, Janice Hall; LeCaptain, Dale; Murphy, Sarah; Martin, Mary; Knight, Rachel M.; Harke, Maureen A.; Burke, Ryan; Beck, Kara; Acevedo-Polakovich, I. David
2014-01-01
Motivating students in analytical chemistry can be challenging, in part because of the complexity and breadth of topics involved. Some methods that help encourage students and convey real-world relevancy of the material include incorporating environmental issues, research-based lab experiments, and service learning projects. In this paper, we…
The HiWATE (Health Impacts of long-term exposure to disinfection by-products in drinking WATEr) project is the first systematic analysis that combines the epidemiology on adverse pregnancy outcomes with analytical chemistry and analytical biology in the European Union. This study...
DOT National Transportation Integrated Search
2012-03-01
This report introduces the design and implementation of a Web-based bridge information visual analytics system. This project integrates Internet, multiple databases, remote sensing, and other visualization technologies. The result combines a GIS ...
ERIC Educational Resources Information Center
Chen, Bodong
2015-01-01
In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…
Learning Analytics across a Statewide System
ERIC Educational Resources Information Center
Buyarski, Catherine; Murray, Jim; Torstrick, Rebecca
2017-01-01
This chapter explores lessons learned from two different learning analytics efforts at a large, public, multicampus university--one internally developed and one vended platform. It raises questions about how to best use analytics to support students while keeping students responsible for their own learning and success.
A multiscale filter for noise reduction of low-dose cone beam projections.
Yao, Weiguang; Farr, Jonathan B
2015-08-21
The Poisson or compound Poisson process governs the randomness of photon fluence in cone beam computed tomography (CBCT) imaging systems. The probability density function depends on the mean (noiseless) fluence at a given detector. This dependence indicates the natural requirement of multiscale filters to smooth noise while preserving structures of the imaged object on the low-dose cone beam projection. In this work, we used a Gaussian filter, exp(−x²/(2σ_f²)), as the multiscale filter to de-noise the low-dose cone beam projections. We analytically obtained the expression for σ_f, which represents the scale of the filter, by minimizing the local noise-to-signal ratio. We analytically derived the variance of the residual noise from the Poisson or compound Poisson processes after Gaussian filtering. From the derived analytical form of the variance of the residual noise, the optimal σ_f² is proved to be proportional to the noiseless fluence and modulated by local structure strength, expressed as the linear fitting error of the structure. A strategy was used to obtain a reliable linear fitting error: smoothing the projection along the longitudinal direction to calculate the linear fitting error along the lateral direction, and vice versa. The performance of our multiscale filter was examined on low-dose cone beam projections of a Catphan phantom and a head-and-neck patient. After applying the filter to the Catphan phantom projections scanned with a pulse time of 4 ms, the number of visible line pairs was similar to that scanned with 16 ms, and the contrast-to-noise ratio of the inserts was on average about 64% higher than that scanned with 16 ms. For the simulated head-and-neck patient projections with a pulse time of 4 ms, the visibility of soft tissue structures in the patient was comparable to that scanned with 20 ms. The image processing took less than 0.5 s per projection with 1024 × 768 pixels.
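The core idea, a Gaussian filter whose scale grows with local intensity (where Poisson noise is strong) and shrinks near structure (where a local linear fit is poor), can be sketched in one dimension. The constant k, the padding, and the fitting-error estimate below are simplified assumptions for illustration, not the paper's exact derivation.

```python
import math

def linfit_mse(xs, ys):
    """Mean squared error of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return sum((y - (my + slope * (x - mx))) ** 2
               for x, y in zip(xs, ys)) / n

def adaptive_gaussian_smooth(signal, k=0.5, eps=1e-6, radius=5):
    """Illustrative 1D adaptive filter: per-sample Gaussian smoothing with
    sigma^2 proportional to local intensity and inversely modulated by the
    local linear fitting error (a crude structure-strength measure)."""
    xs = list(range(-radius, radius + 1))
    padded = [signal[0]] * radius + list(signal) + [signal[-1]] * radius
    out = []
    for i in range(len(signal)):
        win = padded[i:i + 2 * radius + 1]
        err = linfit_mse(xs, win)                      # structure strength
        sigma2 = k * max(win[radius], 0.0) / (err + eps)
        sigma2 = min(max(sigma2, 1e-3), float(radius * radius))
        w = [math.exp(-x * x / (2.0 * sigma2)) for x in xs]
        out.append(sum(wi * yi for wi, yi in zip(w, win)) / sum(w))
    return out
```

On a flat, bright region the fitting error vanishes, sigma grows (strong smoothing); near an edge the fitting error is large, sigma shrinks and structure is preserved, which is the behavior the derivation above formalizes.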
An Analysis of Machine- and Human-Analytics in Classification.
Tam, Gary K L; Kothari, Vivek; Chen, Min
2017-01-01
In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.
Data-Driven Geospatial Visual Analytics for Real-Time Urban Flooding Decision Support
NASA Astrophysics Data System (ADS)
Liu, Y.; Hill, D.; Rodriguez, A.; Marini, L.; Kooper, R.; Myers, J.; Wu, X.; Minsker, B. S.
2009-12-01
Urban flooding is responsible for the loss of life and property as well as the release of pathogens and other pollutants into the environment. Previous studies have shown that the spatial distribution of intense rainfall significantly impacts the triggering and behavior of urban flooding. However, no general purpose tools yet exist for deriving rainfall data and rendering them in real time at the resolution of the hydrologic units used for analyzing urban flooding. This paper presents a new visual analytics system that derives and renders rainfall data from the NEXRAD weather radar system at the sewershed (i.e. urban hydrologic unit) scale in real time for a Chicago stormwater management project. We introduce a lightweight Web 2.0 approach which takes advantage of scientific workflow management and publishing capabilities developed at NCSA (National Center for Supercomputing Applications), a streaming data-aware semantic content management repository, web-based Google Earth/Map, and time-aware KML (Keyhole Markup Language). A collection of polygon-based virtual sensors is created from the NEXRAD Level II data using spatial, temporal, and thematic transformations at the sewershed level in order to produce persistent virtual rainfall data sources for the animation. An animated color-coded rainfall map of the sewershed can be played in real time as a movie, using time-aware KML inside the web browser-based Google Earth, for visually analyzing the spatiotemporal patterns of rainfall intensity in the sewershed. Such a system provides valuable information for situational awareness and improved decision support during extreme storm events in an urban area. Further work includes incorporating additional data (such as basement flooding event data) or physics-based predictive models for more integrated data-driven decision support.
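The time-aware KML rendering step described above can be sketched as a small generator: each virtual sensor polygon becomes a Placemark whose TimeSpan and fill color encode one rainfall interval. The TimeSpan, Polygon, and PolyStyle elements follow the KML schema, but the naming and the color-coding scheme here are illustrative, not the project's actual code.

```python
def rainfall_placemark(name, coords, when_begin, when_end, abgr_color):
    """Build a minimal time-aware KML Placemark for one sewershed polygon.

    coords: (lon, lat) pairs forming a closed ring.
    abgr_color: KML color string (alpha-blue-green-red hex).
    """
    ring = " ".join(f"{lon},{lat},0" for lon, lat in coords)
    return (
        "<Placemark>"
        f"<name>{name}</name>"
        # TimeSpan makes Google Earth's time slider animate this polygon.
        f"<TimeSpan><begin>{when_begin}</begin><end>{when_end}</end></TimeSpan>"
        f"<Style><PolyStyle><color>{abgr_color}</color></PolyStyle></Style>"
        "<Polygon><outerBoundaryIs><LinearRing>"
        f"<coordinates>{ring}</coordinates>"
        "</LinearRing></outerBoundaryIs></Polygon>"
        "</Placemark>"
    )

# Hypothetical sewershed and 5-minute rainfall interval.
kml = rainfall_placemark(
    "sewershed-42",
    [(-87.7, 41.8), (-87.6, 41.8), (-87.6, 41.9), (-87.7, 41.8)],
    "2009-09-12T14:00:00Z", "2009-09-12T14:05:00Z",
    "7f0000ff",  # semi-transparent red for high rainfall intensity
)
```

Concatenating one such Placemark per polygon per time step, inside a KML Document, yields the animated rainfall movie the browser-based Google Earth client plays.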
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fountain, Matthew S.; Fiskum, Sandra K.; Baldwin, David L.
This data package contains the K Basin sludge characterization results obtained by Pacific Northwest National Laboratory during processing and analysis of four sludge core samples collected from Engineered Container SCS-CON-210 in 2010 as requested by CH2M Hill Plateau Remediation Company. Sample processing requirements, analytes of interest, detection limits, and quality control sample requirements are defined in the KBC-33786, Rev. 2. The core processing scope included reconstitution of a sludge core sample distributed among four to six 4-L polypropylene bottles into a single container. The reconstituted core sample was then mixed and subsampled to support a variety of characterization activities. Additional core sludge subsamples were combined to prepare a container composite. The container composite was fractionated by wet sieving through a 2,000 micron mesh and a 500-micron mesh sieve. Each sieve fraction was sampled to support a suite of analyses. The core composite analysis scope included density determination, radioisotope analysis, and metals analysis, including the Waste Isolation Pilot Plant Hazardous Waste Facility Permit metals (with the exception of mercury). The container composite analysis included most of the core composite analysis scope plus particle size distribution, particle density, rheology, and crystalline phase identification. A summary of the received samples, core sample reconstitution and subsampling activities, container composite preparation and subsampling activities, physical properties, and analytical results are presented. Supporting data and documentation are provided in the appendices. There were no cases of sample or data loss and all of the available samples and data are reported as required by the Quality Assurance Project Plan/Sampling and Analysis Plan.
Upon the Shoulders of Giants: Open-Source Hardware and Software in Analytical Chemistry.
Dryden, Michael D M; Fobel, Ryan; Fobel, Christian; Wheeler, Aaron R
2017-04-18
Isaac Newton famously observed that "if I have seen further it is by standing on the shoulders of giants." We propose that this sentiment is a powerful motivation for the "open-source" movement in scientific research, in which creators provide everything needed to replicate a given project online, as well as providing explicit permission for users to use, improve, and share it with others. Here, we write to introduce analytical chemists who are new to the open-source movement to best practices and concepts in this area and to survey the state of open-source research in analytical chemistry. We conclude by considering two examples of open-source projects from our own research group, with the hope that a description of the process, motivations, and results will provide a convincing argument about the benefits that this movement brings to both creators and users.
Analysis of a virtual memory model for maintaining database views
NASA Technical Reports Server (NTRS)
Kinsley, Kathryn C.; Hughes, Charles E.
1992-01-01
This paper presents an analytical model for predicting the performance of a new support strategy for database views. This strategy, called the virtual method, is compared with traditional methods for supporting views. The analytical model's predictions of improved performance by the virtual method are then validated by comparing these results with those achieved in an experimental implementation.
Distribution factors for construction loads and girder capacity equations [project summary].
DOT National Transportation Integrated Search
2017-03-01
This project focused on the use of Florida I-beams (FIBs) in bridge construction. University of Florida researchers used analytical models and finite element analysis to update equations used in the design of bridges using FIBs. They were particularl...
Sampling and Analysis Plan - Guidance and Template v.4 - General Projects - 04/2014
This Sampling and Analysis Plan (SAP) guidance and template is intended to assist organizations in documenting the procedural and analytical requirements for one-time, or time-limited, projects involving the collection of water, soil, sediment, or other
Cryogenic Fluid Management Technology for Moon and Mars Missions
NASA Technical Reports Server (NTRS)
Doherty, Michael P.; Gaby, Joseph D.; Salerno, Louis J.; Sutherlin, Steven G.
2010-01-01
In support of the U.S. Space Exploration Policy, focused cryogenic fluid management technology efforts are underway within the National Aeronautics and Space Administration. Under the auspices of the Exploration Technology Development Program, cryogenic fluid management technology efforts are being conducted by the Cryogenic Fluid Management Project. Cryogenic Fluid Management Project objectives are to develop storage, transfer, and handling technologies for cryogens to support high performance demands of lunar, and ultimately, Mars missions in the application areas of propulsion, surface systems, and Earth-based ground operations. The targeted use of cryogens and cryogenic technologies for these application areas is anticipated to significantly reduce propellant launch mass and required on-orbit margins, to reduce and even eliminate storage tank boil-off losses for long term missions, to economize ground pad storage and transfer operations, and to expand operational and architectural operations at destination. This paper organizes Cryogenic Fluid Management Project technology efforts according to Exploration Architecture target areas, and discusses the scope of trade studies, analytical modeling, and test efforts presently underway, as well as future plans, to address those target areas. The target areas are: liquid methane/liquid oxygen for propelling the Altair Lander Ascent Stage, liquid hydrogen/liquid oxygen for propelling the Altair Lander Descent Stage and Ares V Earth Departure Stage, liquefaction, zero boil-off, and propellant scavenging for Lunar Surface Systems, cold helium and zero boil-off technologies for Earth-Based Ground Operations, and architecture definition studies for long term storage and on-orbit transfer and pressurization of LH2, cryogenic Mars landing and ascent vehicles, and cryogenic production via in situ resource utilization on Mars.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daley, P F
The overall objective of this project is the continued development, installation, and testing of continuous water sampling and analysis technologies for application to on-site monitoring of groundwater treatment systems and remediation sites. In a previous project, an on-line analytical system (OLAS) for multistream water sampling was installed at the Fort Ord Operable Unit 2 Groundwater Treatment System, with the objective of developing a simplified analytical method for detection of Compounds of Concern at that plant, and continuous sampling of up to twelve locations in the treatment system, from raw influent waters to treated effluent. Earlier implementations of the water sampling and processing system (Analytical Sampling and Analysis Platform, ASAP; A+RT, Milpitas, CA) depended on off-line integrators that produced paper plots of chromatograms and sent summary tables to a host computer for archiving. We developed a basic LabVIEW (National Instruments, Inc., Austin, TX) based gas chromatography control and data acquisition system that was the foundation for further development and integration with the ASAP system. Advantages of this integration include electronic archiving of all raw chromatographic data and a flexible programming environment to support development of improved ASAP operation and automated reporting. The initial goals of integrating the preexisting LabVIEW chromatography control system with the ASAP, and demonstration of a simplified, site-specific analytical method, were successfully achieved. However, although the principal objective of this system was assembly of an analytical system that would give plant operators an up-to-the-minute view of the plant's performance, several obstacles remained. Data reduction with the base LabVIEW system was limited to peak detection and simple tabular output, patterned after commercial chromatography integrators, with compound retention times and peak areas.
Preparation of calibration curves, method detection limit estimates and trend plotting were performed with spreadsheets and statistics software. Moreover, the analytical method developed was very limited in compound coverage, and unable to closely mirror the standard analytical methods promulgated by the EPA. To address these deficiencies, during this award the original equipment was operated at the OU 2-GTS to further evaluate the use of columns, commercial standard blends and other components to broaden the compound coverage of the chromatography system. A second-generation ASAP was designed and built to replace the original system at the OU 2-GTS, and include provision for introduction of internal standard compounds and surrogates into each sample analyzed. An enhanced, LabVIEW based chromatogram analysis application was written, that manages and archives chemical standards information, and provides a basis for NIST traceability for all analyses. Within this same package, all compound calibration response curves are managed, and different report formats were incorporated, that simplify trend analysis. Test results focus on operation of the original system at the OU 1 Integrated Chemical and Flow Monitoring System, at the OU 1 Fire Drill Area remediation site.
Isotope and Chemical Methods in Support of the U.S. Geological Survey Science Strategy, 2003-2008
Rye, R.O.; Johnson, C.A.; Landis, G.P.; Hofstra, A.H.; Emsbo, P.; Stricker, C.A.; Hunt, A.G.; Rusk, B.G.
2008-01-01
Principal functions of the Mineral Resources Program are providing information to decision-makers related to mineral deposits on federal lands and predicting the environmental consequences of the mining or natural weathering of those deposits. Performing these functions requires that predictions be made of the likelihood of undiscovered deposits. The predictions are based on geologic and geoenvironmental models that are constructed for the various types of mineral deposits from detailed descriptions of actual deposits and detailed understanding of the processes that formed them. Over the past three decades the understanding of ore-forming processes has benefitted greatly from the integration of laboratory-based geochemical tools with field observations and other data sources. Under the aegis of the Evolution of Ore Deposits and Technology Transfer Project (EODTTP), a five-year effort that terminated in 2008, the Mineral Resources Program provided state-of-the-art analytical capabilities to support applications of several related geochemical tools.
ESR modes in a Strong-Leg Ladder in the Tomonaga-Luttinger Liquid Phase
NASA Astrophysics Data System (ADS)
Zvyagin, S.; Ozerov, M.; Maksymenko, M.; Wosnitza, J.; Honecker, A.; Landee, C. P.; Turnbull, M.; Furuya, S. C.; Giamarchi, T.
Magnetic excitations in the strong-leg quantum spin ladder compound (C7H10N)2CuBr4 (known as DIMPY) in the field-induced Tomonaga-Luttinger spin liquid phase are studied by means of high-field electron spin resonance (ESR) spectroscopy. The presence of a gapped ESR mode with unusual non-linear frequency-field dependence is revealed experimentally. Using a combination of analytic and exact diagonalization methods, we compute the dynamical structure factor and identify this mode with longitudinal excitations in the antisymmetric channel. We argue that these excitations constitute a fingerprint of the spin dynamics in a strong-leg spin-1/2 Heisenberg antiferromagnetic ladder and owe their ESR observability to the uniform Dzyaloshinskii-Moriya interaction. This work was partially supported by the DFG and Helmholtz Gemeinschaft (Germany), Swiss SNF under Division II, and ERC synergy UQUAM project. We acknowledge the support of the HLD at HZDR, member of the European Magnetic Field Laboratory (EMFL).
Back analysis of geomechanical parameters in underground engineering using artificial bee colony.
Zhu, Changxing; Zhao, Hongbo; Zhao, Ming
2014-01-01
Accurate geomechanical parameters are critical in tunnel excavation, design, and support. In this paper, a displacement back analysis based on the artificial bee colony (ABC) algorithm is proposed to identify geomechanical parameters from monitored displacements. ABC is used as the global optimization algorithm to search for the unknown geomechanical parameters when the problem has an analytical solution. For problems without an analytical solution, optimization-based back analysis is time-consuming, so a least-squares support vector machine (LSSVM) is used to model the relationship between the unknown geomechanical parameters and displacements and to improve the efficiency of the back analysis. The proposed method was applied to a tunnel with an analytical solution and a tunnel without one. The results show the proposed method is feasible.
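As a rough sketch of the kind of displacement back analysis described, the following minimal artificial bee colony recovers a single rock-mass parameter from synthetic convergence data. The forward model (a Lamé-type closed-form radial displacement around a circular tunnel) and all parameter values are assumptions for illustration, not the paper's model, and the onlooker phase of standard ABC is collapsed into the employed-bee loop for brevity.

```python
import random

# Assumed closed-form forward model: u(r) = p0 * a^2 / (2 * G * r),
# with shear modulus G = E / (2 * (1 + nu)). Radius, Poisson's ratio,
# and in-situ stress are treated as known; E (MPa) is back-analyzed.
A_TUNNEL, NU, P0 = 3.0, 0.25, 5.0

def forward(E, r):
    G = E / (2.0 * (1.0 + NU))
    return P0 * A_TUNNEL ** 2 / (2.0 * G * r)

def misfit(params, obs):
    (E,) = params
    return sum((forward(E, r) - u) ** 2 for r, u in obs)

def abc_search(obs, bounds, n_food=20, iters=200, limit=20, seed=1):
    """Minimal artificial bee colony: greedy neighborhood search plus
    scout replacement of exhausted food sources."""
    rng = random.Random(seed)
    dim = len(bounds)
    def rand_food():
        return [rng.uniform(lo, hi) for lo, hi in bounds]
    foods = [rand_food() for _ in range(n_food)]
    fits = [misfit(f, obs) for f in foods]
    trials = [0] * n_food
    for _ in range(iters):
        for i in range(n_food):                        # employed-bee phase
            k = rng.choice([x for x in range(n_food) if x != i])
            j = rng.randrange(dim)
            cand = foods[i][:]
            cand[j] += rng.uniform(-1.0, 1.0) * (foods[i][j] - foods[k][j])
            cand[j] = min(max(cand[j], bounds[j][0]), bounds[j][1])
            f = misfit(cand, obs)
            if f < fits[i]:
                foods[i], fits[i], trials[i] = cand, f, 0
            else:
                trials[i] += 1
        for i in range(n_food):                        # scout phase
            if trials[i] > limit:
                foods[i], trials[i] = rand_food(), 0
                fits[i] = misfit(foods[i], obs)
    best = min(range(n_food), key=fits.__getitem__)
    return foods[best], fits[best]

# Synthetic "monitored" displacements generated with E = 2000 MPa
obs = [(r, forward(2000.0, r)) for r in (3.0, 4.0, 6.0, 10.0)]
(E_est,), err = abc_search(obs, bounds=[(500.0, 5000.0)])
```

The LSSVM surrogate in the paper would replace `forward` with a trained regression model when no closed-form solution exists, leaving the ABC search loop unchanged.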
Visual Analytics for the Food-Water-Energy Nexus in the Phoenix Active Management Area
NASA Astrophysics Data System (ADS)
Maciejewski, R.; Mascaro, G.; White, D. D.; Ruddell, B. L.; Aggarwal, R.; Sarjoughian, H.
2016-12-01
The Phoenix Active Management Area (AMA) is an administrative region of 14,500 km2 identified by the Arizona Department of Water Resources with the aim of reaching and maintaining safe yield (i.e., balance between the annual amounts of groundwater withdrawn and recharged) by 2025. The AMA includes the Phoenix metropolitan area, which has experienced dramatic population growth over the last decades with a progressive conversion of agricultural land into residential land. As a result of these changes, the water and energy demands as well as the food production in the region have evolved significantly over the last 30 years. Given the arid climate, a crucial role in supporting this growth has been played by the creation of a complex water supply system based on renewable and non-renewable resources, including the energy-intensive Central Arizona Project. In this talk, we present a preliminary characterization of the evolution in time of the feedbacks between food, water, and energy in the Phoenix AMA by analyzing secondary data (available from water and energy providers, irrigation districts, and municipalities), as well as satellite imagery and primary data collected by the authors. A preliminary visual analytics framework is also discussed, describing current design practices and ideas for exploring networked components and cascading impacts within the FEW Nexus. This analysis and framework represent the first steps towards the development of an integrated modeling, visualization, and decision support infrastructure for comprehensive FEW systems decision making at decision-relevant temporal and spatial scales.
Online Learner Engagement: Opportunities and Challenges with Using Data Analytics
ERIC Educational Resources Information Center
Bodily, Robert; Graham, Charles R.; Bush, Michael D.
2017-01-01
This article describes the crossroads between learning analytics and learner engagement. The authors do this by describing specific challenges of using analytics to support student engagement from three distinct perspectives: pedagogical considerations, technological issues, and interface design concerns. While engaging online learners presents a…
Features Students Really Expect from Learning Analytics
ERIC Educational Resources Information Center
Schumacher, Clara; Ifenthaler, Dirk
2016-01-01
In higher education settings more and more learning is facilitated through online learning environments. To support and understand students' learning processes better, learning analytics offers a promising approach. The purpose of this study was to investigate students' expectations toward features of learning analytics systems. In a first…
CZAEM USER'S GUIDE: MODELING CAPTURE ZONES OF GROUND-WATER WELLS USING ANALYTIC ELEMENTS
The computer program CZAEM is designed for elementary capture zone analysis and is based on the analytic element method. CZAEM is applicable to confined and/or unconfined flow in shallow aquifers; the Dupuit-Forchheimer assumption is adopted. CZAEM supports the following analyt...
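Elementary capture zone analysis of the kind CZAEM performs builds on a classical analytic result for a single fully penetrating well in uniform regional flow under the Dupuit-Forchheimer assumption. As a hedged illustration (the parameter values are invented, and this is not CZAEM code), the stagnation-point distance and the asymptotic capture-zone width can be computed directly:

```python
import math

def capture_zone(Q, q0, H):
    """Stagnation-point distance and asymptotic capture width for a single
    fully penetrating well in uniform regional flow (classical analytic
    result under the Dupuit-Forchheimer assumption).
    Q: pumping rate (m3/d), q0: regional specific discharge (m/d),
    H: saturated thickness (m)."""
    x_stag = Q / (2.0 * math.pi * q0 * H)  # stagnation point downgradient of well
    width = Q / (q0 * H)                   # full capture width far upgradient
    return x_stag, width

# Hypothetical values: Q = 500 m3/d, q0 = 0.05 m/d, H = 20 m
x_s, w_cap = capture_zone(500.0, 0.05, 20.0)
```

Analytic element codes generalize this by superposing many such elements (wells, line sinks, uniform flow) and tracing streamlines numerically rather than relying on the single-well closed form.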
Automated Predictive Big Data Analytics Using Ontology Based Semantics.
Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A
2015-10-01
Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.
Intersubjectivity and the creation of meaning in the analytic process.
Maier, Christian
2014-11-01
By means of a clinical illustration, the author describes how the intersubjective exchanges involved in an analytic process facilitate the representation of affects and memories which have been buried in the unconscious or indeed have never been available to consciousness. As a result of projective identificatory processes in the analytic relationship, in this example the analyst falls into a situation of helplessness which connects with his own traumatic experiences. He then enters a formal regression of the ego and responds with a so-to-speak hallucinatory reaction, an internal image which enables him to keep the analytic process on track and, later on, to construct an early traumatic experience of the analysand. © 2014, The Society of Analytical Psychology.
Technosocial Predictive Analytics in Support of Naturalistic Decision Making
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; Cowell, Andrew J.; Malone, Elizabeth L.
2009-06-23
A main challenge we face in fostering sustainable growth is to anticipate outcomes through predictive and proactive analysis across domains as diverse as energy, security, the environment, health, and finance in order to maximize opportunities, influence outcomes, and counter adversities. The goal of this paper is to present new methods for anticipatory analytical thinking which address this challenge through the development of a multi-perspective approach to predictive modeling as the core of a creative decision-making process. This approach is uniquely multidisciplinary in that it strives to create decision advantage through the integration of human and physical models, and leverages knowledge management and visual analytics to support creative thinking by facilitating the achievement of interoperable knowledge inputs and enhancing the user's cognitive access. We describe a prototype system which implements this approach and exemplify its functionality with reference to a use case in which predictive modeling is paired with analytic gaming to support collaborative decision-making in the domain of agricultural land management.
Strategic analytics: towards fully embedding evidence in healthcare decision-making.
Garay, Jason; Cartagena, Rosario; Esensoy, Ali Vahit; Handa, Kiren; Kane, Eli; Kaw, Neal; Sadat, Somayeh
2015-01-01
Cancer Care Ontario (CCO) has implemented multiple information technology solutions and collected health-system data to support its programs. There is now an opportunity to leverage these data and perform advanced end-to-end analytics that inform decisions around improving health-system performance. In 2014, CCO engaged in an extensive assessment of its current data capacity and capability, with the intent to drive increased use of data for evidence-based decision-making. The breadth and volume of data at CCO uniquely position the organization to contribute not only to system-wide operational reporting, but also to more advanced modelling of current and future state system management and planning. In 2012, CCO established a strategic analytics practice to help the agency's programs contextualize and inform key business decisions and to provide support through innovative predictive analytics solutions. This paper describes the organizational structure, services, and supporting operations that have enabled progress to date, and discusses the next steps towards the vision of embedding evidence fully into healthcare decision-making. Copyright © 2014 Longwoods Publishing.
Fission Power System Technology for NASA Exploration Missions
NASA Technical Reports Server (NTRS)
Mason, Lee; Houts, Michael
2011-01-01
Under the NASA Exploration Technology Development Program, and in partnership with the Department of Energy (DOE), NASA is conducting a project to mature Fission Power System (FPS) technology. A primary project goal is to develop viable system options to support future NASA mission needs for nuclear power. The main FPS project objectives are as follows: 1) Develop FPS concepts that meet expected NASA mission power requirements at reasonable cost with added benefits over other options. 2) Establish a hardware-based technical foundation for FPS design concepts and reduce overall development risk. 3) Reduce the cost uncertainties for FPS and establish greater credibility for flight system cost estimates. 4) Generate the key products to allow NASA decision makers to consider FPS as a preferred option for flight development. In order to achieve these goals, the FPS project has two main thrusts: concept definition and risk reduction. Under concept definition, NASA and DOE are performing trade studies, defining requirements, developing analytical tools, and formulating system concepts. A typical FPS consists of the reactor, shield, power conversion, heat rejection, and power management and distribution (PMAD). Studies are performed to identify the desired design parameters for each subsystem that allow the system to meet the requirements with reasonable cost and development risk. Risk reduction provides the means to evaluate technologies in a laboratory test environment. Non-nuclear hardware prototypes are built and tested to verify performance expectations, gain operating experience, and resolve design uncertainties.
Hinchcliff, Reece; Greenfield, David; Moldovan, Max; Pawsey, Marjorie; Mumford, Virginia; Westbrook, Johanna Irene; Braithwaite, Jeffrey
2012-01-01
Accreditation programmes aim to improve the quality and safety of health services, and have been widely implemented. However, there is conflicting evidence regarding the outcomes of existing programmes. The Accreditation Collaborative for the Conduct of Research, Evaluation and Designated Investigations through Teamwork-Current Accreditation Processes (ACCREDIT-CAP) project is designed to address key gaps in the literature by evaluating the current processes of three accreditation programmes used across Australian acute, primary and aged care services. The project comprises three mixed-method studies involving documentary analyses, surveys, focus groups and individual interviews. Study samples will comprise stakeholders from across the Australian healthcare system: accreditation agencies; federal and state government departments; consumer advocates; professional colleges and associations; and staff of acute, primary and aged care services. Sample sizes have been determined to ensure results allow robust conclusions. Qualitative information will be thematically analysed, supported by the use of textual grouping software. Quantitative data will be subjected to a variety of analytical procedures, including descriptive and comparative statistics. The results are designed to inform health system policy and planning decisions in Australia and internationally. The project has been approved by the University of New South Wales Human Research Ethics Committee (approval number HREC 10274). Results will be reported to partner organisations, healthcare consumers and other stakeholders via peer-reviewed publications, conference and seminar presentations, and a publicly accessible website.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, R.S.; Kong, E.J.; Bahner, M.A.
The paper discusses several projects to measure hydrocarbon emissions associated with the manufacture of fiberglass-reinforced plastics. The main purpose of the projects was to evaluate pollution prevention techniques to reduce emissions by altering raw materials, application equipment, and operator technique. Analytical techniques were developed to reduce the cost of these emission measurements. Emissions from a small test mold in a temporary total enclosure (TTE) correlated with emissions from full-size production molds in a separate TTE. Gravimetric mass balance measurements inside the TTE generally agreed to within +/-30% with total hydrocarbon (THC) measurements in the TTE exhaust duct.
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillespie, B.M.; Stromatt, R.W.; Ross, G.A.
This data package contains the results obtained by Pacific Northwest Laboratory (PNL) staff in the characterization of samples for the 101-SY Hydrogen Safety Project. The samples were submitted for analysis by Westinghouse Hanford Company (WHC) under the Technical Project Plan (TPP) 17667 and the Quality Assurance Plan MCS-027. They came from a core taken during Window "C" after the May 1991 gas release event. The analytical procedures required for analysis were defined in the Test Instructions (TI) prepared by the PNL 101-SY Analytical Chemistry Laboratory (ACL) Project Management Office in accordance with the TPP and the QA Plan. The requested analysis for these samples was volatile organic analysis. The quality control (QC) requirements for each sample are defined in the Test Instructions for each sample. The QC requirements outlined in the procedures and requested in the WHC statement of work were followed.
The Land Remediation and Pollution Control Division (LRPCD) QA Manager strives to assist LRPCD researchers in developing functional planning documents for their research projects. As part of the planning process, several pieces of information are needed, including information re...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kennedy, Robert T.
Sparked by the Human Genome Project, biological and biomedical research has become an information science. Information tools are now being generated for proteins, cell modeling, and genomics. The opportunity for analytical chemistry in this new environment is profound. New analytical techniques that can provide the information on genes, SNPs, proteins, protein modifications, cells, and cell chemistry are required. In this symposium, we brought together both informatics experts and leading analytical chemists to discuss this interface. Over 200 people attended this highly successful symposium.
Streaming Visual Analytics Workshop Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Kristin A.; Burtner, Edwin R.; Kritzstein, Brian P.
How can we best enable users to understand complex emerging events and make appropriate assessments from streaming data? This was the central question addressed at a three-day workshop on streaming visual analytics. This workshop was organized by Pacific Northwest National Laboratory for a government sponsor. It brought together forty researchers and subject matter experts from government, industry, and academia. This report summarizes the outcomes from that workshop. It describes elements of the vision for a streaming visual analytic environment and a set of important research directions needed to achieve this vision. Streaming data analysis is in many ways the analysis and understanding of change. However, current visual analytics systems usually focus on static data collections, meaning that dynamically changing conditions are not appropriately addressed. The envisioned mixed-initiative streaming visual analytics environment creates a collaboration between the analyst and the system to support the analysis process. It raises the level of discourse from low-level data records to higher-level concepts. The system supports the analyst's rapid orientation and reorientation as situations change. It provides an environment to support the analyst's critical thinking. It infers tasks and interests based on the analyst's interactions. The system works as both an assistant and a devil's advocate, finding relevant data and alerts as well as considering alternative hypotheses. Finally, the system supports sharing of findings with others. Making such an environment a reality requires research in several areas. The workshop discussions focused on four broad areas: support for critical thinking, visual representation of change, mixed-initiative analysis, and the use of narratives for analysis and communication.
ERIC Educational Resources Information Center
Scott, Patrick B., Ed.
1991-01-01
REDUC is a cooperative network of some 23 associated centers in 17 Latin American and Caribbean countries. The REDUC coordinating center is located in Santiago, Chile. REDUC produces a bibliographic database containing analytical summaries (approximately 800 items annually) of the most important research studies and project descriptions in the…
ERIC Educational Resources Information Center
Rouchouse, Marine; Faysse, Nicolas; De Romemont, Aurelle; Moumouni, Ismail; Faure, Guy
2015-01-01
Purpose: Approaches to build farmers' analytical capacities are said to trigger wide-ranging changes. This article reports on the communication process between participants and non-participants in one such approach, related to the technical and management skills learned by participants and the changes these participants subsequently made, and the…
Discourse-Centric Learning Analytics: Mapping the Terrain
ERIC Educational Resources Information Center
Knight, Simon; Littleton, Karen
2015-01-01
There is an increasing interest in developing learning analytic techniques for the analysis, and support of, high-quality learning discourse. This paper maps the terrain of discourse-centric learning analytics (DCLA), outlining the distinctive contribution of DCLA and outlining a definition for the field moving forwards. It is our claim that DCLA…
Technology Enhanced Analytics (TEA) in Higher Education
ERIC Educational Resources Information Center
Daniel, Ben Kei; Butson, Russell
2013-01-01
This paper examines the role of Big Data Analytics in addressing contemporary challenges associated with current changes in institutions of higher education. The paper first explores the potential of Big Data Analytics to support instructors, students and policy analysts to make better evidence based decisions. Secondly, the paper presents an…
Investigation of Using Analytics in Promoting Mobile Learning Support
ERIC Educational Resources Information Center
Visali, Videhi; Swami, Niraj
2013-01-01
Learning analytics can promote pedagogically informed use of learner data, which can steer the progress of technology mediated learning across several learning contexts. This paper presents the application of analytics to a mobile learning solution and demonstrates how a pedagogical sense was inferred from the data. Further, this inference was…
A Lecture Supporting System Based on Real-Time Learning Analytics
ERIC Educational Resources Information Center
Shimada, Atsushi; Konomi, Shin'ichi
2017-01-01
A new lecture supporting system based on real-time learning analytics is proposed. Our target is on-site classrooms where teachers give their lectures, and a lot of students listen to teachers' explanation, conduct exercises etc. We utilize not only an e-Learning system, but also an e-Book system to collect real-time learning activities during the…
1991-09-01
Table-of-contents fragments: III. The Analytic Hierarchy Process (A. Introduction; B. The AHP Process); Implementation of CERTS Using AHP (1. Consistency; 2. User Interface). The document incorporates the proposed technique into a Decision Support System: Expert Choice implements the Analytic Hierarchy Process (AHP), an approach to multi-criteria decision making.
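The core computation in AHP, deriving priority weights from a pairwise comparison matrix and checking judgment consistency, can be sketched as follows. The criteria and judgments are hypothetical, and Saaty's random-index values are hard-coded for small matrices; this is not the cited implementation.

```python
def ahp_priorities(M, iters=100):
    """Priority weights from a pairwise comparison matrix via power
    iteration (principal eigenvector), plus Saaty's consistency ratio."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    # Estimate lambda_max from M w = lambda w, then Saaty's CI and CR
    v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(v[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # random indices
    cr = ci / ri if ri else 0.0
    return w, cr

# Hypothetical 3-criterion comparison (e.g., Safety vs. Cost vs. Reliability):
# Safety is moderately preferred to Cost (3) and strongly to Reliability (5).
M = [[1.0,     3.0,     5.0],
     [1.0 / 3, 1.0,     2.0],
     [1.0 / 5, 1.0 / 2, 1.0]]
w, cr = ahp_priorities(M)
```

A consistency ratio below 0.1 is conventionally taken to mean the pairwise judgments are acceptably consistent; otherwise the decision maker is asked to revisit them.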
Kawamoto, Kensaku; Martin, Cary J; Williams, Kip; Tu, Ming-Chieh; Park, Charlton G; Hunter, Cheri; Staes, Catherine J; Bray, Bruce E; Deshmukh, Vikrant G; Holbrook, Reid A; Morris, Scott J; Fedderson, Matthew B; Sletta, Amy; Turnbull, James; Mulvihill, Sean J; Crabtree, Gordon L; Entwistle, David E; McKenna, Quinn L; Strong, Michael B; Pendleton, Robert C; Lee, Vivian S
2015-01-01
Objective: To develop expeditiously a pragmatic, modular, and extensible software framework for understanding and improving healthcare value (costs relative to outcomes). Materials and methods: In 2012, a multidisciplinary team was assembled by the leadership of the University of Utah Health Sciences Center and charged with rapidly developing a pragmatic and actionable analytics framework for understanding and enhancing healthcare value. Based on an analysis of relevant prior work, a value analytics framework known as Value Driven Outcomes (VDO) was developed using an agile methodology. Evaluation consisted of measurement against project objectives, including implementation timeliness, system performance, completeness, accuracy, extensibility, adoption, satisfaction, and the ability to support value improvement. Results: A modular, extensible framework was developed to allocate clinical care costs to individual patient encounters. For example, labor costs in a hospital unit are allocated to patients based on the hours they spent in the unit; actual medication acquisition costs are allocated to patients based on utilization; and radiology costs are allocated based on the minutes required for study performance. Relevant process and outcome measures are also available. A visualization layer facilitates the identification of value improvement opportunities, such as high-volume, high-cost case types with high variability in costs across providers. Initial implementation was completed within 6 months, and all project objectives were fulfilled. The framework has been improved iteratively and is now a foundational tool for delivering high-value care. Conclusions: The framework described can be expeditiously implemented to provide a pragmatic, modular, and extensible approach to understanding and improving healthcare value. PMID: 25324556
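The hours-based allocation rule described for unit labor costs can be sketched in a few lines; the encounter identifiers, hours, and dollar amounts below are hypothetical, and the VDO framework itself is far more elaborate.

```python
def allocate_unit_costs(total_cost, encounter_hours):
    """Allocate a cost pool to patient encounters in proportion to the
    hours each encounter spent in the unit (hours-based labor allocation)."""
    total_hours = sum(encounter_hours.values())
    return {enc: total_cost * h / total_hours
            for enc, h in encounter_hours.items()}

# Hypothetical: $120,000 of nursing labor for one unit across three encounters
hours = {"enc-001": 36.0, "enc-002": 12.0, "enc-003": 72.0}
costs = allocate_unit_costs(120_000.0, hours)
# each encounter's share is proportional to its hours in the unit
```

The same proportional pattern extends to the other drivers the abstract mentions: medication costs allocated by utilization, and radiology costs by minutes of study time.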
NASA Astrophysics Data System (ADS)
Viñas, Pilar; Navarro, Tania; Campillo, Natalia; Fenoll, Jose; Garrido, Isabel; Cava, Juana; Hernandez-Cordoba, Manuel
2017-04-01
Microextraction techniques allow sensitive measurements of pollutants to be carried out with instrumentation commonly available in the analytical laboratory. This communication reports our studies focused on the determination of pyrethroid insecticides in polluted soils. These chemicals are synthetic analogues of pyrethrum widely used for pest control in agricultural and household applications. Because of their properties, pyrethroids tend to adsorb strongly to soil particles and organic matter. Although they are considered pesticides of low toxicity for humans, long-term exposure to them may damage the immune and neurological systems. The procedure studied here is based on dispersive liquid-liquid microextraction (DLLME) and permits the determination of fifteen pyrethroid compounds (allethrin, resmethrin, tetramethrin, bifenthrin, fenpropathrin, cyhalothrin, acrinathrin, permethrin, λ-cyfluthrin, cypermethrin, flucythrinate, fenvalerate, esfenvalerate, τ-fluvalinate, and deltamethrin) in soil samples using gas chromatography with mass spectrometry (GC-MS). The analytes were first extracted from the soil samples (4 g) by treatment with 2 mL of acetonitrile, 2 mL of water, and 0.5 g of NaCl. The enriched organic phase (approximately 0.8 mL) was separated by centrifugation, and this solution was used as the dispersant in a DLLME process. The analytes did not need to be derivatized before injection into the chromatographic system, owing to their volatility and thermal stability. The different pyrethroids were identified on the basis of their retention times and mass spectra, considering the m/z values of the different fragments and their relative abundances. The detection limits were in the 0.2-23 ng g-1 range, depending on the analyte and the sample under analysis.
The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) and to the Spanish MINECO (Project CTQ2015-68049-R) for financial support.
Curating and Integrating Data from Multiple Sources to Support Healthcare Analytics.
Ng, Kenney; Kakkanatt, Chris; Benigno, Michael; Thompson, Clay; Jackson, Margaret; Cahan, Amos; Zhu, Xinxin; Zhang, Ping; Huang, Paul
2015-01-01
As the volume and variety of healthcare related data continues to grow, the analysis and use of this data will increasingly depend on the ability to appropriately collect, curate and integrate disparate data from many different sources. We describe our approach to and highlight our experiences with the development of a robust data collection, curation and integration infrastructure that supports healthcare analytics. This system has been successfully applied to the processing of a variety of data types including clinical data from electronic health records and observational studies, genomic data, microbiomic data, self-reported data from surveys and self-tracked data from wearable devices from over 600 subjects. The curated data is currently being used to support healthcare analytic applications such as data visualization, patient stratification and predictive modeling.
Instructional Implications of Inquiry in Reading Comprehension.
ERIC Educational Resources Information Center
Snow, David
A contract deliverable on the NIE Communication Skills Project, this report consists of three separate documents describing the instructional implications of the analytic and empirical work carried out for the "Classroom Instruction in Reading Comprehension" part of the project: (1) Guidelines for Phrasal Segmentation; (2) Parsing Tasks…
Philosophy Pursued through Empirical Research: Introduction to the Special Issue
ERIC Educational Resources Information Center
Wilson, Terri S.; Santoro, Doris A.
2015-01-01
Many scholars have pursued philosophical inquiry through empirical research. These empirical projects have been shaped--to varying degrees and in different ways--by philosophical questions, traditions, frameworks and analytic approaches. This issue explores the methodological challenges and opportunities involved in these kinds of projects. In…
Granitto, Matthew; Bailey, Elizabeth A.; Schmidt, Jeanine M.; Shew, Nora B.; Gamble, Bruce M.; Labay, Keith A.
2011-01-01
The Alaska Geochemical Database (AGDB) was created and designed to compile and integrate geochemical data from Alaska in order to facilitate geologic mapping, petrologic studies, mineral resource assessments, definition of geochemical baseline values and statistics, environmental impact assessments, and studies in medical geology. This Microsoft Access database serves as a data archive in support of present and future Alaskan geologic and geochemical projects, and contains data tables describing historical and new quantitative and qualitative geochemical analyses. The analytical results were determined by 85 laboratory and field analytical methods on 264,095 rock, sediment, soil, mineral and heavy-mineral concentrate samples. Most samples were collected by U.S. Geological Survey (USGS) personnel and analyzed in USGS laboratories or, under contracts, in commercial analytical laboratories. These data represent analyses of samples collected as part of various USGS programs and projects from 1962 to 2009. In addition, mineralogical data from 18,138 nonmagnetic heavy mineral concentrate samples are included in this database. The AGDB includes historical geochemical data originally archived in the USGS Rock Analysis Storage System (RASS) database, used from the mid-1960s through the late 1980s and the USGS PLUTO database used from the mid-1970s through the mid-1990s. All of these data are currently maintained in the Oracle-based National Geochemical Database (NGDB). Retrievals from the NGDB were used to generate most of the AGDB data set. These data were checked for accuracy regarding sample location, sample media type, and analytical methods used. This arduous process of reviewing, verifying and, where necessary, editing all USGS geochemical data resulted in a significantly improved Alaska geochemical dataset. 
USGS data that were not previously in the NGDB because the data predate the earliest USGS geochemical databases, or were once excluded for programmatic reasons, are included here in the AGDB and will be added to the NGDB. The AGDB data provided here are the most accurate and complete to date, and should be useful for a wide variety of geochemical studies. The AGDB data provided in the linked database may be updated or changed periodically. The data on the DVD and in the data downloads provided with this report are current as of date of publication.
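Retrievals like those used to build the AGDB can be pictured as simple queries over tables of analytical results keyed by sample, medium, and method. A hypothetical sketch (table layout, sample IDs, and values are invented, not AGDB fields):

```python
# Hypothetical sketch of the kind of retrieval a geochemical database
# supports: select analyses by sample medium and element, as one might
# for a baseline-value study. All data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE geochem (
    sample_id TEXT, medium TEXT, method TEXT, element TEXT, value_ppm REAL)""")
conn.executemany(
    "INSERT INTO geochem VALUES (?, ?, ?, ?, ?)",
    [("AK001", "rock", "ICP-MS", "Cu", 45.2),
     ("AK002", "soil", "ICP-MS", "Cu", 12.7),
     ("AK003", "rock", "AAS",    "Cu", 60.1)],
)

# Retrieve copper analyses on rock samples only.
rows = conn.execute(
    "SELECT sample_id, value_ppm FROM geochem "
    "WHERE medium = 'rock' AND element = 'Cu'").fetchall()
print(rows)
```

Recording the analytical method alongside each value, as above, is what makes the accuracy review described in the abstract (checking media types and methods) possible.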
Package-X 2.0: A Mathematica package for the analytic calculation of one-loop integrals
NASA Astrophysics Data System (ADS)
Patel, Hiren H.
2017-09-01
This article summarizes new features and enhancements of the first major update of Package-X. Package-X 2.0 can now generate analytic expressions for arbitrarily high-rank dimensionally regulated tensor integrals with up to four distinct propagators, each with arbitrary integer weight, near an arbitrary even number of spacetime dimensions, giving UV-divergent, IR-divergent, and finite parts at (almost) any real-valued kinematic point. Additionally, it can generate multivariable Taylor series expansions of these integrals around any non-singular kinematic point to arbitrary order. All special functions and abbreviations output by Package-X 2.0 support Mathematica's arbitrary-precision evaluation capabilities to deal with issues of numerical stability. Finally, the tensor algebraic routines of Package-X have been polished and extended to support open fermion chains both on and off shell. The documentation (equivalent to over 100 printed pages) is accessed through Mathematica's Wolfram Documentation Center and contains information on all Package-X symbols, with over 300 basic usage examples, 3 project-scale tutorials, and instructions on linking to FEYNCALC and LOOPTOOLS.
Program files doi: http://dx.doi.org/10.17632/yfkwrd4d5t.1
Licensing provisions: CC BY 4.0
Programming language: Mathematica (Wolfram Language)
Journal reference of previous version: H. H. Patel, Comput. Phys. Commun. 197, 276 (2015)
Does the new version supersede the previous version?: Yes
Summary of revisions: Extension to four-point one-loop integrals with higher powers of denominator factors, separate extraction of UV- and IR-divergent parts, testing for power IR divergences, construction of Taylor series expansions of one-loop integrals, numerical evaluation with arbitrary-precision arithmetic, manipulation of fermion chains, improved tensor algebraic routines, and much expanded documentation.
Nature of problem: Analytic calculation of one-loop integrals in relativistic quantum field theory.
Solution method: Passarino-Veltman reduction formula, Denner-Dittmaier reduction formulae, and additional algorithms described in the manuscript.
Restrictions: One-loop integrals are limited to those involving no more than four denominator factors.
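As an illustration of the Passarino-Veltman step named in the solution method, the simplest case is the rank-one two-point tensor integral, which reduces to the standard scalar functions A_0 and B_0 (shown in one common convention; signs and normalizations vary between references, and this is a textbook example rather than Package-X output):

```latex
B^{\mu}(p; m_0, m_1)
  = \int \frac{d^d k}{i\pi^{d/2}}
    \frac{k^{\mu}}{(k^2 - m_0^2)\,\bigl((k+p)^2 - m_1^2\bigr)}
  = p^{\mu} B_1 ,
\qquad
B_1 = \frac{1}{2p^2}
      \Bigl[ A_0(m_0) - A_0(m_1)
             - \bigl(p^2 + m_0^2 - m_1^2\bigr)\, B_0(p^2; m_0, m_1) \Bigr].
```

Contracting with p_mu and rewriting 2p·k in terms of the two denominators yields the bracketed combination; higher-rank and higher-point integrals reduce by the same strategy.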
ERIC Educational Resources Information Center
Cooper, Bruce S.
The purpose of this study is threefold: to recount the history of the Anacostia Community School Project (later renamed the Response to Educational Needs Project) in Washington, D.C. between 1967 and 1978; to analyze the events of the period in light of theories of historiographic and social scientific developments; and to provide lessons from the…
Roger D. Fight; R. James Barbour; Glenn Christensen; Guy L. Pinjuv; Rao V. Nagubadi
2004-01-01
This work was undertaken under a joint fire science project, "Assessing the need, costs, and potential benefits of prescribed fire and mechanical treatments to reduce fire hazard." This paper compares the future mix of timber products under two treatment scenarios for New Mexico. We developed and demonstrated an analytical method that uses readily available...
222-S Laboratory Quality Assurance Plan. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meznarich, H.K.
1995-07-31
This Quality Assurance Plan provides quality assurance (QA) guidance, regulatory QA requirements (e.g., 10 CFR 830.120), and quality control (QC) specifications for analytical service. This document follows the U.S. Department of Energy (DOE)-issued Hanford Analytical Services Quality Assurance Plan (HASQAP). In addition, this document meets the objectives of the Quality Assurance Program provided in WHC-CM-4-2, Section 2.1. Quality assurance elements required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans (QAMS-005) from the U.S. Environmental Protection Agency (EPA) are covered throughout this document. A quality assurance index is provided in Appendix A. This document also provides and/or identifies the procedural information that governs laboratory operations. The personnel of the 222-S Laboratory and the Standards Laboratory, including managers, analysts, QA/QC staff, auditors, and support staff, shall use this document as guidance and instructions for their operational and quality assurance activities. Other organizations that conduct activities described in this document for the 222-S Laboratory shall follow this QA/QC document.
Unique Education and Workforce Development for NASA Engineers
NASA Technical Reports Server (NTRS)
Forsgren, Roger C.; Miller, Lauren L.
2010-01-01
NASA engineers are some of the world's best-educated graduates, responsible for technically complex, highly significant scientific programs. Even though these professionals are highly proficient in traditional analytical competencies, there is a unique opportunity to offer continuing education that further enhances their overall scientific minds. With a goal of maintaining the Agency's passionate, "best in class" engineering workforce, the NASA Academy of Program/Project & Engineering Leadership (APPEL) provides educational resources encouraging foundational learning, professional development, and knowledge sharing. NASA APPEL is currently partnering with the scientific community's most respected subject matter experts to expand its engineering curriculum beyond the analytics and specialized subsystems in the areas of: understanding NASA's overall vision and its fundamental basis, and the Agency initiatives supporting them; sharing NASA's vast reservoir of engineering experience, wisdom, and lessons learned; and innovatively designing hardware for manufacturability, assembly, and servicing. It takes collaboration and innovation to educate an organization that possesses such a rich and important history and a future that is of great global interest. NASA APPEL strives to intellectually nurture the Agency's technical professionals, build its capacity for future performance, and exemplify its core values, to better enable NASA to meet its strategic vision and beyond.
NASA Astrophysics Data System (ADS)
Kerlin, Steven C.; Carlsen, William S.; Kelly, Gregory J.; Goehring, Elizabeth
2013-08-01
The conception of Global Learning Communities (GLCs) was researched to discover the potential benefits of online technologies that facilitate communication and scientific data sharing outside of the normal classroom setting. A total of 1,419 students in 635 student groups began the instructional unit, representing the classrooms of 33 teachers from the USA, 6 from Thailand, 7 from Australia, and 4 from Germany. Data from an international environmental education project were analyzed to describe grade 7-9 students' scientific writing in domestic US versus international-US classroom online partnerships. An argument analytic was developed, and a research model of exploratory data analysis followed by statistical testing was used, to discover and highlight different ways students used evidence to support their scientific claims about temperature variation at school sites and deep-sea hydrothermal vents. Findings show modest gains in the use of some evidentiary discourse components by US students in international online class partnerships compared to their US counterparts in domestic US partnerships. The analytic, research model, and online collaborative learning tools may be used in other large-scale studies and learning communities. Results provide insights about the benefits of using online technologies and promote the establishment of GLCs.
Liquid Oxygen/Liquid Methane Integrated Propulsion System Test Bed
NASA Technical Reports Server (NTRS)
Flynn, Howard; Lusby, Brian; Villemarette, Mark
2011-01-01
In support of NASA's Propulsion and Cryogenic Advanced Development (PCAD) project, a liquid oxygen (LO2)/liquid methane (LCH4) Integrated Propulsion System Test Bed (IPSTB) was designed and advanced to the Critical Design Review (CDR) stage at the Johnson Space Center. The IPSTB's primary objectives are to study LO2/LCH4 propulsion system steady-state and transient performance and operational characteristics, and to validate fluid and thermal models of a LO2/LCH4 propulsion system for use in future flight design work. Two-phase thermal and dynamic fluid flow models of the IPSTB were built to predict the system performance characteristics under a variety of operating modes and to aid in the overall system design work. While at ambient temperature and simulated altitude conditions at the White Sands Test Facility, the IPSTB and its approximately 600 channels of system instrumentation would be operated to perform a variety of integrated main engine and reaction control engine hot fire tests. The pressure, temperature, and flow rate data collected during this testing would then be used to validate the analytical models of the IPSTB's thermal and dynamic fluid flow performance. An overview of the IPSTB design and analytical model development will be presented.
Effects of Coulomb Coupling on the Stopping Power of Plasmas
NASA Astrophysics Data System (ADS)
Bernstein, David; Daligault, Jerome; Baalrud, Scott
2017-10-01
The stopping power of charged particles in plasma is important for a detailed understanding of particle and energy transport in plasmas, such as those found in fusion applications. Although stopping power is rather well understood for weakly coupled plasmas, this is less the case for strongly coupled plasmas. In order to shed light on the effects of strong Coulomb coupling, we have conducted detailed molecular dynamics simulations of the stopping power of a One-Component Plasma (OCP) across a wide range of conditions. The OCP allows first-principles computations that are not possible with more complex models, enabling rigorous tests of analytical theories. The molecular dynamics simulations were compared to two analytical theories that attempt to extend traditional weakly coupled theories into the strong-coupling regime. The first is based on the binary approximation, which accounts for strong coupling via an effective scattering cross section derived from the effective potential theory. The second is based on the dielectric function formulation with the inclusion of local field corrections. Work supported by LANL LDRD project 20150520ER and the Air Force Office of Scientific Research under Award Number FA9550-16-1-0221.
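For context, the weakly coupled baseline against which such strong-coupling theories are benchmarked is the classical Coulomb stopping power (a textbook form in Gaussian units, not a result of this study):

```latex
% Energy loss per unit path length of a test charge Ze moving with
% velocity v through electrons of number density n_e:
-\frac{dE}{dx} \;=\; \frac{4\pi\, n_e Z^2 e^4}{m_e v^2}\,\ln\Lambda
```

Here ln Λ is the Coulomb logarithm, which becomes ill-defined as the coupling strength grows; that breakdown is precisely what motivates the effective-potential and local-field-corrected treatments compared in the abstract.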