Science.gov

Sample records for analytical tool research

  1. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    ERIC Educational Resources Information Center

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  2. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis; develop, test, and execute models; and create visualizations. The `recordr` library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. Finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software…
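The capture-and-publish workflow described in this abstract can be sketched, in outline, as a minimal provenance record that links a script's inputs to its derived outputs by content hash. This is an illustrative Python sketch only, not the actual `recordr` or `matlab-dataone` API; every function and field name below is hypothetical.

```python
import hashlib
from datetime import datetime, timezone

# Hypothetical sketch of the kind of provenance trace such tools capture:
# for each execution, a timestamp plus content hashes of inputs and outputs,
# so derived products can be tied back to the exact data that produced them.

def provenance_record(script_name, inputs, outputs):
    """Build a minimal provenance trace linking inputs to derived outputs."""
    def digest(data):
        # Content-address each artifact so any change is detectable.
        return hashlib.sha256(data.encode("utf-8")).hexdigest()
    return {
        "script": script_name,
        "executed_at": datetime.now(timezone.utc).isoformat(),
        "inputs": {name: digest(content) for name, content in inputs.items()},
        "outputs": {name: digest(content) for name, content in outputs.items()},
    }

record = provenance_record(
    "analysis.R",
    inputs={"temps.csv": "site,temp\nA,12.3\n"},
    outputs={"summary.csv": "mean_temp\n12.3\n"},
)
```

In the real systems, records like this conform to the ProvONE/W3C PROV model and are published alongside the data to DataONE repositories; the sketch only shows the content-hashing idea.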

  3. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research.

    PubMed

    Ercius, Peter; Alaidi, Osama; Rames, Matthew J; Ren, Gang

    2015-10-14

Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both the physical and biological sciences for many decades; in TEM, 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been developed separately in the physical and biological sciences, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that this review will motivate novel solutions, based on current state-of-the-art techniques, for advanced applications in hybrid matter systems. PMID:26087941

  4. NASTRAN as an analytical research tool for composite mechanics and composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  5. PRE-QAPP AGREEMENT (PQA) AND ANALYTICAL METHOD CHECKLISTS (AMCS): TOOLS FOR PLANNING RESEARCH PROJECTS

    EPA Science Inventory

    The Land Remediation and Pollution Control Division (LRPCD) QA Manager strives to assist LRPCD researchers in developing functional planning documents for their research projects. As part of the planning process, several pieces of information are needed, including information re...

  6. Analytic tools for information warfare

    SciTech Connect

    Vandewart, R.L.; Craft, R.L.

    1996-05-01

Information warfare and system surety (tradeoffs between system functionality, security, safety, reliability, cost, usability) have many mechanisms in common. Sandia's experience has shown that an information system must be assessed from a system perspective in order to adequately identify and mitigate the risks present in the system. While some tools are available to help in this work, the process is largely manual. An integrated, extensible set of assessment tools would help the surety analyst. This paper describes one approach to surety assessment used at Sandia, identifies the difficulties in this process, and proposes a set of features desirable in an automated environment to support this process.

  7. Aptamers: molecular tools for analytical applications.

    PubMed

    Mairal, Teresa; Ozalp, Veli Cengiz; Lozano Sánchez, Pablo; Mir, Mònica; Katakis, Ioanis; O'Sullivan, Ciara K

    2008-02-01

Aptamers are artificial nucleic acid ligands, specifically generated against certain targets, such as amino acids, drugs, proteins or other molecules. In nature they exist as a nucleic acid-based genetic regulatory element called a riboswitch. Artificial ligands are isolated from combinatorial libraries of synthetic nucleic acids via an in vitro iterative process of adsorption, recovery and reamplification known as systematic evolution of ligands by exponential enrichment (SELEX). Thanks to their unique characteristics and chemical structure, aptamers offer themselves as ideal candidates for use in analytical devices and techniques. Recent progress in aptamer selection and the incorporation of aptamers into molecular beacon structures will ensure the application of aptamers in functional and quantitative proteomics and high-throughput screening for drug discovery, as well as in various analytical applications. The properties of aptamers, as well as recent developments in improved, time-efficient methods for their selection and stabilization, are outlined. The use of these powerful molecular tools for analysis and the advantages they offer over existing affinity biocomponents are discussed. Finally, the evolving use of aptamers in specific analytical applications such as chromatography, ELISA-type assays, biosensors and affinity PCR, as well as current avenues of research and future perspectives, conclude this review. PMID:17581746

  8. Guidance for the Design and Adoption of Analytic Tools.

    SciTech Connect

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  9. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated by the level of process or plant detail: 1) plant level; 2) process-group level; and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as companion documentation for use with the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to those in the U.S. have downloaded the BEST-Dairy tool from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and water…

  10. Using Visual Analytics Tool for Improving Data Comprehension

    ERIC Educational Resources Information Center

    Géryk, Jan

    2015-01-01

The efficacy of animated data visualizations in comparison with static data visualizations is still inconclusive. Some research suggests that the failure to find benefits of animation may relate to the way animations are constructed and perceived. In this paper, we present a visual analytics (VA) tool which makes use of enhanced animated…

  11. Analytical Web Tool for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year data set has proven invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health, and environmental companies, has shown interest in CERES derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that web-2.0 technologies can offer to both Quality Control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions that address these needs. Presented in an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms that display the distribution of a data field to identify outliers, useful for QC purposes; ii) an "anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can either identify potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, execution was done on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open source applications, the multitude of CERES products, and the need for a seamless transition from previous development. For the future, we plan on expanding the analytical capabilities of the…
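The "anomaly" map described in item ii) is, at its core, a per-grid-cell subtraction of the climatological monthly mean from the current month's field. A minimal illustrative sketch follows, with hypothetical flux values; this is not the CERES Ordering Tool's implementation.

```python
# Illustrative "anomaly" computation: current-month field minus the
# climatological monthly mean, per grid cell. Values are hypothetical
# TOA fluxes in W/m^2 on a tiny 2x2 latitude-longitude grid.

def monthly_anomaly(current, climatology):
    """Per-grid-cell difference: this month's field minus the long-term mean."""
    return [
        [cur - clim for cur, clim in zip(cur_row, clim_row)]
        for cur_row, clim_row in zip(current, climatology)
    ]

current = [[240.0, 242.0], [238.0, 245.0]]       # observed this month
climatology = [[239.0, 241.0], [240.0, 244.0]]   # long-term monthly mean
anomaly = monthly_anomaly(current, climatology)  # positive = above normal
```

Mapping these differences regionally highlights where the current month departs from its long-term behavior, which is what makes the anomaly view useful for both QC and trend spotting.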

  12. Cryogenic Propellant Feed System Analytical Tool Development

    NASA Technical Reports Server (NTRS)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its numerical performance and its ability to directly access the real fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front end was implemented to provide convenient portability of PFSAT among a wide variety of potential users, with a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.

  13. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  14. Measurement and Research Tools.

    ERIC Educational Resources Information Center

    1999

    This document contains four symposium papers on measurement and research tools. "Income Effects of Human Resource Development for Higher Educated Professionals" (Martin Mulder, Bob Witziers) reports on a study of 1,876 higher-educated professionals that found no correlation between participation in human resource development activities and…

  15. Measurement and Research Tools.

    ERIC Educational Resources Information Center

    1997

    This document contains four papers from a symposium on measurement and research tools for human resource development (HRD). "The 'Best Fit' Training: Measure Employee Learning Style Strengths" (Daniel L. Parry) discusses a study of the physiological aspect of sensory intake known as modality, more specifically, modality as measured by the…

  16. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  17. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-08-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirty-sixth month of development activities.

  18. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  19. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-09-01

Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its eleventh month of Phase 1.

  20. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2004-09-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its twenty-fourth month of development activities.

  21. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2004-02-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its eighteenth month of development activities.

  22. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-02-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirtieth month of development activities.

  23. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase 1.

  4. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  5. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  6. A Qualitative Evaluation of Evolution of a Learning Analytics Tool

    ERIC Educational Resources Information Center

    Ali, Liaqat; Hatala, Marek; Gasevic, Dragan; Jovanovic, Jelena

    2012-01-01

LOCO-Analyst is a learning analytics tool we developed to provide educators with feedback on students' learning activities and performance. Evaluation of the first version of the tool led to the enhancement of the tool's data visualization, user interface, and supported feedback types. The second evaluation of the improved tool allowed us to see…

  7. ANALYTICAL TOOLS FOR GROUNDWATER POLLUTION ASSESSMENT

    EPA Science Inventory

    This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of groundwater buffer strips. The indices describe the leaching of solutes below the root zone (mass f...

  8. Analytical Tools Interface for Landscape Assessments

    EPA Science Inventory

    Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to implemen...

  9. Analytical Tools for Space Suit Design

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay

    2011-01-01

As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing them with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information with a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  10. ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS (ATTILA)

    EPA Science Inventory

The Landscape Ecology Branch, in cooperation with U.S. EPA Region 4 and TVA, has developed a user-friendly interface to facilitate landscape assessments with Geographic Information Systems (GIS). GIS have become a powerful tool in the field of landscape ecology. A common application of GIS is the gener...

  11. Tool for Ranking Research Options

    NASA Technical Reports Server (NTRS)

    Ortiz, James N.; Scott, Kelly; Smith, Harold

    2005-01-01

Tool for Research Enhancement Decision Support (TREDS) is a computer program developed to assist managers in ranking options for research aboard the International Space Station (ISS). It could also be adapted to perform similar decision-support functions in industrial and academic settings. TREDS provides a ranking of the options based on a quantifiable assessment of all the relevant programmatic decision factors of benefit, cost, and risk. The computation of the benefit for each option is based on a figure of merit (FOM) for ISS research capacity that incorporates both quantitative and qualitative inputs. Qualitative inputs are gathered and partly quantified by use of the time-tested analytic hierarchy process, and used to set weighting factors in the FOM corresponding to priorities determined by the cognizant decision maker(s). Then, by use of algorithms developed specifically for this application, TREDS adjusts the projected benefit for each option on the basis of levels of technical implementation, cost, and schedule risk. Based partly on Excel spreadsheets, TREDS provides screens for entering cost, benefit, and risk information. Drop-down boxes are provided for entry of qualitative information. TREDS produces graphical output in multiple formats that can be tailored by users.
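The weighting step described above can be sketched in a few lines. The factor set, pairwise comparison values, and option scores below are illustrative assumptions rather than data from TREDS, and the column-average method is only a common approximation to the full AHP eigenvector calculation.

```python
# Hypothetical sketch of AHP-style weighting for ranking research options.
# Factor names, comparison values, and option scores are illustrative.

def ahp_weights(pairwise):
    """Approximate AHP priority weights by averaging normalized columns."""
    n = len(pairwise)
    col_sums = [sum(pairwise[r][c] for r in range(n)) for c in range(n)]
    normalized = [[pairwise[r][c] / col_sums[c] for c in range(n)] for r in range(n)]
    return [sum(row) / n for row in normalized]

def figure_of_merit(scores, weights):
    """Weighted benefit score for one research option."""
    return sum(s * w for s, w in zip(scores, weights))

# Pairwise comparisons among three decision factors (benefit, cost risk, schedule risk):
# entry [i][j] says how much more important factor i is than factor j.
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
w = ahp_weights(pairwise)

option_scores = {"Option A": [0.8, 0.6, 0.4], "Option B": [0.5, 0.9, 0.9]}
ranking = sorted(option_scores,
                 key=lambda k: figure_of_merit(option_scores[k], w),
                 reverse=True)
print(ranking)
```

The priorities sum to one, so the figure of merit stays on the same scale as the raw factor scores.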

  12. Empire: An Analytical Category for Educational Research

    ERIC Educational Resources Information Center

    Coloma, Roland Sintos

    2013-01-01

    In this article Roland Sintos Coloma argues for the relevance of empire as an analytical category in educational research. He points out the silence in mainstream studies of education on the subject of empire, the various interpretive approaches to deploying empire as an analytic, and the importance of indigeneity in research on empire and…

  13. Informetrics: Exploring Databases as Analytical Tools.

    ERIC Educational Resources Information Center

    Wormell, Irene

    1998-01-01

    Advanced online search facilities and information retrieval techniques have increased the potential of bibliometric research. Discusses three case studies carried out by the Centre for Informetric Studies at the Royal School of Library Science (Denmark) on the internationality of international journals, informetric analyses on the World Wide Web,…

  14. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  15. Analytical Tools for Cloudscope Ice Measurement

    NASA Technical Reports Server (NTRS)

    Arnott, W. Patrick

    1998-01-01

The cloudscope is a ground or aircraft instrument for viewing ice crystals impacted on a sapphire window. It is essentially a simple optical microscope with an attached compact CCD video camera whose output is recorded on a Hi-8 mm video cassette recorder equipped with digital time and date recording capability. In aircraft operation the window is at a stagnation point of the flow, so adiabatic compression heats the window to sublimate the ice crystals so that later impacting crystals can be imaged as well. A film heater is used for ground-based operation to provide sublimation, and it can also be used to provide extra heat for aircraft operation. The compact video camera can be focused manually by the operator, and a beam splitter and miniature bulb combination provides illumination for night operation. Several shutter speeds are available to accommodate daytime illumination conditions by direct sunlight. The video images can be directly used to qualitatively assess the crystal content of cirrus clouds and contrails. Quantitative size spectra are obtained with the tools described in this report. Selected portions of the video images are digitized using a PCI bus frame grabber to form a short movie segment or stack using NIH (National Institutes of Health) Image software with custom macros developed at DRI. The stack can be Fourier transform filtered with custom, easy-to-design filters to reduce most objectionable video artifacts. Particle quantification of each slice of the stack is performed using digital image analysis. Data recorded for each particle include particle number and centroid, frame number in the stack, particle area, perimeter, equivalent ellipse maximum and minimum radii, ellipse angle, and pixel number. Each valid particle in the stack is stamped with a unique number. This output can be used to obtain a semiquantitative appreciation of the crystal content. The particle information becomes the raw input for a subsequent program (FORTRAN) that
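The per-particle measurements listed above (particle number, centroid, area) are what standard connected-component analysis produces. The DRI macros themselves are not reproduced here; the following is a minimal illustrative sketch assuming a thresholded grayscale frame given as a list of pixel rows.

```python
# Illustrative sketch of per-particle quantification by flood-fill labeling.
# Threshold and frame values are hypothetical, not from the cloudscope pipeline.

def label_particles(image, threshold=128):
    """Label connected bright pixels; return number, area, centroid per particle."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    particles = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    # 4-connected neighborhood
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and image[ny][nx] >= threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                area = len(pixels)
                centroid = (sum(p[0] for p in pixels) / area,
                            sum(p[1] for p in pixels) / area)
                particles.append({"number": len(particles) + 1,
                                  "area": area, "centroid": centroid})
    return particles

frame = [
    [0, 200, 200, 0, 0],
    [0, 200, 200, 0, 0],
    [0, 0, 0, 0, 255],
]
print(label_particles(frame))
```

Perimeter, equivalent-ellipse radii, and the other quantities in the record follow from the same labeled pixel sets.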

  16. Analytical Ultrasonics in Materials Research and Testing

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1986-01-01

Research results in analytical ultrasonics for characterizing structural materials from metals and ceramics to composites are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.

  17. Analytical Modelling Of Milling For Tool Design And Selection

    SciTech Connect

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-05-17

This paper presents an efficient analytical model which allows a large panel of milling operations to be simulated. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.

  18. Promoting Efficacy Research on Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Maitland, Daniel W. M.; Gaynor, Scott T.

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a form of therapy grounded in behavioral principles that utilizes therapist reactions to shape target behavior. Despite a growing literature base, there is a paucity of research to establish the efficacy of FAP. As a general approach to psychotherapy, and how the therapeutic relationship produces change,…

  19. ANALYTICAL TOOL DEVELOPMENT FOR AFTERTREATMENT SUB-SYSTEMS INTEGRATION

    SciTech Connect

    Bolton, B; Fan, A; Goney, K; Pavlova-MacKinnon, Z; Sisken, K; Zhang, H

    2003-08-24

The stringent emissions standards of 2007 and beyond require complex engine, aftertreatment and vehicle systems with a high degree of sub-system interaction and flexible control solutions. This necessitates a system-based approach to technology development, in addition to individual sub-system optimization. Analytical tools can provide an effective means to evaluate and develop such complex technology interactions as well as understand phenomena that are either too expensive or impossible to study with conventional experimental means. The analytical effort can also guide experimental development and thus lead to efficient utilization of available experimental resources. A suite of analytical models has been developed to represent PM and NOx aftertreatment sub-systems. These models range from computationally inexpensive zero-dimensional models for real-time control applications to CFD-based, multi-dimensional models with detailed temporal and spatial resolution. Such models, in conjunction with well-established engine modeling tools such as engine cycle simulation, engine controls modeling, CFD models of non-combusting and combusting flow, and vehicle models, provide a comprehensive analytical toolbox for complete engine, aftertreatment and vehicle sub-systems development and system integration applications. However, the fidelity of aftertreatment models and their application going forward is limited by the lack of fundamental kinetic data.

  20. Visual-Analytics Tools for Analyzing Polymer Conformational Dynamics

    NASA Astrophysics Data System (ADS)

    Thakur, Sidharth; Tallury, Syamal; Pasquinelli, Melissa

    2010-03-01

    The goal of this work is to supplement existing methods for analyzing spatial-temporal dynamics of polymer conformations derived from molecular dynamics simulations by adapting standard visual-analytics tools. We intend to use these tools to quantify conformational dynamics and chemical characteristics at interfacial domains, and correlate this information to the macroscopic properties of a material. Our approach employs numerical measures of similarities and provides matrix- and graph-based representations of the similarity relationships for the polymer structures. We will discuss some numerical measures that encapsulate geometric and spatial attributes of polymer molecular configurations. These methods supply information on global and local relationships between polymer conformations, which can be used to inspect important characteristics of stable and persistent polymer conformations in specific environments. Initially, we have applied these tools to investigate the interface in polymer nanocomposites between a polymer matrix and carbon nanotube reinforcements and to correlate this information to the macroscopic properties of the material. The results indicate that our visual-analytic approach can be used to compare spatial dynamics of rigid and non-rigid polymers and properties of families of related polymers.
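A similarity matrix of the kind described above can be built from any pairwise distance between conformations. The abstract does not specify the numerical measure, so the sketch below assumes plain RMSD over matched atom coordinates, with toy conformations rather than simulation output.

```python
# Illustrative similarity-matrix construction for polymer conformations.
# RMSD as the distance measure and the coordinates are assumptions.

from math import sqrt

def rmsd(a, b):
    """Root-mean-square deviation between two equal-length 3D coordinate lists."""
    n = len(a)
    return sqrt(sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
                    for (ax, ay, az), (bx, by, bz) in zip(a, b)) / n)

# Three toy 3-bead conformations of the same chain.
conformations = [
    [(0, 0, 0), (1, 0, 0), (2, 0, 0)],
    [(0, 0, 0), (1, 0, 0), (2, 1, 0)],
    [(0, 0, 0), (1, 1, 0), (2, 2, 0)],
]
matrix = [[rmsd(a, b) for b in conformations] for a in conformations]
print(matrix[0][1])
```

The resulting symmetric matrix is the input that matrix- and graph-based visual-analytics views operate on.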

  1. Geographical Information Systems: A Tool for Institutional Research.

    ERIC Educational Resources Information Center

    Prather, James E.; Carlson, Christina E.

    This paper addresses the application of Geographical Information Systems (GIS), a computerized tool for associating key information by geographical location, to the institutional research function at institutions of higher education. The first section investigates the potential of GIS as an analytical and planning tool for institutional…

  2. A Tool for Medical Research

    NASA Technical Reports Server (NTRS)

    1992-01-01

California Measurements, Inc.'s PC-2 Aerosol Particle Analyzer, developed by William Chiang, a former Jet Propulsion Laboratory (JPL) engineer, was used in a study to measure the size of particles in the medical environment. Chiang has a NASA license for the JPL crystal oscillator technology and originally built the instrument for atmospheric research. In the operating room, it enabled researchers from the University of California to obtain multiple sets of data repeatedly and accurately. The study concluded that significant amounts of aerosols are generated during surgery when power tools are employed, and most of these are in the respirable size range. Almost all contain blood and are small enough to pass through surgical masks. Research on the presence of blood aerosols during oral surgery had similar results. Further studies are planned to determine the possibility of HIV transmission during surgery, and the PC-2H will be used to quantify blood aerosols.

  3. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    SciTech Connect

    Dawn K. Manley

    2003-04-01

Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and drew data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack.
The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating between
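The temporal side of the cluster-search idea above can be illustrated with a trailing-baseline anomaly test: a day is flagged when its case count sits far above the recent mean. The window length, threshold, and counts below are illustrative assumptions, not parameters from the project.

```python
# Minimal sketch of temporal outbreak flagging via a trailing-baseline z-score.
# Window size, threshold, and the count series are hypothetical.

from statistics import mean, stdev

def flag_outbreak_days(daily_counts, baseline_window=7, z_threshold=3.0):
    """Flag days whose count exceeds the trailing baseline by z_threshold sigmas."""
    flagged = []
    for day in range(baseline_window, len(daily_counts)):
        baseline = daily_counts[day - baseline_window:day]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (daily_counts[day] - mu) / sigma > z_threshold:
            flagged.append(day)
    return flagged

counts = [4, 5, 6, 5, 4, 6, 5, 5, 4, 30, 5]
print(flag_outbreak_days(counts))
```

A full space-time scan additionally searches over spatial regions, scoring each candidate cylinder of space and time rather than a single series.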

  4. Ultrafast 2D NMR: an emerging tool in analytical spectroscopy.

    PubMed

    Giraudeau, Patrick; Frydman, Lucio

    2014-01-01

    Two-dimensional nuclear magnetic resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan nature. A decade ago, a so-called ultrafast (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or heteronuclear correlation, in a single scan. During the intervening years, the performance of this subsecond 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool experiencing an expanded scope of applications. This review summarizes the principles and main developments that have contributed to the success of this approach and focuses on applications that have been recently demonstrated in various areas of analytical chemistry--from the real-time monitoring of chemical and biochemical processes, to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  5. Ultrafast 2D NMR: An Emerging Tool in Analytical Spectroscopy

    NASA Astrophysics Data System (ADS)

    Giraudeau, Patrick; Frydman, Lucio

    2014-06-01

    Two-dimensional nuclear magnetic resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan nature. A decade ago, a so-called ultrafast (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or heteronuclear correlation, in a single scan. During the intervening years, the performance of this subsecond 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool experiencing an expanded scope of applications. This review summarizes the principles and main developments that have contributed to the success of this approach and focuses on applications that have been recently demonstrated in various areas of analytical chemistry—from the real-time monitoring of chemical and biochemical processes, to extensions in hyphenated techniques and in quantitative applications.

  6. Experimental and analytical ion thruster research

    NASA Technical Reports Server (NTRS)

    Ruyten, Wilhelmus M.; Friedly, V. J.; Peng, Xiaohang; Keefer, Dennis

    1993-01-01

The results of further spectroscopic studies of the plume from a 3 cm ion source operated on argon propellant are reported. In particular, it is shown that it should be possible to use the spectroscopic technique to measure the plasma density of the ion plume close to the grids, where it is difficult to use electrical probe measurements. We outline how the technique, along with electrical probe measurements in the far downstream region of the plume, can be used to characterize the operation of a three-grid, 15 cm diameter thruster from NASA JPL. Pumping speed measurements on the Vacuum Research Facility have shown that this facility should be adequate for testing the JPL thruster at pressures in the low 10(exp -5) Torr range. Finally, we describe a simple analytical model which can be used to calculate the grid impingement current that results from charge-exchange collisions in the ion plume.

  7. Network Analytical Tool for Monitoring Global Food Safety Highlights China

    PubMed Central

    Nepusz, Tamás; Petróczi, Andrea; Naughton, Declan P.

    2009-01-01

Background The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles, allowing poor opportunities for integration. Methodology/Principal Findings We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 and August 2008 were processed using network analysis to i) capture complexity, ii) analyze trends, and iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships are readily identifiable between countries, which are ranked using i) Google's PageRank algorithm and ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. Conclusions/Significance This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios. PMID:19688088
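The ranking step named above is standard PageRank applied to a directed graph of alert reports, with an edge from each detector country to the transgressor it reported. The sketch below implements PageRank by power iteration on a toy edge list; the edges are illustrative, not RASFF data.

```python
# PageRank by power iteration over a detector -> transgressor alert graph.
# The edge list is a toy example, not Rapid Alert System data.

def pagerank(edges, damping=0.85, iterations=100):
    """Return a dict of PageRank scores for all nodes appearing in edges."""
    nodes = sorted({n for e in edges for n in e})
    out_links = {n: [dst for src, dst in edges if src == n] for n in nodes}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for src in nodes:
            targets = out_links[src] or nodes  # dangling nodes spread rank evenly
            share = damping * rank[src] / len(targets)
            for dst in targets:
                new[dst] += share
        rank = new
    return rank

alerts = [("Germany", "China"), ("UK", "China"), ("Italy", "Turkey"),
          ("Germany", "Turkey"), ("France", "China")]
scores = pagerank(alerts)
top = max(scores, key=scores.get)
print(top)
```

Countries that accumulate many reports rank highest as transgressors; reversing the edges would rank detectors instead, analogous to the hub/authority split in Kleinberg's HITS.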

  8. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example, the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  9. Scalable Combinatorial Tools for Health Disparities Research

    PubMed Central

    Langston, Michael A.; Levine, Robert S.; Kilbourne, Barbara J.; Rogers, Gary L.; Kershenbaum, Anne D.; Baktash, Suzanne H.; Coughlin, Steven S.; Saxton, Arnold M.; Agboto, Vincent K.; Hood, Darryl B.; Litchveld, Maureen Y.; Oyana, Tonny J.; Matthews-Juarez, Patricia; Juarez, Paul D.

    2014-01-01

    Despite staggering investments made in unraveling the human genome, current estimates suggest that as much as 90% of the variance in cancer and chronic diseases can be attributed to factors outside an individual’s genetic endowment, particularly to environmental exposures experienced across his or her life course. New analytical approaches are clearly required as investigators turn to complicated systems theory and ecological, place-based and life-history perspectives in order to understand more clearly the relationships between social determinants, environmental exposures and health disparities. While traditional data analysis techniques remain foundational to health disparities research, they are easily overwhelmed by the ever-increasing size and heterogeneity of available data needed to illuminate latent gene x environment interactions. This has prompted the adaptation and application of scalable combinatorial methods, many from genome science research, to the study of population health. Most of these powerful tools are algorithmically sophisticated, highly automated and mathematically abstract. Their utility motivates the main theme of this paper, which is to describe real applications of innovative transdisciplinary models and analyses in an effort to help move the research community closer toward identifying the causal mechanisms and associated environmental contexts underlying health disparities. The public health exposome is used as a contemporary focus for addressing the complex nature of this subject. PMID:25310540

  10. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are available for immediate use; however, for our research purposes it is more suitable to have a tool that can be customized and further enhanced. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can identify the most critical path and quantify the probability of system effectiveness as a performance measure.
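The paper's own network model is not reproduced here, but the core idea of a "most critical path" can be sketched with a standard shortest-path computation: give each path segment a detection probability and find the route that maximizes the adversary's chance of evading detection (Dijkstra over -log(1 - p_detect) weights). The graph format, node names, and probabilities below are all hypothetical:

```python
import heapq
import math

def most_critical_path(graph, start, target):
    """Return the adversary path most likely to evade detection, plus the
    probability that the protection system interrupts that path.
    graph: {node: [(next_node, p_detect), ...]} (illustrative format)."""
    dist, prev = {start: 0.0}, {}
    heap, done = [(0.0, start)], set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in done:
            continue
        done.add(node)
        if node == target:
            break
        for nxt, p_detect in graph.get(node, []):
            nd = d - math.log(1.0 - p_detect)  # lower weight = easier to evade
            if nd < dist.get(nxt, math.inf):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(heap, (nd, nxt))
    path = [target]
    while path[-1] != start:
        path.append(prev[path[-1]])
    path.reverse()
    p_evade = math.exp(-dist[target])  # product of (1 - p_detect) along path
    return path, 1.0 - p_evade

# Toy facility layout: each edge carries a detection probability
site = {'fence': [('door', 0.9), ('wall', 0.5)],
        'door':  [('vault', 0.8)],
        'wall':  [('vault', 0.8)]}
path, p_interrupt = most_critical_path(site, 'fence', 'vault')
```

Real tools also fold in delay times and response-force timing; this sketch covers only the detection-probability component of such an analysis.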

  11. MHK Research, Tools, and Methods

    SciTech Connect

    Jepsen, Richard

    2011-11-02

    Presentation from the 2011 Water Peer Review in which principal investigator discusses improved testing, analysis, and design tools needed to more accurately model operational conditions, to optimize design parameters, and predict technology viability.

  12. MATRIICES - Mass Analytical Tool for Reactions in Interstellar ICES

    NASA Astrophysics Data System (ADS)

    Isokoski, K.; Bossa, J. B.; Linnartz, H.

    2011-05-01

    The formation of complex organic molecules (COMs) observed in the inter- and circumstellar medium (ISCM) is driven by a complex chemical network yet to be fully characterized. Interstellar dust grains and the surrounding ice mantles, subject to atom bombardment, UV irradiation, and thermal processing, are believed to provide catalytic sites for such chemistry. However, the solid state chemical processes and the level of complexity reachable under astronomical conditions remain poorly understood. The conventional laboratory techniques used to characterize the solid state reaction pathways - RAIRS (Reflection Absorption IR Spectroscopy) and TPD (Temperature-Programmed Desorption) - are suitable for the analysis of reactions in ices made of relatively small molecules. For more complex ices comprising a series of different components as relevant to the interstellar medium, spectral overlapping prohibits unambiguous identification of reaction schemes, and these techniques start to fail. Therefore, we have constructed a new and innovative experimental setup for the study of complex interstellar ices, featuring a highly sensitive and unambiguous detection method. MATRIICES (Mass Analytical Tool for Reactions in Interstellar ICES) combines laser ablation with a molecular beam experiment and time-of-flight mass spectrometry (LA-TOF-MS) to sample and analyze ice analogues in situ, at native temperatures, under clean ultra-high vacuum conditions. The method allows direct sampling and analysis of the ice constituents in real time, by using a pulsed UV ablation laser (355-nm Nd:YAG) to vaporize the products in a MALDI-TOF-like detection scheme. The ablated material is caught in a synchronously pulsed molecular beam of inert carrier gas (He) from a supersonic valve, and analysed in a reflectron time-of-flight mass spectrometer. The detection limit of the method is expected to substantially exceed that of the regular surface techniques.
The ultimate goal is to fully

  13. Observatory Bibliographies as Research Tools

    NASA Astrophysics Data System (ADS)

    Rots, Arnold H.; Winkelman, S. L.

    2013-01-01

    Traditionally, observatory bibliographies were maintained to provide insight into how successful an observatory is, as measured by its prominence in the (refereed) literature. When we set up the bibliographic database for the Chandra X-ray Observatory (http://cxc.harvard.edu/cgi-gen/cda/bibliography) as part of the Chandra Data Archive (http://cxc.harvard.edu/cda/), very early in the mission, our objective was to make it primarily a useful tool for our user community. To achieve this we are: (1) casting a very wide net in collecting Chandra-related publications; (2) including for each literature reference in the database a wealth of metadata that is useful for the users; and (3) providing specific links between the articles and the datasets in the archive that they use. As a result our users are able to browse the literature and the data archive simultaneously. As an added bonus, the rich metadata content and data links have also allowed us to assemble more meaningful statistics about the scientific efficacy of the observatory. In all this we collaborate closely with the Astrophysics Data System (ADS). Among the plans for future enhancement are the inclusion of press releases and the Chandra image gallery, linking with ADS semantic searching tools, full-text metadata mining, and linking with other observatories' bibliographies. This work is supported by NASA contract NAS8-03060 (CXC) and depends critically on the services provided by the ADS.

  14. Individual Development and Latent Groups: Analytical Tools for Interpreting Heterogeneity

    ERIC Educational Resources Information Center

    Thomas, H.; Dahlin, M.P.

    2005-01-01

    Individual differences in development or growth are typically handled under conventional analytical approaches by blocking on the variables thought to contribute to variation, such as sex or age. But such approaches fail when the differences are attributable to latent characteristics (i.e., variables not directly observable beforehand) within the…

  15. Analytical tools for characterizing biopharmaceuticals and the implications for biosimilars

    PubMed Central

    Berkowitz, Steven A.; Engen, John R.; Mazzeo, Jeffrey R.; Jones, Graham B.

    2013-01-01

    Biologics such as monoclonal antibodies are much more complex than small-molecule drugs, which raises challenging questions for the development and regulatory evaluation of follow-on versions of such biopharmaceutical products (also known as biosimilars) and their clinical use once patent protection for the pioneering biologic has expired. With the recent introduction of regulatory pathways for follow-on versions of complex biologics, the role of analytical technologies in comparing biosimilars with the corresponding reference product is attracting substantial interest in establishing the development requirements for biosimilars. Here, we discuss the current state of the art in analytical technologies to assess three characteristics of protein biopharmaceuticals that regulatory authorities have identified as being important in development strategies for biosimilars: post-translational modifications, three-dimensional structures and protein aggregation. PMID:22743980

  16. Research as an educational tool

    SciTech Connect

    Neff, R.; Perlmutter, D.; Klaczynski, P.

    1994-12-31

    Our students have participated in original group research projects focused on the natural environment, which culminate in a written manuscript published in-house and an oral presentation to peers, faculty, and the university community. Our goal has been to develop their critical thinking skills so that they will be more successful in high school and college. We have served ninety-three students (47.1% White, 44.1% Black, 5.4% Hispanic, 2.2% American Indian, 1.2% Asian) from an eight-state region in the southeast over the past three years. Thirty-one students have graduated from high school, with over 70% enrolled in college, and another thirty-four are seniors this year. We are tracking students' progress in college and are developing our own critical thinking test to measure the impact of our program. Although preliminary, the results from the critical thinking test indicated that students are often prone to logical errors; however, higher levels of critical thinking were observed on items which raised issues that conflicted with students' pre-existing beliefs.

  17. Single cell analytic tools for drug discovery and development

    PubMed Central

    Heath, James R.; Ribas, Antoni; Mischel, Paul S.

    2016-01-01

    The genetic, functional, or compositional heterogeneity of healthy and diseased tissues presents major challenges in drug discovery and development.1-3 In cancers, heterogeneity may be essential for tumor stability,4 but its precise role in tumor biology is poorly resolved. This challenges the design of accurate disease models for use in drug development, and can confound the interpretation of biomarker levels, and of patient responses to specific therapies. The complex nature of heterogeneous tissues has motivated the development of tools for single cell genomic, transcriptomic, and multiplex proteomic analysis. We review these tools, assess their advantages and limitations, and explore their potential applications in drug discovery and development. PMID:26669673

  18. Parallel software tools at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Tennille, Geoffrey M.; Lakeotes, Christopher D.; Randall, Donald P.; Arthur, Jarvis J.; Hammond, Dana P.; Mall, Gerald H.

    1993-01-01

    This document gives a brief overview of parallel software tools available on the Intel iPSC/860 parallel computer at Langley Research Center. It is intended to provide a source of information that is somewhat more concise than vendor-supplied material on the purpose and use of various tools. Each of the chapters on tools is organized in a similar manner covering an overview of the functionality, access information, how to effectively use the tool, observations about the tool and how it compares to similar software, known problems or shortfalls with the software, and reference documentation. It is primarily intended for users of the iPSC/860 at Langley Research Center and is appropriate for both the experienced and novice user.

  19. ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS (ATTILA) ARCVIEW EXTENTION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape metrics, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, waters...

  20. Analytic hierarchy process (AHP) as a tool in asset allocation

    NASA Astrophysics Data System (ADS)

    Zainol Abidin, Siti Nazifah; Mohd Jaffar, Maheran

    2013-04-01

    Allocating capital investment across different assets is the best way to balance risk and reward, and can prevent large losses. Thus, the aim of this paper is to help investors make wise decisions in asset allocation. This paper proposes modifying and adapting the Analytic Hierarchy Process (AHP) model, which is widely used in various fields of study related to decision making. The results of the case studies show that the proposed model can categorize stocks and determine the portion of capital to invest in each. Hence, it can assist investors in the decision-making process and reduce the risk of loss in stock market investment.
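The mechanics of standard AHP can be sketched briefly: judgements are entered as a pairwise-comparison matrix, priority weights are taken from the principal eigenvector, and a consistency index checks the judgements. The 3-asset matrix below uses made-up judgement values for illustration and is not from the paper's case studies:

```python
import numpy as np

def ahp_weights(M):
    """Priority weights from a pairwise-comparison matrix via the
    principal right eigenvector (standard AHP procedure)."""
    vals, vecs = np.linalg.eig(M)
    k = np.argmax(vals.real)              # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    return w / w.sum(), vals.real[k]

# Illustrative comparison of three assets (stocks vs. bonds vs. cash);
# e.g. M[0, 1] = 3 means "stocks moderately preferred over bonds"
M = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])
w, lam_max = ahp_weights(M)
ci = (lam_max - len(M)) / (len(M) - 1)  # consistency index; small = consistent
```

The resulting weights can be read directly as portions of capital to allocate; a consistency index well under 0.1 (after dividing by the usual random index) indicates the pairwise judgements are acceptably coherent.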

  1. A collaborative visual analytics suite for protein folding research.

    PubMed

    Harvey, William; Park, In-Hee; Rübel, Oliver; Pascucci, Valerio; Bremer, Peer-Timo; Li, Chenglong; Wang, Yusu

    2014-09-01

    Molecular dynamics (MD) simulation is a crucial tool for understanding principles behind important biochemical processes such as protein folding and molecular interaction. With the rapidly increasing power of modern computers, large-scale MD simulation experiments can be performed regularly, generating huge amounts of MD data. An important question is how to analyze and interpret such massive and complex data. One of the (many) challenges involved in analyzing MD simulation data computationally is the high-dimensionality of such data. Given a massive collection of molecular conformations, researchers typically need to rely on their expertise and prior domain knowledge in order to retrieve certain conformations of interest. It is not easy to make and test hypotheses as the data set as a whole is somewhat "invisible" due to its high dimensionality. In other words, it is hard to directly access and examine individual conformations from a sea of molecular structures, and to further explore the entire data set. There is also no easy and convenient way to obtain a global view of the data or its various modalities of biochemical information. To this end, we present an interactive, collaborative visual analytics tool for exploring massive, high-dimensional molecular dynamics simulation data sets. The most important utility of our tool is to provide a platform where researchers can easily and effectively navigate through the otherwise "invisible" simulation data sets, exploring and examining molecular conformations both as a whole and at individual levels. The visualization is based on the concept of a topological landscape, which is a 2D terrain metaphor preserving certain topological and geometric properties of the high dimensional protein energy landscape. 
In addition to facilitating easy exploration of conformations, this 2D terrain metaphor also provides a platform where researchers can visualize and analyze various properties (such as contact density) overlaid on the

  2. Using decision analytic methods to assess the utility of family history tools.

    PubMed

    Tyagi, Anupam; Morris, Jill

    2003-02-01

    Family history may be a useful tool for identifying people at increased risk of disease and for developing targeted interventions for individuals at higher-than-average risk. This article addresses the issue of how to examine the utility of a family history tool for public health and preventive medicine. We propose the use of a decision analytic framework for the assessment of a family history tool and outline the major elements of a decision analytic approach, including analytic perspective, costs, outcome measurements, and data needed to assess the value of a family history tool. We describe the use of sensitivity analysis to address uncertainty in parameter values and imperfect information. To illustrate the use of decision analytic methods to assess the value of family history, we present an example analysis based on using family history of colorectal cancer to improve rates of colorectal cancer screening. PMID:12568827
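The decision-analytic elements named above (outcome measurement, costs, and sensitivity analysis on uncertain parameters) can be sketched as a toy incremental net-benefit calculation. Every number below is a hypothetical placeholder, not an estimate from the study; the structure simply illustrates comparing usual-care screening against screening triaged by a family-history tool:

```python
def incremental_net_benefit(p_fh, uptake_fh, uptake_base, qaly_gain,
                            cost_tool, cost_screen, wtp=50_000):
    """Incremental net monetary benefit of adding a family-history tool
    ahead of colorectal-cancer screening (all parameters hypothetical).
    wtp: willingness to pay per QALY."""
    value_per_screen = qaly_gain * wtp - cost_screen
    # Usual care: everyone offered screening at the baseline uptake rate
    nb_usual = uptake_base * value_per_screen
    # With tool: the flagged high-risk fraction screens at a higher rate
    uptake_tool = p_fh * uptake_fh + (1 - p_fh) * uptake_base
    nb_tool = uptake_tool * value_per_screen - cost_tool
    return nb_tool - nb_usual

# One-way sensitivity analysis on the tool's per-person cost
inb = {c: incremental_net_benefit(0.10, 0.80, 0.60, 0.01, c, 100)
       for c in (5, 20, 50)}
```

Sweeping one parameter at a time while holding the others fixed, as in the dictionary comprehension above, is the one-way sensitivity analysis the article describes for handling uncertain or imperfect information.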

  3. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  4. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  5. Bringing Research Tools into the Classroom

    ERIC Educational Resources Information Center

    Shubert, Charles; Ceraj, Ivica; Riley, Justin

    2009-01-01

    The advancement of computer technology used for research is creating the need to change the way classes are taught in higher education. "Bringing Research Tools into the Classroom" has become a major focus of the work of the Office of Educational Innovation and Technology (OEIT) for the Dean of Undergraduate Education (DUE) at the Massachusetts…

  6. Quality management system for application of the analytical quality assurance cycle in a research project

    NASA Astrophysics Data System (ADS)

    Camargo, R. S.; Olivares, I. R. B.

    2016-07-01

    The lack of quality assurance and quality control in academic activities has been recognized through the inability to demonstrate reproducibility. This paper aims to apply a quality tool called the Analytical Quality Assurance Cycle to a specific research project, supported by a verification programme for equipment and an adapted quality management system based on international standards, to provide traceability for the data generated.

  7. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    Many of the currently available, widely used tools for surface analysis are described. Those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology, and are truly surface sensitive (that is, less than 10 atomic layers), are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  8. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    A brief description of many of the widely used tools is presented. Of this list, those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology, along with being truly surface sensitive (that is, less than 10 atomic layers), are presented. The latter group is critiqued in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under real conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  9. Electron Microscopy: an Analytical Tool for Solid State Physicists

    NASA Astrophysics Data System (ADS)

    van Tendeloo, Gustaaf

    2013-03-01

    For too long the electron microscope has been considered as "a big magnifying glass." Modern electron microscopy, however, has evolved into an analytical technique able to provide quantitative data on structure, composition, chemical bonding and magnetic properties. Using lens-corrected instruments it is now possible to determine atom shifts at interfaces with a precision of a few picometers; chemical diffusion at these interfaces can be imaged down to atomic scale. The chemical nature of the surface atoms can be visualized and even the bonding state of the elements (e.g. Mn2+ versus Mn3+) can be detected on an atomic scale. Electron microscopy is in principle a projection technique, but the final dream is to obtain atomic info of materials in three dimensions. We will show that this is no longer a dream, but that it is possible using advanced microscopy. We will show evidence of determining the valence change Ce4+ versus Ce3+ at the surface of a CeO2 nanocrystal; the atomic shifts at the interface between LaAlO3 and SrTiO3 and the 3D relaxation of a Au nanocrystal.

  10. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, Detlef P.; Lucas, Paul L.; Häyhä, Tiina; Cornell, Sarah E.; Stafford-Smith, Mark

    2016-03-01

    There is a need for more integrated research on sustainable development and global environmental change. In this paper, we focus on the planetary boundaries framework to provide a systematic categorization of key research questions in relation to avoiding severe global environmental degradation. The four categories of key questions are those that relate to (1) the underlying processes and selection of key indicators for planetary boundaries, (2) understanding the impacts of environmental pressure and connections between different types of impacts, (3) better understanding of different response strategies to avoid further degradation, and (4) the available instruments to implement such strategies. Clearly, different categories of scientific disciplines and associated model types exist that can accommodate answering these questions. We identify the strengths and weaknesses of different research areas in relation to the question categories, focusing specifically on different types of models. We argue that more interdisciplinary research is needed to increase our understanding by better linking human drivers and social and biophysical impacts. This requires better collaboration between relevant disciplines (associated with the model types), either by exchanging information or by fully linking or integrating them. As fully integrated models can become too complex, the appropriate type of model (the racehorse) should be applied for answering the target research question (the race course).

  11. Process models: analytical tools for managing industrial energy systems

    SciTech Connect

    Howe, S O; Pilati, D A; Balzer, C; Sparrow, F T

    1980-01-01

    How the process models developed at BNL are used to analyze industrial energy systems is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for managing industrial energy systems.

  12. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, D. P.; Lucas, P. L.; Häyhä, T.; Cornell, S. E.; Stafford-Smith, M.

    2015-09-01

    There is a need for further integrated research on developing a set of sustainable development objectives, based on the proposed framework of planetary boundaries indicators. The relevant research questions are divided in this paper into four key categories, related to the underlying processes and selection of key indicators, understanding the impacts of different exposure levels and influence of connections between different types of impacts, a better understanding of different response strategies and the available options to implement changes. Clearly, different categories of scientific disciplines and associated models exist that can contribute to the necessary analysis, noting that the distinctions between them are fuzzy. In the paper, we both indicate how different models relate to the four categories of questions but also how further insights can be obtained by connecting the different disciplines (without necessarily fully integrating them). Research on integration can support planetary boundary quantification in a credible way, linking human drivers and social and biophysical impacts.

  13. categoryCompare, an analytical tool based on feature annotations

    PubMed Central

    Flight, Robert M.; Harrison, Benjamin J.; Mohammad, Fahim; Bunge, Mary B.; Moon, Lawrence D. F.; Petruska, Jeffrey C.; Rouchka, Eric C.

    2014-01-01

    Assessment of high-throughput omics data initially focuses on relative or raw levels of a particular feature, such as an expression value for a transcript, protein, or metabolite. At a second level, analyses of annotations, including known or predicted functions and associations of each individual feature, attempt to distill biological context. Most currently available comparative- and meta-analysis methods are dependent on the availability of identical features across data sets, and concentrate on determining features that are differentially expressed across experiments, some of which may be considered “biomarkers.” The heterogeneity of measurement platforms and inherent variability of biological systems confounds the search for robust biomarkers indicative of a particular condition. In many instances, however, multiple data sets show involvement of common biological processes or signaling pathways, even though individual features are not commonly measured or differentially expressed between them. We developed a methodology, categoryCompare, for cross-platform and cross-sample comparison of high-throughput data at the annotation level. We assessed the utility of the approach using hypothetical data, as well as determining similarities and differences in the set of processes in two instances: (1) denervated skin vs. denervated muscle, and (2) colon from Crohn's disease vs. colon from ulcerative colitis (UC). The hypothetical data showed that in many cases comparing annotations gave superior results to comparing only at the gene level. Improved analytical results depended as well on the number of genes included in the annotation term, the amount of noise in relation to the number of genes expressing in unenriched annotation categories, and the specific method in which samples are combined. In the skin vs. muscle denervation comparison, the tissues demonstrated markedly different responses. The Crohn's vs. UC comparison showed gross similarities in inflammatory
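The central idea of annotation-level comparison can be sketched in a few lines: map each experiment's gene list onto annotation terms, then intersect the term sets rather than the gene sets. This toy sketch is written in the spirit of categoryCompare but is not the package's actual API; the genes and terms are invented:

```python
from itertools import combinations

def shared_annotations(gene_lists, gene2terms):
    """Compare experiments at the annotation level rather than the gene level.
    gene_lists: {experiment: set of genes}
    gene2terms: {gene: set of annotation terms}"""
    term_sets = {exp: set().union(*(gene2terms.get(g, set()) for g in genes))
                 for exp, genes in gene_lists.items()}
    # Pairwise intersections of the experiments' annotation-term sets
    overlap = {(a, b): term_sets[a] & term_sets[b]
               for a, b in combinations(sorted(term_sets), 2)}
    return term_sets, overlap

# Two experiments with NO genes in common can still share a process
gene2terms = {'g1': {'inflammation'}, 'g2': {'inflammation', 'repair'}}
term_sets, overlap = shared_annotations({'A': {'g1'}, 'B': {'g2'}}, gene2terms)
```

The example captures the paper's point: experiments measuring disjoint features ('g1' vs. 'g2') still surface a common biological process at the annotation level. A real analysis would additionally test each term for statistical enrichment rather than take raw set intersections.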

  14. More Analytical Tools for Fluids Management in Space

    NASA Astrophysics Data System (ADS)

    Weislogel, Mark

    Continued advances during the 2000-2010 decade in the analysis of a class of capillary-driven flows relevant to materials processing and fluids management aboard spacecraft have been made. The class of flows addressed concerns combined forced and spontaneous capillary flows in complex containers with interior edges. Such flows are commonplace in space-based fluid systems and arise from the particular container geometry and wetting properties of the system. Important applications for this work include low-g liquid fill and/or purge operations and passive fluid phase separation operations, where the container (i.e., fuel tank, water processor, etc.) geometry possesses interior edges, and where quantitative information of fluid location, transients, flow rates, and stability is critical. Examples include the storage and handling of liquid propellants and cryogens, water conditioning for life support, fluid phase-change thermal systems, materials processing in the liquid state, on-orbit biofluids processing, among others. For a growing number of important problems, closed-form expressions to transient three-dimensional flows are possible that, as design tools, replace difficult, time-consuming, and rarely performed numerical calculations. An overview of a selection of solutions in-hand is presented with example problems solved. NASA drop tower, low-g aircraft, and ISS flight experiment results are employed where practical to buttress the theoretical findings. The current review builds on a similar review presented at COSPAR, 2002, for the approximate decade 1990-2000.

  15. Software tool for portal dosimetry research.

    PubMed

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: i) read the MLC file and the PDIP from the TPS, ii) calculate the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves, iii) interpolate correction factors from look-up tables, iv) create a corrected PDIP image from the product of the original PDIP and the correction factors and write the corrected image to file, v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio.NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects. PMID:18946980
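Steps iii and iv of the pipeline described above (interpolate per-pixel correction factors from a look-up table, then multiply them into the predicted image) reduce to a small array computation. The sketch below is illustrative only; the table values, array layout, and function name are assumptions, not the tool's actual format:

```python
import numpy as np

def corrected_pdip(pdip, shielded_fraction, cf_table):
    """Apply an MLC-transmission correction to a predicted portal dose image.
    pdip: 2-D predicted EPID image
    shielded_fraction: per-pixel fraction of beam-on time shielded by MLC
    cf_table: (fractions, factors) 1-D look-up table (illustrative values)."""
    fractions, factors = cf_table
    cf = np.interp(shielded_fraction, fractions, factors)  # step iii
    return pdip * cf                                       # step iv

# 2x2 toy image: one pixel fully MLC-shielded, one shielded half the time
pdip = np.ones((2, 2))
shielded = np.array([[0.0, 1.0], [0.5, 0.0]])
corrected = corrected_pdip(pdip, shielded, ([0.0, 1.0], [1.0, 0.8]))
```

Here a fully shielded pixel is scaled by 0.8 and a half-shielded pixel by the linearly interpolated 0.9, mirroring the look-up-table interpolation the abstract describes.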

  16. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    SciTech Connect

    Preisler, Jan

    1997-01-01

    In this work, we use lasers to enhance possibilities of laser desorption methods and to optimize coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan plumes desorbed at atmospheric pressure via absorption. All absorbing species, including neutral molecules, are monitored. Interesting features, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot are observed. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals negative influence of particle spallation on MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone along the length of the capillary excited by 488-nm Ar-ion laser. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7. 
The increase of p

  17. Ecotoxicity on a stick: A novel analytical tool for predicting the ecotoxicity of petroleum contaminated samples

    SciTech Connect

    Parkerton, T.F.; Stone, M.A.

    1995-12-31

    Hydrocarbons generally elicit toxicity via a nonpolar narcotic mechanism. Recent research suggests that chemicals acting by this mode invoke ecotoxicity when the molar concentration in organism lipids exceeds a critical threshold. Since the ecotoxicity of nonpolar narcotic mixtures appears to be additive, the ecotoxicity of hydrocarbon mixtures thus depends upon: (1) the partitioning of individual hydrocarbons comprising the mixture from the environment to lipids and (2) the total molar sum of the constituent hydrocarbons in lipids. These insights have led previous investigators to advance the concept of biomimetic extraction as a novel tool for assessing potential narcosis-type or baseline ecotoxicity in aqueous samples. Drawing from this earlier work, the authors have developed a method to quantify Bioavailable Petroleum Hydrocarbons (BPHs) in hydrocarbon-contaminated aqueous and soil/sediment samples. A sample is equilibrated with a solid phase microextraction (SPME) fiber that serves as a surrogate for organism lipids. The total moles of hydrocarbons that partition to the SPME fiber are then quantified using a simple GC/FID procedure. Research conducted to support the development and initial validation of this method will be presented. Results suggest that BPH analyses provide a promising, cost-effective approach for predicting the ecotoxicity of environmental samples contaminated with hydrocarbon mixtures. Consequently, BPH analyses may provide a valuable analytical screening tool for ecotoxicity assessment in product and effluent testing, environmental monitoring and site remediation applications.
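    The additive-partitioning idea behind the BPH measure can be sketched numerically. This is an illustrative calculation only, not the authors' procedure: it sums the moles of each hydrocarbon that partition from water into an SPME fiber coating at equilibrium, assuming linear partitioning. All compound names, concentrations, partition coefficients, and the fiber volume below are hypothetical.

    ```python
    # Hypothetical sketch: total moles of mixture hydrocarbons sorbed to an
    # SPME fiber (a lipid surrogate) at equilibrium, assuming C_fiber = K_fw * C_water.

    def bph_moles(aqueous_mg_per_L, molar_mass_g_mol, log_K_fw, fiber_volume_L):
        """Moles of one hydrocarbon on the fiber at equilibrium."""
        c_water_mol_L = (aqueous_mg_per_L / 1000.0) / molar_mass_g_mol
        c_fiber_mol_L = (10 ** log_K_fw) * c_water_mol_L  # linear partitioning
        return c_fiber_mol_L * fiber_volume_L

    # Invented three-component mixture: (name, C_water mg/L, M g/mol, log K_fiber-water)
    mixture = [
        ("toluene",      0.50,  92.1, 2.6),
        ("naphthalene",  0.10, 128.2, 3.3),
        ("phenanthrene", 0.01, 178.2, 4.2),
    ]
    fiber_volume_L = 6.6e-7  # assumed coating volume (~0.66 uL)

    # Additivity: the BPH measure is the molar sum over all constituents.
    total = sum(bph_moles(c, m, k, fiber_volume_L) for _, c, m, k in mixture)
    print(f"Total BPH (mol on fiber): {total:.3e}")
    ```
    
    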

  18. New Software Framework to Share Research Tools

    NASA Astrophysics Data System (ADS)

    Milner, Kevin; Becker, Thorsten W.; Boschi, Lapo; Sain, Jared; Schorlemmer, Danijel; Waterhouse, Hannah

    2009-03-01

    Solid Earth Teaching and Research Environment (SEATREE) is modular, user-friendly software that facilitates the use of solid Earth research tools in the classroom and in interdisciplinary research collaboration. The software provides a stand-alone open-source package that allows users to operate in a “black box” mode, which hides implementation details, while also allowing them to dig deeper into the underlying source code. The overlying user interfaces are written in the Python programming language using a modern, object-oriented design, including graphical user interactions. SEATREE, which provides an interface to a range of new and existing lower level programs that can be written in any computer programming language, may in the long run contribute to new ways of sharing scientific research. By sharing both data and modeling tools in a consistent framework, published (numerical) experiments can be made truly reproducible again.

  19. The RESET tephra database and associated analytical tools

    NASA Astrophysics Data System (ADS)

    Bronk Ramsey, Christopher; Housley, Rupert A.; Lane, Christine S.; Smith, Victoria C.; Pollard, A. Mark

    2015-06-01

    An open-access database has been set up to support the research project studying the 'Response of Humans to Abrupt Environmental Transitions' (RESET). The main methodology underlying this project was to use tephra layers to tie together and synchronise the chronologies of stratigraphic records at archaeological and environmental sites. The database has information on the occurrences and chemical compositions of glass shards from tephra and cryptotephra deposits found across Europe. The data include both information from the RESET project itself and from the published literature. With over 12,000 major element analyses and over 3000 trace element analyses on glass shards, relevant to 80 late Quaternary eruptions, the RESET project has generated an important archive of data. When added to the published information, the database described here has a total of more than 22,000 major element analyses and nearly 4000 trace element analyses on glass from over 240 eruptions. In addition to the database and its associated data, new methods of data analysis for assessing correlations have been developed as part of the project. In particular, an approach using multi-dimensional kernel density estimates to evaluate the likelihood of tephra compositions matching is described here and tested on data generated as part of the RESET project.
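    The kernel-density matching idea can be illustrated with a minimal sketch: fit a multi-dimensional KDE to reference glass-shard compositions from one eruption, then evaluate the density at a candidate shard as a score of how plausibly the compositions match. The data here are synthetic two-variable toy values; the actual RESET workflow uses many major/trace element dimensions and calibrated decision thresholds not shown here.

    ```python
    # Toy illustration of multi-dimensional KDE-based tephra matching.
    # Synthetic data only; not the RESET project's actual analyses.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)

    # Synthetic reference analyses for one eruption: rows are variables
    # (say SiO2 and K2O wt%), columns are individual shard analyses.
    reference = np.vstack([
        rng.normal(74.0, 0.5, 200),   # SiO2-like variable
        rng.normal(4.5, 0.2, 200),    # K2O-like variable
    ])

    kde = gaussian_kde(reference)

    candidate_match = np.array([[74.1], [4.4]])   # close to the reference cloud
    candidate_other = np.array([[68.0], [2.0]])   # compositionally distinct

    print("density at matching shard:", kde(candidate_match)[0])
    print("density at distinct shard:", kde(candidate_other)[0])
    ```

    A much higher density at the candidate shard supports (but does not prove) a correlation with that eruption; in practice the score would be compared against densities under competing eruptions.
    
    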

  20. Web Based Tools for Research and Teaching

    NASA Astrophysics Data System (ADS)

    Svirsky, E.; Hijazi, A.; Betterton, D.; Doxas, I.

    2005-05-01

    The Solar System Collaboratory is a web based set of tools that has been used for the past seven years in introductory classes in Astronomy, Physics, Environmental Science, and Engineering. The present paper will discuss the integration into the tool set of a recently developed Magnetospheric package. The package is written in Java 3D, and has a modular design, so that different models and datasets, both real-time and historical, can be seamlessly compared using a variety of goodness-of-fit measures. The package is used both in research and education at the undergraduate as well as secondary level. In addition to the science components, the package includes web based tools for conceptual student assessment, as well as resources for teachers, and videotaped case studies of classroom interactions.

  1. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    USGS Publications Warehouse

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.
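    One standard quantity behind this kind of prioritization is the expected value of perfect information (EVPI): how much better the decision could be if an uncertainty were resolved before acting. The toy example below, with entirely invented actions, states, probabilities, and payoffs, shows the calculation; it is a generic value-of-information sketch, not the analysis from the case studies.

    ```python
    # Toy EVPI calculation for a two-action conservation decision under two
    # uncertain states. All numbers are invented for illustration.

    # payoffs[action][state] = benefit of taking `action` if `state` turns out true
    payoffs = {
        "restore_wetland": {"wet_cycle": 10.0, "dry_cycle": 2.0},
        "acquire_upland":  {"wet_cycle": 4.0,  "dry_cycle": 6.0},
    }
    p = {"wet_cycle": 0.6, "dry_cycle": 0.4}  # current belief about the states

    # Best action under uncertainty: maximize expected payoff.
    expected = {a: sum(p[s] * v for s, v in pay.items()) for a, pay in payoffs.items()}
    best_under_uncertainty = max(expected.values())

    # With perfect information, the best action is chosen in each state first.
    expected_with_info = sum(
        p[s] * max(pay[s] for pay in payoffs.values()) for s in p
    )

    evpi = expected_with_info - best_under_uncertainty
    print(f"EVPI = {evpi:.2f}")
    ```

    An EVPI near zero signals that resolving the uncertainty would not change the optimal decision, which is exactly the "not all research is valuable" point made above.
    
    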

  2. SEACOIN--an investigative tool for biomedical informatics researchers.

    PubMed

    Lee, Eva K; Lee, Hee-Rin; Quarshie, Alexander

    2011-01-01

    Peer-reviewed scientific literature is a prime source for accessing knowledge in the biomedical field. Its rapid growth and diverse domain coverage require systematic efforts in developing interactive tools for efficiently searching and summarizing current advances for acquiring knowledge and referencing, and for furthering scientific discovery. Although information retrieval systems exist, the conventional tools and systems remain difficult for biomedical investigators to use. There remain gaps even in the state-of-the-art systems as little attention has been devoted to understanding the needs of biomedical researchers. Our work attempts to bridge the gap between the needs of biomedical users and systems design efforts. We first study the needs of users and then design a simple visual analytic application tool, SEACOIN. A key motivation stems from biomedical researchers' request for a "simple interface" that is suitable for novice users in information technology. The system minimizes information overload, and allows users to search easily even in time-constrained situations. Users can manipulate the depth of information according to the purpose of usage. SEACOIN enables interactive exploration and filtering of search results via "metamorphose topological visualization" and "tag cloud," visualization tools that are commonly used in social network sites. We illustrate SEACOIN's usage through applications on PubMed publications on heart disease, cancer, Alzheimer's disease, diabetes, and asthma. PMID:22195132

  3. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Revisions to models and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... Management District, and other non-Federal sponsors shall rely on the best available science including...

  4. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 3 2011-07-01 2011-07-01 false Revisions to models and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... Management District, and other non-Federal sponsors shall rely on the best available science including...

  5. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 3 2013-07-01 2013-07-01 false Revisions to models and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... Management District, and other non-Federal sponsors shall rely on the best available science including...

  6. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 3 2014-07-01 2014-07-01 false Revisions to models and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... Management District, and other non-Federal sponsors shall rely on the best available science including...

  7. The Use of Economic Analytical Tools in Quantifying and Measuring Educational Benefits and Costs.

    ERIC Educational Resources Information Center

    Holleman, I. Thomas, Jr.

    The general objective of this study was to devise quantitative guidelines that school officials can accurately follow in using benefit-cost analysis, cost-effectiveness analysis, ratio analysis, and other similar economic analytical tools in their particular local situations. Specifically, the objectives were to determine guidelines for the…

  8. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ARMY, DEPARTMENT OF DEFENSE PROGRAMMATIC REGULATIONS FOR THE COMPREHENSIVE EVERGLADES RESTORATION PLAN Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying out their responsibilities for implementing the Plan, the Corps of Engineers, the South Florida...

  9. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    ERIC Educational Resources Information Center

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  10. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  11. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less known techniques that may also prove useful.

  12. Operations other than war: Requirements for analysis tools research report

    SciTech Connect

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  13. Group analytic psychotherapy (im)possibilities to research

    PubMed Central

    Vlastelica, Mirela

    2011-01-01

    In the course of group analytic psychotherapy, where we discovered the power of the therapeutic effects, the need for research on group analytic psychotherapy arose. Psychotherapeutic work in general, and group psychotherapy in particular, are hard to measure and put into objective frames. Research, i.e., the measurement of change in psychotherapy, is a complex task, and there is considerable disagreement. For a long time, the empirical-descriptive method was the only way of conducting research in the field of group psychotherapy. Problems of research in group psychotherapy in general, and in group analytic psychotherapy in particular, can be viewed first as methodological problems, especially owing to the unrepeatability of the therapeutic process. The basic polemic about measuring change in psychotherapy is based on the question of whether a change is to be measured by open measurement of behaviour or whether it should be evaluated more finely by monitoring inner psychological dimensions. Following up the therapy results, besides providing additional information on the patient's improvement, strengthens the psychotherapist's self-respect, as well as his respectability and credibility as a scientist. PMID:25478094

  14. Designing a Collaborative Visual Analytics Tool for Social and Technological Change Prediction.

    SciTech Connect

    Wong, Pak C.; Leung, Lai-Yung R.; Lu, Ning; Scott, Michael J.; Mackey, Patrick S.; Foote, Harlan P.; Correia, James; Taylor, Zachary T.; Xu, Jianhua; Unwin, Stephen D.; Sanfilippo, Antonio P.

    2009-09-01

    We describe our ongoing efforts to design and develop a collaborative visual analytics tool to interactively model social and technological change of our society in a future setting. The work involves an interdisciplinary team of scientists from atmospheric physics, electrical engineering, building engineering, social sciences, economics, public policy, and national security. The goal of the collaborative tool is to predict the impact of global climate change on the U.S. power grids and its implications for society and national security. These future scenarios provide critical assessment and information necessary for policymakers and stakeholders to help formulate a coherent, unified strategy toward shaping a safe and secure society. The paper introduces the problem background and related work, explains the motivation and rationale behind our design approach, presents our collaborative visual analytics tool and usage examples, and finally shares the development challenges and lessons learned from our investigation.

  15. SEACOIN – An Investigative Tool for Biomedical Informatics Researchers

    PubMed Central

    Lee, Eva K.; Lee, Hee-Rin; Quarshie, Alexander

    2011-01-01

    Peer-reviewed scientific literature is a prime source for accessing knowledge in the biomedical field. Its rapid growth and diverse domain coverage require systematic efforts in developing interactive tools for efficiently searching and summarizing current advances for acquiring knowledge and referencing, and for furthering scientific discovery. Although information retrieval systems exist, the conventional tools and systems remain difficult for biomedical investigators to use. There remain gaps even in the state-of-the-art systems as little attention has been devoted to understanding the needs of biomedical researchers. Our work attempts to bridge the gap between the needs of biomedical users and systems design efforts. We first study the needs of users and then design a simple visual analytic application tool, SEACOIN. A key motivation stems from biomedical researchers’ request for a “simple interface” that is suitable for novice users in information technology. The system minimizes information overload, and allows users to search easily even in time-constrained situations. Users can manipulate the depth of information according to the purpose of usage. SEACOIN enables interactive exploration and filtering of search results via “metamorphose topological visualization” and “tag cloud,” visualization tools that are commonly used in social network sites. We illustrate SEACOIN’s usage through applications on PubMed publications on heart disease, cancer, Alzheimer’s disease, diabetes, and asthma. PMID:22195132

  16. METABOLOMICS AS A DIAGNOSTIC TOOL FOR SMALL FISH TOXICOLOGY RESEARCH

    EPA Science Inventory

    Metabolomics involves the application of advanced analytical and statistical tools to profile changes in levels of endogenous metabolites in tissues and biofluids resulting from disease onset or stress. While certain metabolites are being specifically targeted in these studies, w...

  17. An Analytical Thermal Model for Autonomous Soaring Research

    NASA Technical Reports Server (NTRS)

    Allen, Michael

    2006-01-01

    A viewgraph presentation describing an analytical thermal model used to enable research on autonomous soaring for a small UAV aircraft is given. The topics include: 1) Purpose; 2) Approach; 3) SURFRAD Data; 4) Convective Layer Thickness; 5) Surface Heat Budget; 6) Surface Virtual Potential Temperature Flux; 7) Convective Scaling Velocity; 8) Other Calculations; 9) Yearly trends; 10) Scale Factors; 11) Scale Factor Test Matrix; 12) Statistical Model; 13) Updraft Strength Calculation; 14) Updraft Diameter; 15) Updraft Shape; 16) Smoothed Updraft Shape; 17) Updraft Spacing; 18) Environment Sink; 19) Updraft Lifespan; 20) Autonomous Soaring Research; 21) Planned Flight Test; and 22) Mixing Ratio.

  18. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first one with metrological variables and the second one with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure concerning green analytical chemistry principles and analytical performance merits. PMID:27038058
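    The core of the HDT is a partial order: procedure A is ranked above B only when A is at least as good on every criterion and strictly better on at least one; otherwise the pair is left incomparable rather than forced into a total ranking. A minimal sketch of that dominance test follows, with invented procedure names and scores (higher = better) rather than the paper's 11 real variables, and without the transitive reduction used to draw the final diagram.

    ```python
    # Sketch of the partial-order core of the Hasse diagram technique (HDT).
    # Procedures and scores are invented; higher values mean greener/better.

    procedures = {
        "GC-MS_A": (3, 2, 4),
        "HPLC_B":  (2, 2, 3),
        "SPME_C":  (3, 1, 5),
    }

    def dominates(a, b):
        """A dominates B: >= on every criterion, > on at least one."""
        return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

    order, incomparable = [], []
    names = list(procedures)
    for i, p in enumerate(names):
        for q in names[i + 1:]:
            if dominates(procedures[p], procedures[q]):
                order.append((p, q))
            elif dominates(procedures[q], procedures[p]):
                order.append((q, p))
            else:
                incomparable.append((p, q))

    print("dominance pairs:", order)      # edges before transitive reduction
    print("incomparable pairs:", incomparable)
    ```

    Incomparable pairs are exactly why the two HDT runs (metrological vs. "green" variables) can crown different procedures: neither dominates the other across all criteria.
    
    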

  19. Tools and collaborative environments for bioinformatics research

    PubMed Central

    Giugno, Rosalba; Pulvirenti, Alfredo

    2011-01-01

    Advanced research requires intensive interaction among a multitude of actors, often possessing different expertise and usually working at a distance from each other. The field of collaborative research aims to establish suitable models and technologies to properly support these interactions. In this article, we first present the reasons for Bioinformatics' interest in this context, also suggesting some research domains that could benefit from collaborative research. We then review the principles and some of the most relevant applications of social networking, with special attention to networks supporting scientific collaboration, also highlighting some critical issues, such as identification of users and standardization of formats. We then introduce some systems for collaborative document creation, including wiki systems and tools for ontology development, and review some of the most interesting biological wikis. We also review the principles of Collaborative Development Environments for software and show some examples in Bioinformatics. Finally, we present the principles and some examples of Learning Management Systems. In conclusion, we try to devise some of the goals to be achieved in the short term for the exploitation of these technologies. PMID:21984743

  20. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    NASA Astrophysics Data System (ADS)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2016-06-01

    Machining of steel material having hardness above 45 HRC (Hardness-Rockwell C) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance of hard turning. Various models in hard turning by cubic boron nitride tool have been reviewed, in an attempt to utilize appropriate empirical and analytical models. Validation of the steady state flank and crater wear model, Usui's wear model, forces due to oblique cutting theory, the extended Lee and Shaffer's force model, chip formation and progressive flank wear have been depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force based on the different cutting conditions and tool geometries so that an appropriate model can be used according to user requirements in hard turning.

  1. An integrated analytic tool and knowledge-based system approach to aerospace electric power system control

    NASA Astrophysics Data System (ADS)

    Owens, William R.; Henderson, Eric; Gandikota, Kapal

    1986-10-01

    Future aerospace electric power systems require new control methods because of increasing power system complexity, demands for power system management, greater system size and heightened reliability requirements. To meet these requirements, a combination of electric power system analytic tools and knowledge-based systems is proposed. The continual improvement in microelectronic performance has made it possible to envision the application of sophisticated electric power system analysis tools to aerospace vehicles. These tools have been successfully used in the measurement and control of large terrestrial electric power systems. Among these tools is state estimation which has three main benefits. The estimator builds a reliable database for the system structure and states. Security assessment and contingency evaluation also require a state estimator. Finally, the estimator will, combined with modern control theory, improve power system control and stability. Bad data detection as an adjunct to state estimation identifies defective sensors and communications channels. Validated data from the analytic tools is supplied to a number of knowledge-based systems. These systems will be responsible for the control, protection, and optimization of the electric power system.
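    The state estimation and bad data detection steps mentioned above have a standard textbook form: weighted least squares (WLS) over a linear measurement model z = Hx + e, followed by a largest-normalized-residual test that flags a suspect measurement. The sketch below is that generic formulation with an invented measurement matrix and an injected gross error, not the paper's aerospace power system model.

    ```python
    # Generic WLS state estimation with bad data detection (largest normalized
    # residual test). H, the state, and the injected error are invented.
    import numpy as np

    H = np.array([[1.0,  0.0],
                  [0.0,  1.0],
                  [1.0,  1.0],
                  [1.0, -1.0]])       # 4 measurements of a 2-element state
    x_true = np.array([1.0, 0.5])
    sigma = 0.01
    R = sigma**2 * np.eye(4)          # measurement error covariance

    z = H @ x_true                    # noise-free for reproducibility
    z[3] += 0.5                       # inject one gross (bad-data) error

    W = np.linalg.inv(R)              # weights = inverse measurement variance
    G = H.T @ W @ H                   # gain matrix
    x_hat = np.linalg.solve(G, H.T @ W @ z)

    # Bad data detection: normalize residuals by their standard deviation.
    r = z - H @ x_hat
    Omega = R - H @ np.linalg.inv(G) @ H.T      # residual covariance
    r_norm = np.abs(r) / np.sqrt(np.diag(Omega))
    suspect = int(np.argmax(r_norm))

    print("state estimate:", x_hat)
    print("suspect measurement index:", suspect)
    ```

    In a real estimator the flagged measurement would be removed (or its sensor/channel declared defective) and the estimate recomputed, which is the "bad data detection as an adjunct to state estimation" role described above.
    
    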

  2. An analytical tool that quantifies cellular morphology changes from three-dimensional fluorescence images.

    PubMed

    Haass-Koffler, Carolina L; Naeemuddin, Mohammad; Bartlett, Selena E

    2012-01-01

    detection of 3D neuronal filament-like structures; however, this module has been developed to measure defined structures such as neurons, which are composed of dendrites, axons and spines (tree-like structures). This module has been ingeniously utilized to make morphological measurements of non-neuronal cells; however, the output data provide information on an extended cellular network using software that depends on a defined cell shape rather than an amorphous-shaped cellular model. To overcome the issue of analyzing amorphous-shaped cells and to make the software more suitable for biological applications, Imaris developed Imaris Cell. This was a scientific project with the Eidgenössische Technische Hochschule, developed to calculate the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be utilized to analyze fluorescence data that are not continuous because, ideally, it builds the cell surface without void spaces. To our knowledge, at present no user-modifiable automated approach has been developed that provides morphometric information from 3D fluorescence images and achieves cellular spatial information of an undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.). These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, this method will allow researchers who have extensive expertise in biological systems, but little familiarity with computer applications, to perform quantification of morphological changes in cell dynamics. PMID:22951512

  3. Contextual and Analytic Qualities of Research Methods Exemplified in Research on Teaching

    ERIC Educational Resources Information Center

    Svensson, Lennart; Doumas, Kyriaki

    2013-01-01

    The aim of the present article is to discuss contextual and analytic qualities of research methods. The arguments are specified in relation to research on teaching. A specific investigation is used as an example to illustrate the general methodological approach. It is argued that research methods should be carefully grounded in an understanding of…

  4. Telerehabilitation: Policy Issues and Research Tools

    PubMed Central

    Seelman, Katherine D.; Hartman, Linda M.

    2009-01-01

    The importance of public policy as a complementary framework for telehealth, telemedicine, and by association telerehabilitation, has been recognized by a number of experts. The purpose of this paper is to review literature on telerehabilitation (TR) policy and research methodology issues in order to report on the current state of the science and make recommendations about future research needs. An extensive literature search was implemented using search terms grouped into main topics of telerehabilitation, policy, population of users, and policy specific issues such as cost and reimbursement. The availability of rigorous and valid evidence-based cost studies emerged as a major challenge to the field. Existing cost studies provided evidence that telehomecare may be a promising application area for TR. Cost studies also indicated that telepsychiatry is a promising telepractice area. The literature did not reference the International Classification on Functioning, Disability and Health (ICF). Rigorous and comprehensive TR assessment and evaluation tools for outcome studies are essential to generating confidence among providers, payers, clinicians and end users. In order to evaluate consumer satisfaction and participation, assessment criteria must include medical, functional and quality of life items such as assistive technology and environmental factors. PMID:25945162

  5. Analytical tools for the analysis of fire debris. A review: 2008-2015.

    PubMed

    Martín-Alberca, Carlos; Ortega-Ojeda, Fernando Ernesto; García-Ruiz, Carmen

    2016-07-20

    The analysis of fire debris evidence might offer crucial information to a forensic investigation when, for instance, there is suspicion of the intentional use of ignitable liquids to initiate a fire. Although the evidence analysis in the laboratory is mainly conducted by a handful of well-established methodologies, during the last eight years several authors have proposed noteworthy improvements to these methodologies, suggesting new interesting approaches. This review critically outlines the most up-to-date and suitable tools for the analysis and interpretation of fire debris evidence. The survey of analytical tools covers works published in the 2008-2015 period. It includes sources of consensus-classified reference samples, current standard procedures, new proposals for sample extraction and analysis, and the most novel statistical tools. In addition, this review provides relevant knowledge on the distortion effects on ignitable liquid chemical fingerprints, which have to be considered during the interpretation of results. PMID:27251852

  6. VAO Tools Enhance CANDELS Research Productivity

    NASA Astrophysics Data System (ADS)

    Greene, Gretchen; Donley, J.; Rodney, S.; LAZIO, J.; Koekemoer, A. M.; Busko, I.; Hanisch, R. J.; VAO Team; CANDELS Team

    2013-01-01

    The formation of galaxies and their co-evolution with black holes through cosmic time are prominent areas in current extragalactic astronomy. New methods in science research are building upon collaborations between scientists and archive data centers which span large volumes of multi-wavelength and heterogeneous data. A successful example of this form of teamwork is demonstrated by the CANDELS (Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey) and the Virtual Astronomical Observatory (VAO) collaboration. The CANDELS project archive data provider services are registered and discoverable in the VAO through an innovative web based Data Discovery Tool, providing a drill down capability and cross-referencing with other co-spatially located astronomical catalogs, images and spectra. The CANDELS team is working together with the VAO to define new methods for analyzing Spectral Energy Distributions of galaxies containing active galactic nuclei, and helping to evolve advanced catalog matching methods for exploring images of variable depths, wavelengths and resolution. Through the publication of VOEvents, the CANDELS project is publishing data streams for newly discovered supernovae that are bright enough to be followed from the ground.

  7. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  8. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs and analysis, and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called Analytical Tools for Thermal InfraRed Engineering (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), and Noise Equivalent Temperature Difference (NETD). This paper describes the uses of the package and the physics that were used to derive the performance parameters.
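The figures of merit named above follow from simple radiometric relations; for example, NETD can be obtained from NER by dividing by the temperature derivative of the Planck radiance at the scene temperature. The sketch below is illustrative only and is not taken from the ATTIRE package; the single-wavelength treatment and the function names are assumptions.

```python
import math

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance L(lambda, T) in W / (m^2 * sr * m)."""
    x = H * C / (wavelength_m * KB * temp_k)
    return (2.0 * H * C**2 / wavelength_m**5) / math.expm1(x)

def dradiance_dtemp(wavelength_m, temp_k):
    """Analytic derivative dL/dT of the Planck radiance."""
    x = H * C / (wavelength_m * KB * temp_k)
    L = planck_radiance(wavelength_m, temp_k)
    return L * (x / temp_k) * math.exp(x) / math.expm1(x)

def netd(ner, wavelength_m, temp_k):
    """Noise Equivalent Temperature Difference: the scene temperature
    change producing a signal change equal to the noise level (NER)."""
    return ner / dradiance_dtemp(wavelength_m, temp_k)

# Example: a sensor observing a 300 K scene at 10 micrometers.
# An NER equal to 5% of the radiance-per-kelvin slope gives, by
# construction, an NETD of about 0.05 K.
slope = dradiance_dtemp(10e-6, 300.0)
netd_value = netd(0.05 * slope, 10e-6, 300.0)
```

A band-integrated treatment would integrate both NER and dL/dT over the sensor's spectral response rather than evaluating at a single wavelength.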

  9. Ion mobility spectrometry as a high-throughput analytical tool in occupational pyrethroid exposure.

    PubMed

    Armenta, S; Blanco, M

    2012-08-01

The capabilities of ion mobility spectrometry (IMS) as a high-throughput and green analytical tool for occupational health and safety control have been demonstrated, using pyrethroids as model compounds. The method used for dermal and inhalation exposure assessment is based on passive pyrethroid sampling with Teflon membranes, direct thermal extraction of the pyrethroids, and measurement of the vaporized analytes by IMS without reagent or solvent consumption. IMS signatures of the studied synthetic pyrethroids under atmospheric pressure chemical ionization were obtained by investigating the negative ion products formed. The main advantages of the proposed procedure are the limits of detection obtained, ranging from 0.08 to 5 ng; the simplicity of measurement; the lack of sample treatment and, therefore, of solvent consumption and waste generation; and the speed of analysis. PMID:22159370

  10. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.

  11. Analytical tools for the analysis of β-carotene and its degradation products.

    PubMed

    Stutz, H; Bresgen, N; Eckl, P M

    2015-05-01

β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has focused attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Besides other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene, which is beneficial for radical scavenging but also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies for standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography ((U)HPLC), and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method validation

  12. Analytical tools for the analysis of β-carotene and its degradation products

    PubMed Central

    Stutz, H.; Bresgen, N.; Eckl, P. M.

    2015-01-01

β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has focused attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Besides other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene, which is beneficial for radical scavenging but also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies for standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography ((U)HPLC), and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method

  13. ROLE OF ANALYTICAL CHEMISTRY IN ENVIRONMENTAL RISK MANAGEMENT RESEARCH

    EPA Science Inventory

    Analytical chemistry is an important tier of environmental protection and has been traditionally linked to compliance and/or exposure monitoring activities for environmental contaminants. The adoption of the risk management paradigm has led to special challenges for analytical ch...

  14. Research on graphical workflow modeling tool

    NASA Astrophysics Data System (ADS)

    Gu, Hongjiu

    2013-07-01

Through a technical analysis of existing modeling tools, combined with Web technology, this paper presents the design of a graphical workflow modeling tool with which designers can draw a process directly in the browser; the drawn process is automatically transformed into an XML description file, facilitating workflow engine analysis and barrier-free sharing of workflow data in a networked environment. The program offers software reusability, cross-platform support, scalability, and strong practicality.
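To make the draw-then-serialize step concrete, here is a minimal sketch of turning a drawn workflow (nodes plus directed transitions) into an XML description using Python's standard library. The element names (`workflow`, `node`, `transition`) are illustrative assumptions, not the schema used by the paper's tool.

```python
import xml.etree.ElementTree as ET

def workflow_to_xml(name, steps, transitions):
    """Serialize a drawn workflow (nodes plus directed transitions)
    into an XML description a workflow engine could parse."""
    root = ET.Element("workflow", name=name)
    nodes = ET.SubElement(root, "nodes")
    for step in steps:
        ET.SubElement(nodes, "node", id=step)
    flows = ET.SubElement(root, "transitions")
    for src, dst in transitions:
        # 'from' is a Python keyword, so pass the attributes as a dict
        ET.SubElement(flows, "transition", attrib={"from": src, "to": dst})
    return ET.tostring(root, encoding="unicode")

xml_text = workflow_to_xml(
    "approve-order",
    ["start", "review", "end"],
    [("start", "review"), ("review", "end")],
)
```

A browser front end would supply the `steps` and `transitions` lists from the drawn diagram; the resulting XML is what gets handed to the workflow engine.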

  15. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Technical Reports Server (NTRS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address the scientific community's needs in utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences given are based on a survey of use cases, with insights into a few use cases provided in detail.
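As a concrete illustration of the kind of supervised classification the abstract mentions (support vector machines), the sketch below trains a simpler stand-in, a perceptron, to separate two surface classes from synthetic two-band reflectance features. The data, the feature interpretation, and the use of a perceptron rather than an SVM are all assumptions made for illustration.

```python
import random

def train_perceptron(samples, labels, epochs=50, lr=0.1):
    """Train a linear classifier on 2-D feature vectors.
    labels are +1 / -1; returns weights [w1, w2, bias]."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + w[2] > 0 else -1
            if pred != y:  # misclassified: nudge the decision boundary
                w[0] += lr * y * x1
                w[1] += lr * y * x2
                w[2] += lr * y
    return w

def classify(w, point):
    """Apply the learned linear decision rule to one feature vector."""
    x1, x2 = point
    return 1 if w[0] * x1 + w[1] * x2 + w[2] > 0 else -1

# Synthetic two-band reflectance features for two surface classes
random.seed(0)
water = [(random.gauss(0.10, 0.02), random.gauss(0.20, 0.02)) for _ in range(50)]
land = [(random.gauss(0.60, 0.02), random.gauss(0.70, 0.02)) for _ in range(50)]
weights = train_perceptron(water + land, [-1] * 50 + [1] * 50)
```

An SVM differs by maximizing the margin between the classes (and supporting kernels for non-linear boundaries), but the pipeline shape — labeled feature vectors in, a decision rule out — is the same.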

  16. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Astrophysics Data System (ADS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-05-01

The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address the scientific community's needs in utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences given are based on a survey of use cases, with insights into a few use cases provided in detail.

  17. Microfluidic tools for cell biological research

    PubMed Central

    Velve-Casquillas, Guilhem; Le Berre, Maël; Piel, Matthieu; Tran, Phong T.

    2010-01-01

    Summary Microfluidic technology is creating powerful tools for cell biologists to control the complete cellular microenvironment, leading to new questions and new discoveries. We review here the basic concepts and methodologies in designing microfluidic devices, and their diverse cell biological applications. PMID:21152269

  18. Tools for Ephemeral Gully Erosion Process Research

    Technology Transfer Automated Retrieval System (TEKTRAN)

Techniques to quantify ephemeral gully erosion have been identified by the USDA Natural Resources Conservation Service (NRCS) as one of the gaps in current erosion assessment tools. One reason that may have contributed to this technology gap is the difficulty of quantifying changes in channel geometry to asses...

  19. Participatory Research: A Tool for Extension Educators

    ERIC Educational Resources Information Center

    Tritz, Julie

    2014-01-01

    Given their positions in communities across the United States, Extension educators are poised to have meaningful partnerships with the communities they serve. This article presents a case for the use of participatory research, which is a departure from more conventional forms of research based on objectivity, researcher distance, and social…

  20. Rhodobase, a meta-analytical tool for reconstructing gene regulatory networks in a model photosynthetic bacterium.

    PubMed

    Moskvin, Oleg V; Bolotin, Dmitry; Wang, Andrew; Ivanov, Pavel S; Gomelsky, Mark

    2011-02-01

We present Rhodobase, a web-based meta-analytical tool for the analysis of transcriptional regulation in a model anoxygenic photosynthetic bacterium, Rhodobacter sphaeroides. The gene association meta-analysis is based on pooled data from 100 R. sphaeroides whole-genome DNA microarrays. Gene-centric regulatory networks were visualized using the StarNet approach (Jupiter, D.C., VanBuren, V., 2008. A visual data mining tool that facilitates reconstruction of transcription regulatory networks. PLoS ONE 3, e1717) with several modifications. We developed a means to identify and visualize operons and superoperons. We designed a framework for the cross-genome search for transcription factor binding sites that takes into account the high GC-content and oligonucleotide usage profile characteristic of the R. sphaeroides genome. To facilitate reconstruction of directional relationships between co-regulated genes, we screened upstream sequences (-400 to +20 bp from start codons) of all genes for putative binding sites of bacterial transcription factors using a self-optimizing search method developed here. To test the performance of the meta-analysis tools and transcription factor site predictions, we reconstructed selected nodes of the R. sphaeroides transcription factor-centric regulatory matrix. The test revealed regulatory relationships that correlate well with the experimentally derived data. The database of transcriptional profile correlations, the network visualization engine and the optimized search engine for transcription factor binding site analysis are available at http://rhodobase.org. PMID:21070832
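A StarNet-style gene association network is built from pairwise correlations of expression profiles across many arrays. A minimal sketch of that core step follows; the gene names and expression values are a hypothetical mini-dataset, and the 0.8 cutoff is an arbitrary illustrative threshold, not Rhodobase's actual method.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two expression profiles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def coexpression_edges(profiles, threshold=0.8):
    """Link gene pairs whose profiles correlate above the threshold.
    profiles: dict mapping gene name -> expression values across arrays."""
    genes = sorted(profiles)
    edges = []
    for i, g1 in enumerate(genes):
        for g2 in genes[i + 1:]:
            r = pearson(profiles[g1], profiles[g2])
            if abs(r) >= threshold:
                edges.append((g1, g2, round(r, 3)))
    return edges

# Hypothetical expression values across five microarrays
profiles = {
    "pufA": [1.0, 2.0, 3.0, 4.0, 5.0],
    "pufB": [1.1, 2.1, 2.9, 4.2, 5.0],  # tracks pufA: likely co-regulated
    "cheY": [5.0, 1.0, 4.0, 2.0, 3.0],  # uncorrelated control
}
edges = coexpression_edges(profiles)
```

In a real meta-analysis the correlation matrix would be computed over hundreds of arrays, and edge significance would be assessed statistically rather than by a fixed cutoff.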

  1. Narratives and Activity Theory as Reflective Tools in Action Research

    ERIC Educational Resources Information Center

    Stuart, Kaz

    2012-01-01

    Narratives and activity theory are useful as socially constructed data collection tools that allow a researcher access to the social, cultural and historical meanings that research participants place on events in their lives. This case study shows how these tools were used to promote reflection within a cultural-historical activity theoretically…

  2. High temperature dielectric constant measurement - another analytical tool for ceramic studies?

    SciTech Connect

    Hutcheon, R.M.; Hayward, P.; Alexander, S.B.

    1995-12-31

The automation of a high-temperature (1400 °C), microwave-frequency, dielectric constant measurement system has dramatically increased the reproducibility and detail of data. One can now consider using the technique as a standard tool for analytical studies of low-conductivity ceramics and glasses. Simultaneous temperature and frequency scanning dielectric analyses (SDA) yield the temperature-dependent complex dielectric constant. The real part of the dielectric constant is especially sensitive to small changes in the distance and distribution of neighboring ions or atoms, while the absorptive part is strongly dependent on the position and population of electron/hole conduction bands, which are sensitive to impurity concentrations in the ceramic. SDA measurements on a few specific materials will be compared with standard differential thermal analysis (DTA) results and an attempt will be made to demonstrate the utility of both the common and complementary aspects of the techniques.

  3. The Child Diary as a Research Tool

    ERIC Educational Resources Information Center

    Lamsa, Tiina; Ronka, Anna; Poikonen, Pirjo-Liisa; Malinen, Kaisa

    2012-01-01

The aim of this article is to introduce the use of the child diary as a method in daily diary research. By describing the research process and detailing its structure, the authors evaluate the child diary, a structured booklet in which children's parents and day-care personnel (N = 54 children) reported their observations. The participants reported the…

  4. Research issues in sustainable consumption: toward an analytical framework for materials and the environment.

    PubMed

    Thomas, Valerie M; Graedel, T E

    2003-12-01

    We define key research questions as a stimulus to research in the area of industrial ecology. The first group of questions addresses analytical support for green engineering and environmental policy. They relate to (i) tools for green engineering, (ii) improvements in life cycle assessment, (iii) aggregation of environmental impacts, and (iv) effectiveness of a range of innovative policy approaches. The second group of questions addresses the dynamics of technology, economics, and environmental impacts. They relate to (v) the environmental impacts of material and energy consumption, (vi) the potential for material efficiency, (vii) the relation of technological and economic development to changes in consumption patterns, and (viii) the potential for technology to overcome environmental impacts and constraints. Altogether, the questions create an intellectual agenda for industrial ecology and integrate the technological and social aspects of sustainability. PMID:14700323

  5. Raman microspectroscopy: a powerful analytic and imaging tool in petrology and geochemistry

    NASA Astrophysics Data System (ADS)

    Beyssac, O.

    2013-12-01

Raman microspectroscopy is a vibrational spectroscopy based on the inelastic scattering of light interacting with molecules. This technique has benefited from recent developments in spectral and spatial resolution as well as sensitivity, which make it widely used in the Geosciences. A very attractive aspect of Raman spectroscopy is that it does not require any complex sample preparation. In addition, Raman imaging is now a routine and reliable technique, which makes it competitive with SEM-EDS for mineral mapping, for instance. Raman microspectroscopy is a complementary technique to SEM, EMP, SIMS... as it can provide information not only on mineral chemistry but, above all, on mineral structure. Raman microspectroscopy is, for instance, the best in situ technique to distinguish mineral polymorphs. In addition, the sensitivity of RM to mineral structure is extremely useful for studying accessory minerals like oxides or sulphides as well as graphitic carbons. A brief presentation of the analytical capabilities of modern Raman spectroscopy will be given, followed by a review of recent applications of RM to petrological and geochemical problems, including Raman imaging. The advantages and disadvantages of this technique compared to other micro-analytical tools will be discussed.

  6. An analytical tool-box for comprehensive biochemical, structural and transcriptome evaluation of oral biofilms mediated by mutans streptococci.

    PubMed

    Klein, Marlise I; Xiao, Jin; Heydorn, Arne; Koo, Hyun

    2011-01-01

    Biofilms are highly dynamic, organized and structured communities of microbial cells enmeshed in an extracellular matrix of variable density and composition (1, 2). In general, biofilms develop from initial microbial attachment on a surface followed by formation of cell clusters (or microcolonies) and further development and stabilization of the microcolonies, which occur in a complex extracellular matrix. The majority of biofilm matrices harbor exopolysaccharides (EPS), and dental biofilms are no exception; especially those associated with caries disease, which are mostly mediated by mutans streptococci (3). The EPS are synthesized by microorganisms (S. mutans, a key contributor) by means of extracellular enzymes, such as glucosyltransferases using sucrose primarily as substrate (3). Studies of biofilms formed on tooth surfaces are particularly challenging owing to their constant exposure to environmental challenges associated with complex diet-host-microbial interactions occurring in the oral cavity. Better understanding of the dynamic changes of the structural organization and composition of the matrix, physiology and transcriptome/proteome profile of biofilm-cells in response to these complex interactions would further advance the current knowledge of how oral biofilms modulate pathogenicity. Therefore, we have developed an analytical tool-box to facilitate biofilm analysis at structural, biochemical and molecular levels by combining commonly available and novel techniques with custom-made software for data analysis. Standard analytical (colorimetric assays, RT-qPCR and microarrays) and novel fluorescence techniques (for simultaneous labeling of bacteria and EPS) were integrated with specific software for data analysis to address the complex nature of oral biofilm research. The tool-box is comprised of 4 distinct but interconnected steps (Figure 1): 1) Bioassays, 2) Raw Data Input, 3) Data Processing, and 4) Data Analysis. 
We used our in vitro biofilm model and

  7. A Multi-facetted Visual Analytics Tool for Exploratory Analysis of Human Brain and Function Datasets

    PubMed Central

    Angulo, Diego A.; Schneider, Cyril; Oliver, James H.; Charpak, Nathalie; Hernandez, Jose T.

    2016-01-01

Brain research typically requires large amounts of data from different sources, often of a different nature. The use of different software tools adapted to the nature of each data source can make research work cumbersome and time consuming, and as a result data is often not used to its fullest potential, limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-facetted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process for individuals or groups. BRAVIZ facilitates exploration of trends or relationships to gain an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data. PMID:27601990

  8. A Multi-facetted Visual Analytics Tool for Exploratory Analysis of Human Brain and Function Datasets.

    PubMed

    Angulo, Diego A; Schneider, Cyril; Oliver, James H; Charpak, Nathalie; Hernandez, Jose T

    2016-01-01

Brain research typically requires large amounts of data from different sources, often of a different nature. The use of different software tools adapted to the nature of each data source can make research work cumbersome and time consuming, and as a result data is often not used to its fullest potential, limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-facetted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process for individuals or groups. BRAVIZ facilitates exploration of trends or relationships to gain an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data. PMID:27601990

  9. Stable isotope ratio analysis: A potential analytical tool for the authentication of South African lamb meat.

    PubMed

    Erasmus, Sara Wilhelmina; Muller, Magdalena; van der Rijst, Marieta; Hoffman, Louwrens Christiaan

    2016-02-01

    Stable isotope ratios ((13)C/(12)C and (15)N/(14)N) of South African Dorper lambs from farms with different vegetation types were measured by isotope ratio mass spectrometry (IRMS), to evaluate it as a tool for the authentication of origin and feeding regime. Homogenised and defatted meat of the Longissimus lumborum (LL) muscle of lambs from seven different farms was assessed. The δ(13)C values were affected by the origin of the meat, mainly reflecting the diet. The Rûens and Free State farms had the lowest (p ⩽ 0.05) δ(15)N values, followed by the Northern Cape farms, with Hantam Karoo/Calvinia having the highest δ(15)N values. Discriminant analysis showed δ(13)C and δ(15)N differences as promising results for the use of IRMS as a reliable analytical tool for lamb meat authentication. The results suggest that diet, linked to origin, is an important factor to consider regarding region of origin classification for South African lamb. PMID:26304440
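The δ(13)C and δ(15)N values reported by IRMS are per-mil deviations of a sample's isotope ratio from an international standard (VPDB for carbon, atmospheric N2 for nitrogen). A minimal sketch of the delta calculation follows; the standard-ratio constants are commonly cited values assumed here for illustration, since laboratories calibrate against certified reference materials.

```python
# Commonly cited isotope ratios of the international standards
# (assumed values for illustration)
R_VPDB = 0.0111802   # 13C/12C of Vienna Pee Dee Belemnite
R_AIR = 0.0036765    # 15N/14N of atmospheric N2

def delta_per_mil(r_sample, r_standard):
    """Delta notation: per-mil deviation of a sample's isotope ratio
    from the international standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A sample whose 13C/12C ratio sits 1% above VPDB has a delta-13C
# of about +10 per mil
d13c = delta_per_mil(R_VPDB * 1.01, R_VPDB)
```

Classification of origin, as in the abstract, then treats (δ13C, δ15N) pairs as feature vectors for discriminant analysis.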

  10. Interactive Assessment as a Research Tool.

    ERIC Educational Resources Information Center

    Haywood, H. Carl; Wingenfeld, Sabine A.

    1992-01-01

    This paper discusses dynamic/interactive approaches to psychological assessment based on the concept of induced change as a research tactic. Studies are reviewed showing how interactive assessment has yielded new knowledge in psychopathology; neuropsychology; learning disabilities; intelligence testing (in normal, deaf, and immigrant children);…

  11. Applying decision-making tools to national e-waste recycling policy: an example of Analytic Hierarchy Process.

    PubMed

    Lin, Chun-Hsu; Wen, Lihchyi; Tsai, Yue-Mi

    2010-05-01

As policy making is in essence a process of discussion, decision-making tools have in many cases been proposed to resolve differences of opinion among the different parties. In our project, which sought to promote a country's performance in recycling, we used the Analytic Hierarchy Process (AHP) to evaluate the possibilities and determine the priority of adding new mandatory recycled wastes, also referred to as Due Recycled Wastes, from candidate waste appliances. The evaluation process started with the collection of data based on telephone interviews and field investigations to understand the behavior of consumers as well as their overall opinions regarding the disposal of certain waste appliances. With the data serving as background information, the research team then implemented the Analytic Hierarchy Process using the information that formed an incomplete hierarchy structure in order to determine the priority for recycling. Since the number of objects to be evaluated exceeded the number that the AHP researchers had suggested, we reclassified the objects into four groups and added one more level of pair-wise comparisons, which substantially reduced the inconsistency in the judgment of the AHP participants. The project was found to serve as a flexible and achievable application of AHP to the environmental policy-making process. In addition, based on the project's outcomes as a whole, the research team drew conclusions regarding the government's need to take back 15 of the items evaluated, and suggested instruments that could be used or recycling regulations that could be changed in the future. Further analysis of the top three items recommended by the results of the evaluation for recycling, namely, Compact Disks, Cellular Phones and Computer Keyboards, was then conducted to clarify their concrete feasibility. After the trial period for recycling ordered by the Taiwan Environmental Protection Administration, only Computer
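At the core of AHP is deriving a priority vector from a reciprocal pairwise comparison matrix. The sketch below uses the row geometric-mean approximation rather than the principal-eigenvector method, and the three criteria and judgment values are invented for illustration; none of this reflects the actual matrix used in the project described above.

```python
import math

def ahp_priorities(matrix):
    """Approximate the AHP priority vector with the row geometric-mean
    method (the exact method uses the principal eigenvector)."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Invented 3-criterion reciprocal judgment matrix for ranking candidate
# wastes: environmental impact vs. collection cost vs. recovery value.
# matrix[i][j] is the judged importance of criterion i relative to j,
# so matrix[j][i] must equal 1 / matrix[i][j].
pairwise = [
    [1.0, 3.0, 5.0],      # impact judged 3x cost, 5x recovery value
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_priorities(pairwise)
```

The weights sum to one and preserve the judged ordering; a full AHP study would also compute a consistency ratio to flag contradictory judgments, which is the inconsistency the abstract mentions reducing.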

  12. Visualization tools for comprehensive test ban treaty research

    SciTech Connect

    Edwards, T.L.; Harris, J.M.; Simons, R.W.

    1997-08-01

    This paper focuses on tools used in Data Visualization efforts at Sandia National Laboratories under the Department of Energy CTBT R&D program. These tools provide interactive techniques for the examination and interpretation of scientific data, and can be used for many types of CTBT research and development projects. We will discuss the benefits and drawbacks of using the tools to display and analyze CTBT scientific data. While the tools may be used for everyday applications, our discussion will focus on the use of these tools for visualization of data used in research and verification of new theories. Our examples focus on uses with seismic data, but the tools may also be used for other types of data sets. 5 refs., 6 figs., 1 tab.

  13. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    SciTech Connect

    Bjoerklund, Anna

    2012-01-15

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with a focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. - Research highlights: ► LCA was explored as an analytical tool in an SEA process of municipal energy planning. ► The process also integrated LCA with scenario planning and public participation. ► Benefits of using LCA were a systematic framework and wider systems perspective. ► Integration of tools required some methodological challenges to be solved. ► This proved an innovative approach to define alternatives and scope of assessment.

  14. Improving Teaching with Collaborative Action Research: An ASCD Action Tool

    ERIC Educational Resources Information Center

    Cunningham, Diane

    2011-01-01

    Once you've established a professional learning community (PLC), you need to get this ASCD (Association for Supervision and Curriculum Development) action tool to ensure that your PLC stays focused on addressing teaching methods and student learning problems. This ASCD action tool explains how your PLC can use collaborative action research to…

  15. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  16. Aligning Web-Based Tools to the Research Process Cycle: A Resource for Collaborative Research Projects

    ERIC Educational Resources Information Center

    Price, Geoffrey P.; Wright, Vivian H.

    2012-01-01

    Using John Creswell's Research Process Cycle as a framework, this article describes various web-based collaborative technologies useful for enhancing the organization and efficiency of educational research. Visualization tools (Cacoo) assist researchers in identifying a research problem. Resource storage tools (Delicious, Mendeley, EasyBib)…

  17. Analytical Implications of Using Practice Theory in Workplace Information Literacy Research

    ERIC Educational Resources Information Center

    Moring, Camilla; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace…

  18. Volume, Variety and Veracity of Big Data Analytics in NASA's Giovanni Tool

    NASA Astrophysics Data System (ADS)

    Lynnes, C.; Hegde, M.; Smit, C.; Pan, J.; Bryant, K.; Chidambaram, C.; Zhao, P.

    2013-12-01

    Earth Observation data have posed challenges to NASA users ever since the launch of several satellites around the turn of the century, generating volumes now measured in petabytes, a volume growth further increased by models assimilating the satellite data. One important approach to bringing Big Data Analytic capabilities to bear on the Volume of data has been the provision of server-side analysis capabilities. For instance, the Geospatial Interactive Online Visualization ANd aNalysis (Giovanni) tool provides a web interface to large volumes of gridded data from several EOSDIS data centers. Giovanni's main objective is to allow the user to explore its data holdings using various forms of visualization and data summarization or aggregation algorithms, thus allowing the user to examine statistics and pictures for the overall data while eventually acquiring only the most useful data. Thus, much of the preprocessing and data reduction can take place on the server, delivering manageable quantities of information to the user. In addition to Volume, Giovanni uses open standards to tackle the Variety aspect of Big Data, incorporating data stored in several formats, from several data centers, and making them available in a uniform data format and structure to both the Giovanni algorithms and the end user. The Veracity aspect of Big Data, perhaps the stickiest of wickets, is enhanced through features that enable reproducibility (provenance and URL-driven workflows), and by a Help Desk staffed by scientists with expertise in the science data.
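The server-side data reduction described above, summarizing a large gridded field before delivery, can be illustrated with a minimal sketch. The data here are synthetic, and Giovanni's actual algorithms are far more elaborate; this only shows the shape of the computation:

```python
import numpy as np

def area_weighted_mean(data, lats):
    """Reduce a (time, lat, lon) gridded field to a time series using
    cosine-of-latitude area weights, the kind of server-side data
    reduction a service like Giovanni performs before delivery."""
    w = np.cos(np.deg2rad(lats))[None, :, None]   # weight per latitude band
    w_full = np.broadcast_to(w, data.shape)
    return (data * w_full).sum(axis=(1, 2)) / w_full.sum(axis=(1, 2))

# Synthetic field: 4 time steps on a coarse 3x5 lat-lon grid
lats = np.array([-45.0, 0.0, 45.0])
data = np.full((4, 3, 5), 2.0)
ts = area_weighted_mean(data, lats)   # constant field -> constant series
```

The user receives the 4-element series instead of the full 60-value grid, which is the point of performing the aggregation server-side.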

  19. Common plants as alternative analytical tools to monitor heavy metals in soil

    PubMed Central

    2012-01-01

    Background: Herbaceous plants are common vegetal species generally exposed, for a limited period of time, to bioavailable environmental pollutants. Heavy metal contamination is the most common form of environmental pollution. Herbaceous plants have never been used as natural bioindicators of environmental pollution, in particular to monitor the amount of heavy metals in soil. In this study, we aimed to assess the usefulness of three herbaceous plants (Plantago major L., Taraxacum officinale L. and Urtica dioica L.) and one leguminous species (Trifolium pratense L.) as alternative indicators to evaluate soil pollution by heavy metals. Results: We employed Inductively Coupled Plasma Atomic Emission Spectroscopy (ICP-AES) to assess the concentration of selected heavy metals (Cu, Zn, Mn, Pb, Cr and Pd) in soil and plants, and statistical analyses to describe the linear correlation between the accumulation of some heavy metals and the selected vegetal species. We found that the leaves of Taraxacum officinale L. and Trifolium pratense L. can accumulate Cu in a linearly dependent manner, with Urtica dioica L. representing the vegetal species accumulating the highest fraction of Pb. Conclusions: In this study we demonstrated that common plants can be used as an alternative analytical tool for monitoring selected heavy metals in soil. PMID:22594441
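The linear soil-to-plant accumulation the study tests can be sketched as a simple least-squares fit. The paired concentrations below are hypothetical, for illustration only, not the paper's measurements:

```python
import numpy as np

# Hypothetical paired Cu concentrations (mg/kg): soil vs. leaves of an
# accumulator species; illustrative only, not the paper's data.
soil = np.array([12.0, 25.0, 40.0, 55.0, 80.0])
leaf = np.array([ 5.1, 10.2, 16.8, 22.9, 33.5])

slope, intercept = np.polyfit(soil, leaf, 1)   # leaf ~ slope*soil + intercept
r = np.corrcoef(soil, leaf)[0, 1]              # Pearson correlation
# A correlation near 1 supports using the plant as a proxy for soil load.
```

A high, stable correlation of this kind is what would justify reading soil contamination off the plant tissue alone.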

  20. Sugar Maple Pigments Through the Fall and the Role of Anthocyanin as an Analytical Tool

    NASA Astrophysics Data System (ADS)

    Lindgren, E.; Rock, B.; Middleton, E.; Aber, J.

    2008-12-01

    Sugar maple habitat is projected to almost disappear in future climate scenarios. In fact, many institutions state that these trees are already in decline. Being able to detect sugar maple health could therefore prove to be a useful analytical tool for monitoring changes in phenology. Anthocyanin, a red pigment found in sugar maples, is thought to be a universal indicator of plant stress. It is very prominent in the spring during the first flush of leaves, as well as in the fall as leaves senesce. Determining an anthocyanin index that could be used with satellite systems will provide a greater understanding of tree phenology and the distribution of plant stress, both over large areas and over time. The utilization of anthocyanin for one of its functions, the prevention of oxidative stress, may fluctuate in response to changing climatic conditions that occur during senescence or vary from year to year. By monitoring changes in pigment levels and antioxidant capacity through the fall, one may be able to draw conclusions about the ability to detect anthocyanin remotely from space-based systems, and possibly determine a more specific function for anthocyanin during fall senescence. These results could then be applied to track changes in tree stress.

  1. Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.

    PubMed

    Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif

    2014-12-01

    The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low-vacuum SEM. It provides multiscale and multimodal analyses such as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy), as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in a SEM can be influenced by several SEM-related phenomena, this paper firstly presents a comparison of this new tool with a conventional micro-Raman spectrometer. Then, some possible artefacts are documented, which are due to the impact of electron beam-induced contamination or cathodoluminescence contribution to the Raman spectra, especially with geological samples. These effects are easily overcome by changing or adapting the Raman spectrometer and the SEM settings and methodology. The adverse effect of cathodoluminescence is eliminated by using an SEM beam shutter during Raman acquisition. In contrast, this interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In a second part, this study highlights the interest and efficiency of the coupling in characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in geosciences. PMID:25016590

  2. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    PubMed

    Offroy, Marc; Duponchel, Ludovic

    2016-03-01

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of the acquired data is significantly different. Indeed, in every area of science, data take the form of ever-larger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and we do not necessarily know which coordinates are the interesting ones. Big data in our labs of biology, analytical chemistry or physical chemistry is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and exploit such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (with high noise level, with/without spectral preprocessing, with wavelength shift, with different spectral resolution, with missing data). PMID:26873463
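One core TDA construction, the Mapper algorithm, can be sketched in a few dozen lines: cover the range of a filter function with overlapping intervals, cluster the points falling in each interval, and connect clusters that share points. This toy version (filter = first coordinate, naive greedy clustering) is only a sketch of the idea, not a production TDA implementation:

```python
import numpy as np

def mapper_1d(points, n_intervals=4, overlap=0.5, eps=0.6):
    """Toy Mapper: filter = first coordinate, overlapping interval cover,
    naive greedy clustering inside each preimage, and an edge between
    any two clusters that share points."""
    f = points[:, 0]
    lo, hi = f.min(), f.max()
    width = (hi - lo) / n_intervals
    step = width * (1 - overlap)            # shift between successive intervals
    nodes, edges = [], set()
    start = lo
    while start < hi:
        idx = np.where((f >= start) & (f <= start + width))[0]
        clusters = []                       # greedy grouping within the preimage
        for i in idx:
            for c in clusters:
                if any(np.linalg.norm(points[i] - points[j]) <= eps for j in c):
                    c.add(i)
                    break
            else:
                clusters.append({i})
        for c in clusters:                  # connect to earlier overlapping clusters
            for k, prev in enumerate(nodes):
                if c & prev:
                    edges.add((k, len(nodes)))
            nodes.append(c)
        start += step
    return nodes, edges

# Two well-separated groups on a line: the graph should keep them apart
pts = np.array([[0.0, 0.0], [0.1, 0.0], [0.2, 0.0], [5.0, 0.0], [5.1, 0.0]])
nodes, edges = mapper_1d(pts)
```

The resulting node-edge graph is a low-dimensional summary of the data's shape, which is what makes the approach attractive for large spectroscopic tables.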

  3. Effect-Based Screening Methods for Water Quality Characterization Will Augment Conventional Analyte-by-Analyte Chemical Methods in Research As Well As Regulatory Monitoring

    EPA Science Inventory

    Conventional approaches to water quality characterization can provide data on individual chemical components of each water sample. This analyte-by-analyte approach currently serves many useful research and compliance monitoring needs. However, these approaches, which require a ...

  4. Innovations in scholarly communication - global survey on research tool usage

    PubMed Central

    Kramer, Bianca; Bosman, Jeroen

    2016-01-01

    Many new websites and online tools have come into existence to support scholarly communication in all phases of the research workflow. To what extent researchers are using these and more traditional tools has been largely unknown. This 2015-2016 survey aimed to fill that gap. Its results may help decision making by stakeholders supporting researchers and may also help researchers wishing to reflect on their own online workflows. In addition, information on tools usage can inform studies of changing research workflows. The online survey employed an open, non-probability sample. A largely self-selected group of 20663 researchers, librarians, editors, publishers and other groups involved in research took the survey, which was available in seven languages. The survey was open from May 10, 2015 to February 10, 2016. It captured information on tool usage for 17 research activities, stance towards open access and open science, and expectations of the most important development in scholarly communication. Respondents’ demographics included research roles, country of affiliation, research discipline and year of first publication. PMID:27429740

  5. Innovations in scholarly communication - global survey on research tool usage.

    PubMed

    Kramer, Bianca; Bosman, Jeroen

    2016-01-01

    Many new websites and online tools have come into existence to support scholarly communication in all phases of the research workflow. To what extent researchers are using these and more traditional tools has been largely unknown. This 2015-2016 survey aimed to fill that gap. Its results may help decision making by stakeholders supporting researchers and may also help researchers wishing to reflect on their own online workflows. In addition, information on tools usage can inform studies of changing research workflows. The online survey employed an open, non-probability sample. A largely self-selected group of 20663 researchers, librarians, editors, publishers and other groups involved in research took the survey, which was available in seven languages. The survey was open from May 10, 2015 to February 10, 2016. It captured information on tool usage for 17 research activities, stance towards open access and open science, and expectations of the most important development in scholarly communication. Respondents' demographics included research roles, country of affiliation, research discipline and year of first publication. PMID:27429740

  6. Experimental and Analytical Research on Fracture Processes in Rock

    SciTech Connect

    Herbert H. Einstein; Jay Miller; Bruno Silva

    2009-02-27

    Experimental studies on fracture propagation and coalescence were conducted which, together with previous tests by this group on gypsum and marble, provide information on fracturing. Specifically, different fracture geometries were tested, which together with the different material properties will provide the basis for analytical/numerical modeling. Initial steps on the models were made, as were initial investigations on the effect of pressurized water on fracture coalescence.

  7. Nano-Scale Secondary Ion Mass Spectrometry - A new analytical tool in biogeochemistry and soil ecology

    SciTech Connect

    Herrmann, A M; Ritz, K; Nunan, N; Clode, P L; Pett-Ridge, J; Kilburn, M R; Murphy, D V; O'Donnell, A G; Stockdale, E A

    2006-10-18

    Soils are structurally heterogeneous across a wide range of spatio-temporal scales. Consequently, external environmental conditions do not have a uniform effect throughout the soil, resulting in a large diversity of micro-habitats. It has been suggested that soil function can be studied without explicit consideration of such fine detail, but recent research has indicated that the micro-scale distribution of organisms may be of importance for a mechanistic understanding of many soil functions. Due to a lack of techniques with adequate sensitivity for data collection at appropriate scales, the question 'How important are various soil processes acting at different scales for ecological function?' is challenging to answer. The nano-scale secondary ion mass spectrometer (NanoSIMS) represents the latest generation of ion microprobes which link high-resolution microscopy with isotopic analysis. The main advantage of NanoSIMS over other secondary ion mass spectrometers is the ability to operate at high mass resolution, whilst maintaining both excellent signal transmission and spatial resolution (~50 nm). NanoSIMS has been used previously in studies focusing on presolar materials from meteorites, in material science, biology, geology and mineralogy. Recently, the potential of NanoSIMS as a new tool in the study of biophysical interfaces in soils has been demonstrated. This paper describes the principles of NanoSIMS and discusses the potential of this tool to contribute to the field of biogeochemistry and soil ecology. Practical considerations (sample size and preparation, simultaneous collection of isotopes, mass resolution, isobaric interference and quantification of the isotopes of interest) are discussed. Adequate sample preparation, to avoid biases in the interpretation of NanoSIMS data due to artifacts, and the identification of regions of interest are the main concerns in using NanoSIMS as a new tool in biogeochemistry and soil ecology.
Finally, we review the areas of

  8. Basic Conceptual Systems (BCSs)--Tools for Analytic Coding, Thinking and Learning: A Concept Teaching Curriculum in Norway

    ERIC Educational Resources Information Center

    Hansen, Andreas

    2009-01-01

    The role of basic conceptual systems (for example, colour, shape, size, position, direction, number, pattern, etc.) as psychological tools for analytic coding, thinking, learning is emphasised, and a proposal for a teaching order of BCSs in kindergarten and primary school is introduced. The first part of this article explains briefly main aspects…

  9. The use of metacognitive tools in a multidimensional research program

    NASA Astrophysics Data System (ADS)

    Iuli, Richard John

    Metacognition may be thought of as "cognition about cognition", or "thinking about thinking." A number of strategies and tools have been developed to help individuals understand the nature of knowledge, and to enhance their "thinking about thinking." Two metacognitive tools, concept maps and Gowin's Vee, were first developed for use in educational research. Subsequently, they were used successfully to help learners "learn how to learn." The success of metacognitive tools in educational settings suggests that they may help scientists understand the nature of knowledge production and organization, thereby facilitating their research activities and enhancing their understanding of the events and objects they study. In September 1993 I began an ethnographic, naturalistic study of the United States Department of Agriculture - Agricultural Research Service - Rhizobotany Project at Cornell University in Ithaca, NY. I spent the next two and one-half years as a participant observer with the Project. The focus of my research was to examine the application of metacognitive tools to an academic research setting. The knowledge claims that emerged from my research were: (1) Individual researchers tended to have narrow views of the Rhizobotany Project that centered on their individual areas of research; (2) The researchers worked in "conceptual isolation", failing to see the connections and interrelatedness of their own work with the work of the others; (3) For those researchers who constructed concept maps and Vee diagrams, these heuristics helped them to build a deeper conceptual understanding of their own work; and (4) Half of the members of the research team did not find concept mapping and Vee diagramming useful. Their reluctance to use these tools was interpreted as an indication of epistemological confusion. The prevalence of conceptual isolation and epistemological confusion among members of the Rhizobotany Project parallels the results of previous studies that have

  10. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    PubMed

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects for health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most commonly employed nonstarch carbohydrates added or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most widely employed analytical techniques for the characterization of those molecules. Mass spectrometry is also of great help, in particular matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and

  11. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    PubMed

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis. PMID:27391182

  12. Building a Better Bibliography: Computer-Aided Research Tools.

    ERIC Educational Resources Information Center

    Bloomfield, Elizabeth

    1989-01-01

    Describes a project at the University of Guelph (Ontario) that combined both bibliographical and archival references in one large machine readable database to facilitate local history research. The description covers research tool creation, planning activities, system design, the database management system used, material selection, record…

  13. The WWW Cabinet of Curiosities: A Serendipitous Research Tool

    ERIC Educational Resources Information Center

    Arnold, Josie

    2012-01-01

    This paper proposes that the WWW is able to be fruitfully understood as a research tool when we utilise the metaphor of the cabinet of curiosities, the wunderkammer. It unpeels some of the research attributes of the metaphor as it reveals the multiplicity of connectivity on the web that provides serendipitous interactions between unexpected…

  14. Spec Tool; an online education and research resource

    NASA Astrophysics Data System (ADS)

    Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.

    2016-06-01

    Education and public outreach (EPO) activities related to remote sensing, space, planetary and geo-physics sciences have been developed widely in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geo-scientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. As suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better, 'hands-on' understanding of the theory of spectroscopy and imaging spectroscopy. The tool is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. It enables users to visualize spectral signatures from the USGS spectral library and additional spectra collected in the EPIF, such as spectra of dunes in southern Israel and from Turkmenistan. For researchers and educators, the tool also allows locally collected samples to be loaded for further analysis.
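The two analysis algorithms named above, spectral angle mapping and linear unmixing, are standard and easy to sketch. The endmember spectra here are made up for illustration, not taken from the EPIF tool or the USGS library:

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral Angle Mapper: angle (radians) between two spectra;
    insensitive to overall brightness scaling."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def linear_unmix(spectrum, endmembers):
    """Unconstrained linear unmixing: least-squares abundances x
    such that the endmember columns combine to approximate the spectrum."""
    A = np.asarray(endmembers, float).T          # columns = endmember spectra
    x, *_ = np.linalg.lstsq(A, np.asarray(spectrum, float), rcond=None)
    return x

# Made-up endmember spectra over four bands (illustrative only)
e1 = np.array([0.1, 0.4, 0.8, 0.9])   # e.g. vegetation-like
e2 = np.array([0.7, 0.6, 0.3, 0.2])   # e.g. sand-like
mix = 0.6 * e1 + 0.4 * e2             # synthetic mixed pixel
abundances = linear_unmix(mix, [e1, e2])   # recovers ~[0.6, 0.4]
```

Because the spectral angle ignores brightness, it matches a target spectrum against library spectra regardless of illumination, which is why it is a common first tool in teaching imaging spectroscopy.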

  15. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining a connection to, and communicating, the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated from sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap using the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region.
Finally, we will share our approach for implementation of data reduction and topology generation
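A minimal sketch of the variable-grid idea is to bin scattered samples into cells and report per-cell sample count and variance as simple uncertainty proxies. The actual VGM and its Hadoop implementation are considerably richer; this only illustrates the principle on synthetic points:

```python
import numpy as np

def grid_uncertainty(x, y, z, nx=2, ny=2):
    """Bin scattered samples (x, y, z) onto an nx-by-ny grid and return
    per-cell mean, variance, and sample count; low counts and high
    variance flag cells whose interpolated values are least trustworthy."""
    xe = np.linspace(np.min(x), np.max(x), nx + 1)
    ye = np.linspace(np.min(y), np.max(y), ny + 1)
    ix = np.clip(np.digitize(x, xe) - 1, 0, nx - 1)
    iy = np.clip(np.digitize(y, ye) - 1, 0, ny - 1)
    mean = np.full((nx, ny), np.nan)
    var = np.full((nx, ny), np.nan)
    count = np.zeros((nx, ny), dtype=int)
    for i in range(nx):
        for j in range(ny):
            zc = z[(ix == i) & (iy == j)]
            count[i, j] = zc.size
            if zc.size:
                mean[i, j] = zc.mean()
                var[i, j] = zc.var()
    return mean, var, count

# Synthetic scattered samples: a dense corner and a single far point
x = np.array([0.1, 0.2, 0.9])
y = np.array([0.1, 0.2, 0.9])
z = np.array([1.0, 3.0, 5.0])
mean, var, count = grid_uncertainty(x, y, z)
```

Cells with no samples stay NaN, making explicit where an interpolated map would be pure extrapolation.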

  16. Finding Collaborators: Toward Interactive Discovery Tools for Research Network Systems

    PubMed Central

    Schleyer, Titus K; Becich, Michael J; Hochheiser, Harry

    2014-01-01

    Background: Research networking systems hold great promise for helping biomedical scientists identify collaborators with the expertise needed to build interdisciplinary teams. Although efforts to date have focused primarily on collecting and aggregating information, less attention has been paid to the design of end-user tools for using these collections to identify collaborators. To be effective, collaborator search tools must provide researchers with easy access to information relevant to their collaboration needs. Objective: The aim was to study user requirements and preferences for research networking system collaborator search tools and to design and evaluate a functional prototype. Methods: Paper prototypes exploring possible interface designs were presented to 18 participants in semistructured interviews aimed at eliciting collaborator search needs. Interview data were coded and analyzed to identify recurrent themes and related software requirements. Analysis results and elements from paper prototypes were used to design a Web-based prototype using the D3 JavaScript library and VIVO data. Preliminary usability studies asked 20 participants to use the tool and to provide feedback through semistructured interviews and completion of the System Usability Scale (SUS). Results: Initial interviews identified consensus regarding several novel requirements for collaborator search tools, including chronological display of publication and research funding information, the need for conjunctive keyword searches, and tools for tracking candidate collaborators. Participant responses were positive (SUS score: mean 76.4%, SD 13.9). Opportunities for improving the interface design were identified. Conclusions: Interactive, timeline-based displays that support comparison of researcher productivity in funding and publication have the potential to effectively support searching for collaborators. Further refinement and longitudinal studies may be needed to better understand the

  17. Antifungal proteins from moulds: analytical tools and potential application to dry-ripened foods.

    PubMed

    Delgado, Josué; Owens, Rebecca A; Doyle, Sean; Asensio, Miguel A; Núñez, Félix

    2016-08-01

    Moulds growing on the surface of dry-ripened foods contribute to their sensory qualities, but some of them are able to produce mycotoxins that pose a hazard to consumers. Small cysteine-rich antifungal proteins (AFPs) from moulds are highly stable to pH and proteolysis and exhibit a broad inhibition spectrum against filamentous fungi, providing new chances to control hazardous moulds in fermented foods. The analytical tools for characterizing the cellular targets and affected pathways are reviewed. Strategies currently employed to study these mechanisms of action include 'omics' approaches that have come to the forefront in recent years, developing in tandem with genome sequencing of relevant organisms. These techniques contribute to a better understanding of the response of moulds against AFPs, allowing the design of complementary strategies to maximize the effectiveness of AFPs on foods or overcome their limitations. AFPs alter chitin biosynthesis, and some fungi react by inducing the cell wall integrity (CWI) pathway. However, moulds able to increase chitin content at the cell wall by increasing proteins in either CWI or calmodulin-calcineurin signalling pathways will resist AFPs. Similarly, AFPs increase the intracellular levels of reactive oxygen species (ROS), and moulds increasing G-protein complex β subunit CpcB and/or enzymes to efficiently produce glutathione may evade apoptosis. Unknown aspects that need to be addressed include the interaction with mycotoxin production by less sensitive toxigenic moulds. However, significant steps have been taken to encourage the use of AFPs in intermediate-moisture foods, particularly for mould-ripened cheese and meat products. PMID:27394712

  18. A factor analytic study of midwives' attitudes to research.

    PubMed

    Hicks, C

    1995-03-01

    Midwives are increasingly being encouraged to undertake research activities at various levels as a routine part of their job. However, despite top-down directives to this end, there is some evidence that relatively little research is published, and reasons such as lack of time, confidence, and skill have been put forward to explain the shortfall. While these reasons may be valid obstacles, it is also conceivable that they are manifestations of a set of underlying attitudes to research in midwifery. If attitudes are assumed to be predictors of behaviour, then it may be relevant to study midwives' attitudes to research more closely in order to identify whether these are responsible in some measure for the shortfall in research output. To this end, a national survey of 397 midwives was undertaken to establish their attitudes to research. The results were subjected to factor analysis, using an orthogonal solution, in order to establish whether any coherent source components existed in the sample's attitude responses. The factor analysis yielded four coherent factors: (i) other health care professionals' views of the value of midwifery research; (ii) the value of research for midwifery practice; (iii) the research role of the midwife; and (iv) midwives' competence to carry out research. On further analysis, the first two factors were also found to be significantly related to midwives' likelihood of undertaking research and publishing research findings. These factors could form the basis for future attitude change and for staff development and training programmes as a means by which midwifery research output could be increased. PMID:7731371
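The orthogonal factor solution described above can be sketched with scikit-learn's factor analysis and a varimax rotation. The survey items below are synthetic stand-ins, not the study's instrument:

```python
# Sketch of an orthogonal (varimax-rotated) factor analysis, assuming
# scikit-learn >= 0.24; two latent "attitude" factors drive two item blocks.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items = 397, 12

# Two latent attitude dimensions, each driving a block of items:
latent = rng.normal(size=(n_respondents, 2))
loadings_true = np.zeros((2, n_items))
loadings_true[0, :6] = 0.9   # items 1-6 load on factor 1
loadings_true[1, 6:] = 0.9   # items 7-12 load on factor 2
X = latent @ loadings_true + rng.normal(scale=0.5, size=(n_respondents, n_items))

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
print(fa.components_.shape)  # rotated loadings: one row per factor
```

Inspecting `fa.components_` row by row recovers which items cluster on which factor, the step that yields interpretable factors like those numbered (i)-(iv) above.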

  19. Dataset-Driven Research to Support Learning and Knowledge Analytics

    ERIC Educational Resources Information Center

    Verbert, Katrien; Manouselis, Nikos; Drachsler, Hendrik; Duval, Erik

    2012-01-01

    In various research areas, the availability of open datasets is considered as key for research and application purposes. These datasets are used as benchmarks to develop new algorithms and to compare them to other algorithms in given settings. Finding such available datasets for experimentation can be a challenging task in technology enhanced…

  20. An Analytical Approach to Research on Instructional Methods.

    ERIC Educational Resources Information Center

    Gage, N. L.

    The approach used at Stanford University for research on teaching is discussed, and the author explains the concepts of "technical skills," "microteaching," and "microcriteria" that were the basis of the development of this approach to research and of Stanford's secondary-teacher education program. The author presents a basic distinction between…

  1. Meta-Analytic Research on the Outcomes of Outdoor Education.

    ERIC Educational Resources Information Center

    Neill, James T.

    This paper compares and summarizes empirical research on the outcomes of outdoor education (OE) and related programs. Most frequently, OE outcomes have been researched using post-program surveys of staff and participant attitudes. Such reports are vulnerable to many potential distortions. A second major approach to examining OE effectiveness…

  2. Participant-Centric Initiatives: Tools to Facilitate Engagement In Research

    PubMed Central

    Anderson, Nicholas; Bragg, Caleb; Hartzler, Andrea; Edwards, Kelly

    2014-01-01

    Clinical genomic research faces increasing challenges in establishing participant privacy and consent processes that facilitate meaningful choice and communication capacity for longitudinal and secondary research uses. There are an evolving range of participant-centric initiatives that combine web-based informatics tools with new models of engagement and research collaboration. These emerging initiatives may become valuable approaches to support large-scale and longitudinal research studies. We highlight and discuss four types of emerging initiatives for engaging and sustaining participation in research. PMID:24772384

  3. Identifying and Tracing Persistent Identifiers of Research Resources: Automation, Metrics and Analytics

    NASA Astrophysics Data System (ADS)

    Maull, K. E.; Hart, D.; Mayernik, M. S.

    2015-12-01

    Formal and informal citations and acknowledgements of research infrastructures, such as data collections, software packages, and facilities, are an increasingly important form of attribution in the scholarly literature. While such citations link research products, even if informally, to their origins, they are often made inconsistently, which makes them hard to analyze. While significant progress has been made in the past few years in developing recommendations, policies, and procedures for creating and promoting citable identifiers, progress has been mixed in tracking how data sets and other digital infrastructures are actually identified and cited in the literature. Understanding the full extent and value of research infrastructures through the lens of scholarly literature requires significant resources, and thus, we argue, must rely on automated approaches that mine and track persistent identifiers to scientific resources. Such automated approaches, however, face a number of unique challenges, from the inconsistent and informal referencing practices of authors, to unavailable, embargoed, or hard-to-obtain full-text resources for text analytics, to inconsistent and capricious impact metrics. This presentation will discuss work to develop and evaluate tools for automating the tracing of research resource identification and referencing in the research literature via persistent citable identifiers. Despite the impediments, automated processes are of considerable importance in enabling these traceability efforts to scale, as the number of identifiers being created for unique scientific resources continues to grow rapidly. Such efforts, if successful, should improve the ability to answer meaningful questions about research resources as they continue to grow as a target of advanced analyses in research metrics.
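The simplest building block of the automated identifier mining described above is a pattern pass that pulls DOIs out of free text; real pipelines must also handle informal, non-DOI references. A minimal sketch with illustrative (invented) example text:

```python
# Extract DOI-form persistent identifiers from free text with a regex.
import re

DOI_PATTERN = re.compile(r'\b10\.\d{4,9}/[-._;()/:A-Za-z0-9]+')

def extract_dois(text):
    # Strip trailing punctuation that sentence context attaches to a DOI.
    return [m.rstrip('.,;)') for m in DOI_PATTERN.findall(text)]

text = ("Data are archived at doi:10.5065/D6WD3XH5. See also "
        "https://doi.org/10.1175/JCLI-D-12-00823.1, cited informally.")
print(extract_dois(text))  # -> ['10.5065/D6WD3XH5', '10.1175/JCLI-D-12-00823.1']
```

Counting and deduplicating such matches across a full-text corpus is what turns scattered references into the usage metrics the abstract discusses.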

  4. Non-invasive tools for measuring metabolism and biophysical analyte transport: self-referencing physiological sensing.

    PubMed

    McLamore, Eric S; Porterfield, D Marshall

    2011-11-01

    Biophysical phenomena related to cellular biochemistry and transport are spatially and temporally dynamic, and are directly involved in the regulation of physiology at the sub-cellular to tissue spatial scale. Real time monitoring of transmembrane transport provides information about the physiology and viability of cells, tissues, and organisms. Combining information learned from real time transport studies with genomics and proteomics allows us to better understand the functional and mechanistic aspects of cellular and sub-cellular systems. To accomplish this, ultrasensitive sensing technologies are required to probe this functional realm of biological systems with high temporal and spatial resolution. In addition to ongoing research aimed at developing new and enhanced sensors (e.g., increased sensitivity, enhanced analyte selectivity, reduced response time, and novel microfabrication approaches), work over the last few decades has advanced sensor utility through new sensing modalities that extend and enhance the data recorded by sensors. A microsensor technique based on phase sensitive detection of real time biophysical transport is reviewed here. The self-referencing technique converts non-invasive extracellular concentration sensors into dynamic flux sensors for measuring transport from the membrane to the tissue scale. In this tutorial review, we discuss the use of self-referencing micro/nanosensors for measuring the physiological activity of living cells and tissues in agricultural, environmental, and biomedical applications, in a manner comprehensible to any scientist or engineer. PMID:21761069

  5. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    ERIC Educational Resources Information Center

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  6. Research Tools Available at the Community Coordinated Modeling Center

    NASA Astrophysics Data System (ADS)

    Berrios, D. H.; Maddox, M.; Rastaetter, L.; Chulaki, A.; Hesse, M.

    2007-12-01

    The Community Coordinated Modeling Center (CCMC), located at NASA Goddard Space Flight Center, provides access to state-of-the-art space weather models to the research community. The majority of the models residing at the CCMC are comprehensive computationally intensive physics-based models. The CCMC also provides free services and tools to assist the research community in analyzing the results from the space weather model simulations. We present an overview of the available services at the CCMC: the Runs-On-Request system, the online visualizations, the Kameleon access and interpolation library, and the CCMC Space Weather Widget. Finally, we discuss the future services and tools in development.

  7. Research for research: tools for knowledge discovery and visualization.

    PubMed Central

    Van Mulligen, Erik M.; Van Der Eijk, Christiaan; Kors, Jan A.; Schijvenaars, Bob J. A.; Mons, Barend

    2002-01-01

    This paper describes a method to construct from a set of documents a spatial representation that can be used for information retrieval and knowledge discovery. The proposed method has been implemented in a prototype system and allows the researcher to browse interactively and in real-time a network of relationships obtained from a set of full text articles. These relationships are combined with the potential relationships between concepts as defined in the UMLS semantic network. The browser allows the user to select a seed term and find all related concepts, to find a path between concepts (hypothesis testing), and to retrieve the references to documents or database entries that support the relationship between concepts. PMID:12463942
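The "find a path between concepts" operation described above is, at its core, a shortest-path search over a concept relationship network. A toy sketch with an invented co-occurrence graph (not the UMLS network itself):

```python
# Breadth-first search for one shortest chain of concepts linking a seed
# term to a target term in a small, hand-made relationship graph.
from collections import deque

graph = {
    "aspirin": ["cyclooxygenase", "platelet aggregation"],
    "cyclooxygenase": ["aspirin", "prostaglandin"],
    "prostaglandin": ["cyclooxygenase", "inflammation"],
    "platelet aggregation": ["aspirin", "thrombosis"],
    "inflammation": ["prostaglandin"],
    "thrombosis": ["platelet aggregation"],
}

def concept_path(start, goal):
    """Return one shortest chain of concepts from start to goal, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(concept_path("aspirin", "inflammation"))
# -> ['aspirin', 'cyclooxygenase', 'prostaglandin', 'inflammation']
```

In a hypothesis-testing tool like the one described, each edge on the returned path would additionally carry pointers back to the documents that support that relationship.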

  8. Analytic solution to leading order coupled DGLAP evolution equations: A new perturbative QCD tool

    NASA Astrophysics Data System (ADS)

    Block, Martin M.; Durand, Loyal; Ha, Phuoc; McKay, Douglas W.

    2011-03-01

    We have analytically solved the LO perturbative QCD singlet DGLAP equations [V. N. Gribov and L. N. Lipatov, Sov. J. Nucl. Phys. 15, 438 (1972); G. Altarelli and G. Parisi, Nucl. Phys. B126, 298 (1977); Y. L. Dokshitzer, Sov. Phys. JETP 46, 641 (1977)] using Laplace transform techniques. Newly developed, highly accurate numerical inverse Laplace transform algorithms [M. M. Block, Eur. Phys. J. C 65, 1 (2010), doi:10.1140/epjc/s10052-009-1195-8; M. M. Block, Eur. Phys. J. C 68, 683 (2010), doi:10.1140/epjc/s10052-010-1374-7] allow us to write fully decoupled solutions for the singlet structure function Fs(x,Q^2) and the gluon distribution G(x,Q^2) as Fs(x,Q^2) = Fs(Fs0(x0), G0(x0)) and G(x,Q^2) = G(Fs0(x0), G0(x0)), where the x0 are the Bjorken x values at Q0^2. Here Fs and G are known functions, found using LO DGLAP splitting functions, of the initial boundary conditions Fs0(x) ≡ Fs(x,Q0^2) and G0(x) ≡ G(x,Q0^2), i.e., the chosen starting functions at the virtuality Q0^2. For both G(x) and Fs(x), we are able to either devolve or evolve each separately and rapidly, with very high numerical accuracy: a computational fractional precision of O(10^-9). Armed with this powerful new tool in the perturbative QCD arsenal, we compare our numerical results from the above equations with the published MSTW2008 and CTEQ6L LO gluon and singlet Fs distributions [A. D. Martin, W. J. Stirling, R. S. Thorne, and G. Watt, Eur. Phys. J. C 63, 189 (2009), doi:10.1140/epjc/s10052-009-1072-5], starting from their initial values at Q0^2 = 1 GeV^2 and 1.69 GeV^2, respectively, using their choice of alpha_s(Q^2). This allows an important independent check on the accuracies of their evolution codes and, therefore, the computational accuracies of their published parton distributions. Our method completely decouples the two LO distributions, at the same time guaranteeing that both G and Fs satisfy the singlet coupled DGLAP equations. It also allows one to easily obtain the effects of
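The decoupling strategy the abstract describes can be sketched schematically in standard LO form (textbook notation, not necessarily the authors' exact conventions). The coupled singlet equations are

```latex
\frac{\partial F_s(x,Q^2)}{\partial \ln Q^2}
  = \frac{\alpha_s(Q^2)}{2\pi}
    \Bigl[\, P_{qq}\otimes F_s + 2 n_f\, P_{qg}\otimes G \,\Bigr],
\qquad
\frac{\partial G(x,Q^2)}{\partial \ln Q^2}
  = \frac{\alpha_s(Q^2)}{2\pi}
    \Bigl[\, P_{gq}\otimes F_s + P_{gg}\otimes G \,\Bigr],
\qquad
(P\otimes f)(x) \equiv \int_x^1 \frac{dz}{z}\, P(z)\, f\!\left(\frac{x}{z},Q^2\right).
```

Substituting v = ln(1/x) and Laplace-transforming in v turns each convolution into an ordinary product, so the pair becomes two coupled linear first-order ODEs in ln Q^2 that can be solved in closed form in Laplace space; numerically inverting the transform then yields the decoupled x-space solutions quoted above.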

  9. International News Communication Research: A Meta-Analytic Assessment.

    ERIC Educational Resources Information Center

    Tsang, Kuo-jen

    A survey of "Journalism Quarterly," "Gazette," "Public Opinion Quarterly," "Journal of Broadcasting," and "Journal of Communication" reveals that the early research on international news flow or coverage emphasized two aspects of news: (1) how the United States was portrayed in the media of other nations, and (2) what the effect of American society…

  10. Trends in Behavior-Analytic Gambling Research and Treatment.

    PubMed

    Dixon, Mark R; Whiting, Seth W; Gunnarsson, Karl F; Daar, Jacob H; Rowsey, Kyle E

    2015-10-01

    The purpose of the present review was to analyze research outcomes for all gambling studies reported in the behavior analysis literature. We used the search term "gambling" to identify articles that were published in behaviorally oriented journals between the years 1992 and 2012 and categorized the content of each article as empirical or conceptual. Next, we examined and categorized the empirical articles by inclusion of an experimental manipulation and treatment to alleviate at least some aspect of pathological gambling, participant population used, type of gambling task employed in the research, whether the participants in the study actually gambled, and the behavioral phenomena of interest. The results show that the rate of publication of gambling research has increased in the last 6 years, and a vast majority of articles are empirical. Of the empirical articles, examinations of treatment techniques or methods are scarce; slot machine play is the most represented form of gambling, and slightly greater than half of the research included compensation based on gambling outcomes within experiments. We discuss implications and future directions based on these observations of the published literature. PMID:27606170

  11. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    PubMed Central

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from “data-to-knowledge-to-innovation,” a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®—the human gene database; the MalaCards—the human diseases database; and the PathCards—the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®—the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene–tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell “cards” in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics

  12. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards® (the human gene database), the MalaCards (the human diseases database), and the PathCards (the biological pathways database). Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery® (the embryonic development and stem cells database), which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics
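GeneAnalytics' own scoring algorithm is proprietary, but the class of gene set enrichment tools it belongs to typically rests on a hypergeometric tail test: how surprising is the observed overlap between a query gene list and an annotated gene set? A minimal, generic sketch with made-up numbers:

```python
# Generic gene-set enrichment p-value (hypergeometric tail test); this is
# the textbook statistic for tools of this class, not GeneAnalytics' own
# proprietary scoring.
from math import comb

def enrichment_p(universe, set_size, query_size, overlap):
    """P(X >= overlap) for X ~ Hypergeom(universe, set_size, query_size):
    the chance of seeing at least `overlap` pathway genes in a random
    query of `query_size` genes drawn from the universe."""
    denom = comb(universe, query_size)
    upper = min(set_size, query_size)
    return sum(comb(set_size, k) * comb(universe - set_size, query_size - k)
               for k in range(overlap, upper + 1)) / denom

# 40 of 200 query genes fall in a 500-gene pathway (20,000-gene universe);
# only ~5 would be expected by chance, so the p-value is tiny:
print(enrichment_p(20000, 500, 200, 40) < 1e-10)  # -> True
```

Production tools correct such p-values for the thousands of gene sets tested and weight genes by annotation evidence, which is where proprietary scoring schemes diverge from this baseline.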

  13. Evaluating the Performance of Calculus Classes Using Operational Research Tools.

    ERIC Educational Resources Information Center

    Soares de Mello, Joao Carlos C. B.; Lins, Marcos P. E.; Soares de Mello, Maria Helena C.; Gomes, Eliane G.

    2002-01-01

    Compares the efficiency of calculus classes and evaluates two kinds of classes: traditional and others that use computational methods in teaching. Applies quantitative evaluation methods using two operational research tools: multicriteria decision aid methods (mainly the MACBETH approach) and data envelopment analysis. (Author/YDS)

  14. Using High-Tech Tools for Student Research.

    ERIC Educational Resources Information Center

    Plati, Thomas

    1988-01-01

    Discusses incorporating high technology research tools into the curriculum for grades 5 through 12 in Shrewsbury, Massachusetts, public schools. The use of CD-ROM and online databases is described, teacher training is discussed, and steps to integrate this new technology are listed, including budget proposals and evaluation. (LRW)

  15. Database Advisor: A New Tool for K-12 Research Projects.

    ERIC Educational Resources Information Center

    Berteaux, Susan S.; Strong, Sandra S.

    The Database Advisor (DBA) is a tool designed to guide users to the most appropriate World Wide Web-based databases for their research. Developed in 1997 by the Science Libraries at the University of California, San Diego (UCSD), DBA is a Web-based front-end to bibliographic and full-text databases to which UCSD has remote access. DBA allows the…

  16. Measurement and Research Tools. Symposium 37. [AHRD Conference, 2001].

    ERIC Educational Resources Information Center

    2001

    This symposium on measurement and research tools consists of three presentations. "An Examination of the Multiple Intelligences Developmental Assessment Scales (MIDAS)" (Albert Wiswell et al.) explores MIDAS's psychometric saliency. Findings indicate this instrument represents an incomplete attempt to develop a valid assessment of multiple…

  17. Selecting the Right Tool: Comparison of the Analytical Performance of Infrared Attenuated Total Reflection Accessories.

    PubMed

    Schädle, Thomas; Mizaikoff, Boris

    2016-06-01

    The analytical performance of four commercially available infrared attenuated total reflection (IR-ATR) accessories with various ATR waveguide materials has been analyzed and evaluated using acetate, CO2, and carbonate (CO3^2-) solutions. Calibration functions have been established to determine and compare analytically relevant parameters such as sensitivity, signal-to-noise ratio (SNR), and efficiency. The obtained parameters were further analyzed to support conclusions on the differences in performance of the individual IR-ATR accessories. PMID:27091901
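The calibration parameters compared above are typically obtained from a linear fit of signal against concentration: sensitivity is the least-squares slope, and the blank noise sets the SNR and detection limit. A minimal sketch with illustrative numbers, not values from the paper:

```python
# Linear calibration: sensitivity (slope), SNR, and 3-sigma detection limit.
conc = [0.0, 0.5, 1.0, 2.0, 4.0]                     # analyte conc., mM
signal = [0.001, 0.026, 0.051, 0.099, 0.201]         # ATR absorbance (AU)

n = len(conc)
mx = sum(conc) / n
my = sum(signal) / n
slope = sum((x - mx) * (y - my) for x, y in zip(conc, signal)) \
        / sum((x - mx) ** 2 for x in conc)            # sensitivity, AU/mM
intercept = my - slope * mx

blank_noise = 0.0005                                  # std. dev. of blank, AU
snr_at_1mM = (slope * 1.0 + intercept) / blank_noise  # signal-to-noise ratio
lod = 3 * blank_noise / slope                         # 3-sigma LOD, mM

print(f"sensitivity = {slope:.4f} AU/mM, LOD = {lod:.3f} mM")
```

Comparing accessories then reduces to comparing these figures of merit measured under identical conditions, which is the comparison the abstract reports.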

  18. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    PubMed Central

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  19. Managing complex research datasets using electronic tools: a meta-analysis exemplar.

    PubMed

    Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L

    2013-06-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256
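The data-checking step described above lends itself to scripted sanity checks. As a minimal illustration of the quantitative core such a workflow feeds, here is a fixed-effect inverse-variance pooled estimate from per-study effect sizes; the numbers are illustrative, not the authors' data:

```python
# Fixed-effect meta-analysis: inverse-variance weighted pooled effect.
from math import sqrt

effects = [0.30, 0.45, 0.12, 0.60]   # per-study effect sizes (illustrative)
ses     = [0.10, 0.15, 0.20, 0.25]   # their standard errors

weights = [1 / se ** 2 for se in ses]              # inverse-variance weights
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
pooled_se = sqrt(1 / sum(weights))                 # SE of the pooled effect

print(f"pooled effect = {pooled:.3f} +/- {pooled_se:.3f}")
```

In practice the extracted effect sizes would come straight from the Excel/SPSS extraction sheets the article describes, which is exactly why accuracy checks on those sheets matter.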

  20. DDBJ launches a new archive database with analytical tools for next-generation sequence data.

    PubMed

    Kaminuma, Eli; Mashima, Jun; Kodama, Yuichi; Gojobori, Takashi; Ogasawara, Osamu; Okubo, Kousaku; Takagi, Toshihisa; Nakamura, Yasukazu

    2010-01-01

    The DNA Data Bank of Japan (DDBJ) (http://www.ddbj.nig.ac.jp) has collected and released 1,701,110 entries/1,116,138,614 bases between July 2008 and June 2009. A few highlighted data releases from DDBJ were the complete genome sequence of an endosymbiont within protist cells in the termite gut and Cap Analysis Gene Expression tags for human and mouse deposited from the Functional Annotation of the Mammalian cDNA consortium. In this period, we started a novel user announcement service using Really Simple Syndication (RSS) to deliver a list of data released from DDBJ on a daily basis. Comprehensive visualization of a DDBJ release data was attempted by using a word cloud program. Moreover, a new archive for sequencing data from next-generation sequencers, the 'DDBJ Read Archive' (DRA), was launched. Concurrently, for read data registered in DRA, a semi-automatic annotation tool called the 'DDBJ Read Annotation Pipeline' was released as a preliminary step. The pipeline consists of two parts: basic analysis for reference genome mapping and de novo assembly and high-level analysis of structural and functional annotations. These new services will aid users' research and provide easier access to DDBJ databases. PMID:19850725

  1. SIGAPS: a prototype of bibliographic tool for medical research evaluation.

    PubMed

    Devos, P; Dufresne, E; Renard, J M; Beuscart, R

    2003-01-01

    Evaluation of research activity is extremely important but remains a complex domain. There are no standardized methods, and evaluation is often based on scientific publications. It is easy to identify, for a single researcher, all the publications produced over a given period of time. At the level of a large establishment such as a university hospital, with about 500 researchers, this sort of inventory is very difficult to realize: we have to list the researchers, list their publications, determine the quality of the articles produced, store the retrieved data, and calculate summary statistics. We have developed a full-Web prototype, using free software, which, for a given list of researchers, queries the PubMed server, downloads the references found, and stores them in a local database. The references are then enriched with local data, allowing more or less complex analyses, automatic production of reports, and keyword search. This tool is very easy to use, allowing immediate analysis of the publications of a researcher or a research team. It will allow identification of active teams to be maintained or emergent teams to be supported. It will also allow comparison of candidate profiles for appointments to research posts. PMID:14664073
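The PubMed interrogation step SIGAPS automates can be sketched with NCBI's public E-utilities ESearch endpoint. The author name and date range below are hypothetical, and real use should also send NCBI's recommended `tool`/`email` parameters and respect rate limits:

```python
# Build an NCBI E-utilities ESearch URL listing one researcher's PubMed
# records over a publication-date range (a sketch of SIGAPS's query step).
from urllib.parse import urlencode

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_search_url(author, start_year, end_year, retmax=100):
    # [Author] and [PDAT] are standard PubMed field tags; the colon
    # syntax expresses a publication-date range.
    term = f'{author}[Author] AND ("{start_year}"[PDAT] : "{end_year}"[PDAT])'
    return EUTILS + "?" + urlencode(
        {"db": "pubmed", "term": term, "retmax": retmax, "retmode": "json"})

url = pubmed_search_url("Devos P", 1998, 2002)
print(url)
```

Fetching this URL returns the matching PMIDs, which a tool like SIGAPS would then resolve to full references (via EFetch) and load into its local database for enrichment and reporting.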

  2. Echocardiography as a Research and Clinical Tool in Veterinary Medicine

    PubMed Central

    Allen, D. G.

    1982-01-01

    Echocardiography is the accepted term for the study of cardiac ultrasound. Although a relatively new tool for the study of the heart in man it has already found wide acceptance in the area of cardiac research and in the study of clinical cardiac disease. Animals had often been used in the early experiments with cardiac ultrasound, but only recently has echocardiography been used as a research and clinical tool in veterinary medicine. In this report echocardiography is used in the research of anesthetic effects on ventricular function and clinically in the diagnosis of congestive cardiomyopathy in a cat, ventricular septal defect in a calf, and pericardial effusion in a dog. Echocardiography is now an important adjunct to the field of veterinary cardiology. PMID:17422196

  3. Technical phosphoproteomic and bioinformatic tools useful in cancer research.

    PubMed

    López, Elena; Wesselink, Jan-Jaap; López, Isabel; Mendieta, Jesús; Gómez-Puertas, Paulino; Muñoz, Sarbelio Rodríguez

    2011-01-01

    Reversible protein phosphorylation is one of the most important forms of cellular regulation. Thus, phosphoproteomic analysis of protein phosphorylation in cells is a powerful tool to evaluate cell functional status. The importance of protein kinase-regulated signal transduction pathways in human cancer has led to the development of drugs that inhibit protein kinases at the apex or intermediary levels of these pathways. Phosphoproteomic analysis of these signalling pathways will provide important insights into the operation and connectivity of these pathways to facilitate identification of the best targets for cancer therapies. Enrichment of phosphorylated proteins or peptides from tissue or bodily fluid samples is required. The application of technologies such as phosphopeptide enrichment and mass spectrometry (MS), coupled to bioinformatic tools, is crucial for the identification and quantification of protein phosphorylation sites and for advancing such clinically relevant research. A combination of different phosphopeptide enrichments, quantitative techniques and bioinformatic tools is necessary to achieve good phospho-regulation data and good structural analysis of protein studies. The current and most useful proteomics and bioinformatics techniques will be explained with research examples. Our aim in this article is to be helpful for cancer research via detailing proteomics and bioinformatic tools. PMID:21967744

  4. Technical phosphoproteomic and bioinformatic tools useful in cancer research

    PubMed Central

    2011-01-01

    Reversible protein phosphorylation is one of the most important forms of cellular regulation. Thus, phosphoproteomic analysis of protein phosphorylation in cells is a powerful tool to evaluate cell functional status. The importance of protein kinase-regulated signal transduction pathways in human cancer has led to the development of drugs that inhibit protein kinases at the apex or intermediary levels of these pathways. Phosphoproteomic analysis of these signalling pathways will provide important insights into the operation and connectivity of these pathways to facilitate identification of the best targets for cancer therapies. Enrichment of phosphorylated proteins or peptides from tissue or bodily fluid samples is required. The application of technologies such as phosphopeptide enrichment and mass spectrometry (MS), coupled to bioinformatic tools, is crucial for the identification and quantification of protein phosphorylation sites and for advancing such clinically relevant research. A combination of different phosphopeptide enrichments, quantitative techniques and bioinformatic tools is necessary to achieve good phospho-regulation data and good structural analysis of protein studies. The current and most useful proteomics and bioinformatics techniques will be explained with research examples. Our aim in this article is to be helpful for cancer research via detailing proteomics and bioinformatic tools. PMID:21967744

  5. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    ERIC Educational Resources Information Center

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…
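
    Comparisons like the one described rest on pooling effect sizes across studies. A minimal sketch of inverse-variance fixed-effect pooling, with invented effect sizes rather than the study's data:

```python
import math

# Hypothetical sketch of inverse-variance fixed-effect pooling, the kind of
# computation behind meta-analytic tool comparisons; the effect sizes and
# standard errors below are invented, not the study's data.
def pool_fixed(effects, ses):
    """Pooled effect and its standard error via inverse-variance weights."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    return pooled, math.sqrt(1.0 / sum(weights))

# three illustrative studies of one tool: (Cohen's d, standard error)
d, se = pool_fixed([0.60, 0.45, 0.72], [0.10, 0.15, 0.20])
lo, hi = d - 1.96 * se, d + 1.96 * se   # 95% confidence interval
print(round(d, 3), round(se, 3))        # -> 0.578 0.077
```

Precise studies (small standard errors) dominate the pooled estimate, which is why the original studies' methodological differences matter so much.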

  6. The Blackbird Whistling or Just After? Vygotsky's Tool and Sign as an Analytic for Writing

    ERIC Educational Resources Information Center

    Imbrenda, Jon-Philip

    2016-01-01

    Based on Vygotsky's theory of the interplay of the tool and sign functions of language, this study presents a textual analysis of a corpus of student-authored texts to illuminate aspects of development evidenced through the dialectical tension of tool and sign. Data were drawn from a series of reflective memos I authored during a seminar for new…

  7. Analytical ultracentrifugation: A versatile tool for the characterisation of macromolecular complexes in solution.

    PubMed

    Patel, Trushar R; Winzor, Donald J; Scott, David J

    2016-02-15

    Analytical ultracentrifugation, an early technique developed for characterizing quantitatively the solution properties of macromolecules, remains a powerful aid to structural biologists in their quest to understand the formation of biologically important protein complexes at the molecular level. Treatment of the basic tenets of the sedimentation velocity and sedimentation equilibrium variants of analytical ultracentrifugation is followed by considerations of the roles that it, in conjunction with other physicochemical procedures, has played in resolving problems encountered in the delineation of complex formation for three biological systems - the cytoplasmic dynein complex, mitogen-activated protein kinase (ERK2) self-interaction, and the terminal catalytic complex in selenocysteine synthesis. PMID:26555086

  8. Radcalc: An Analytical Tool for Shippers of Radioactive Material and Waste

    SciTech Connect

    Kapoor, A.K.; Stuhl, L.A.

    2008-07-01

    The U.S. Department of Energy (DOE) ships radioactive materials in support of its research and development, environmental restoration, and national defense activities. The Radcalc software program assists personnel working on behalf of DOE in packaging and transportation determinations (e.g., isotopic decay, decay heat, regulatory classification, and gas generation) for shipment of radioactive materials and waste. Radcalc performs:
    - U.S. Department of Transportation determinations and classifications (i.e., activity concentration for exempt material, Type A or B, effective A1/A2, limited quantity, low specific activity, highway route controlled quantity, fissile quantity, fissile excepted, reportable quantity, list of isotopes required on shipping papers)
    - DOE calculations (i.e., transuranic waste, Pu-239 equivalent curies, fissile-gram equivalents)
    - U.S. Nuclear Regulatory Commission packaging category (i.e., Category I, II, or III)
    - Dose-equivalent curie calculations
    - Radioactive decay calculations using a novel decay methodology and a decay data library of 1,867 isotopes typical of the range of materials encountered in DOE laboratory environments
    - Hydrogen and helium gas calculations
    - Pressure calculations
    Radcalc is a validated and cost-effective tool to provide consistency, accuracy, reproducibility, timeliness, quality, compliance, and appropriate documentation to shippers of radioactive materials and waste at DOE facilities nationwide. Hundreds of shippers and engineers throughout the DOE Complex routinely use this software to automate various determinations and to validate compliance with the regulations. The effective use of software by DOE sites contributes toward minimizing risk involved in radioactive waste shipments and assuring the safety of workers and the public. (authors)
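
    Two of the determinations mentioned can be sketched in a few lines. The half-life and A2 values below are placeholders, not Radcalc's library data; the effective-A2 expression is the standard sum-of-fractions rule for mixtures.

```python
import math

# Illustrative sketch only (placeholder values, not Radcalc's data library):
# exponential radioactive decay and the sum-of-fractions "effective A2"
# rule for a mixture of radionuclides.
def decayed_activity(a0, half_life, t):
    """Activity remaining after time t (same units as half_life)."""
    return a0 * math.exp(-math.log(2.0) * t / half_life)

def effective_a2(activity_fractions, a2_values):
    """Effective A2 of a mixture: 1 / sum(f_i / A2_i)."""
    return 1.0 / sum(f / a2 for f, a2 in zip(activity_fractions, a2_values))

print(round(decayed_activity(100.0, 5.27, 10.54), 3))  # two half-lives -> 25.0
print(round(effective_a2([0.7, 0.3], [0.4, 2.0]), 3))  # hypothetical TBq -> 0.526
```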

  9. An Analytic Tool to Investigate the Effect of Binder on the Sensitivity of HMX-Based Plastic Bonded Explosives in the Skid Test

    SciTech Connect

    D.W. Hayden

    2005-02-01

    This project will develop an analytical tool to calculate the performance of HMX-based PBXs in the skid test. The skid test is used as a means to measure sensitivity for large charges in handling situations. Each series of skid tests requires dozens of drops of large billets. It is proposed that the reaction (or lack of one) of PBXs in the skid test is governed by the mechanical properties of the binder. If true, one might be able to develop an analytical tool to estimate skid-test behavior for new PBX formulations. Others over the past 50 years have tried to develop similar models. This project will research and summarize the works of others and couple the work of three into an analytical tool that can be run on a PC to calculate the drop height of HMX-based PBXs. Detonation due to dropping a billet is argued to be a dynamic thermal event. To avoid detonation, the heat created by friction at impact must be conducted into the charge or the target faster than the chemical kinetics can create additional energy. The methodology will involve numerically solving the Frank-Kamenetskii equation in one dimension. The analytical problem needs to be bounded in terms of how much heat is introduced to the billet and for how long. Assuming an inelastic collision with no rebound, the billet will be in contact with the target for a short duration determined by the equations of motion. For the purposes of the calculations, it will be assumed that if a detonation is to occur, it will transpire within that time. The surface temperature will be raised according to the friction created, using the equations of motion of dropping the billet on a rigid surface. The study will connect the works of Charles Anderson, Alan Randolph, Larry Hatler, Alfonse Popolato, and Charles Mader into a single PC-based analytic tool. Anderson's equations of motion will be used to calculate the temperature rise upon impact; the time this temperature is maintained (contact time) will be obtained from the work of
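
    A rough sketch of the numerical approach described: an explicit finite-difference solution of a 1-D heat equation with an Arrhenius self-heating source, the Frank-Kamenetskii-type problem. All material constants below are placeholders, not HMX data.

```python
import math

# Rough sketch of the proposed methodology: explicit finite differences for
# a 1-D heat equation with an Arrhenius self-heating source (the
# Frank-Kamenetskii-type problem). All constants are placeholders, not HMX data.
def step(T, dx, dt, alpha, q, A, Ea, R=8.314):
    """One explicit Euler step with fixed-temperature boundaries."""
    new = T[:]
    for i in range(1, len(T) - 1):
        diffusion = alpha * (T[i - 1] - 2.0 * T[i] + T[i + 1]) / dx ** 2
        source = q * A * math.exp(-Ea / (R * T[i]))  # Arrhenius kinetics
        new[i] = T[i] + dt * (diffusion + source)
    return new

T = [300.0] * 21          # billet interior temperatures, kelvin
T[10] = 600.0             # frictional hot spot from the impact
for _ in range(200):      # does the hot spot dissipate or run away?
    T = step(T, dx=1e-4, dt=1e-4, alpha=1e-7, q=1.0, A=1e3, Ea=1.2e5)
print(round(max(T), 1))
```

The competition the abstract describes is visible in the two terms: if `diffusion` carries heat away faster than `source` generates it, the hot spot decays; otherwise the temperature runs away.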

  10. Analytic model for academic research productivity having factors, interactions and implications

    PubMed Central

    2011-01-01

    Financial support is dear in academia and will tighten further. How can the research mission be accomplished within new restraints? A model is presented for evaluating source components of academic research productivity. It comprises six factors: funding; investigator quality; efficiency of the research institution; the research mix of novelty, incremental advancement, and confirmatory studies; analytic accuracy; and passion. Their interactions produce output and patterned influences between factors. Strategies for optimizing output are enabled. PMID:22130145
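
    The record does not give the model's functional form, so the following is a purely illustrative guess: a multiplicative combination of the six named factors, showing how factor interactions might be explored.

```python
# Toy illustration only: the record does not give the model's functional
# form, so this multiplicative combination of its six named factors is a
# guess meant to show how factor interactions can be explored.
def productivity(funding, quality, efficiency, mix, accuracy, passion):
    return funding * quality * efficiency * mix * accuracy * passion

scenarios = {
    "balanced":       productivity(1.0, 1.0, 1.0, 1.0, 1.0, 1.0),
    "double funding": productivity(2.0, 1.0, 1.0, 1.0, 1.0, 1.0),
    "half passion":   productivity(1.0, 1.0, 1.0, 1.0, 1.0, 0.5),
}
print(scenarios)  # -> {'balanced': 1.0, 'double funding': 2.0, 'half passion': 0.5}
```

A multiplicative form encodes interaction: any factor near zero collapses output regardless of funding, which is one way the factors could "produce output and patterned influences" rather than simply adding up.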

  11. Reconceptualizing vulnerability: deconstruction and reconstruction as a postmodern feminist analytical research method.

    PubMed

    Glass, Nel; Davis, Kierrynn

    2004-01-01

    Nursing research informed by postmodern feminist perspectives has prompted many debates in recent times. While this is so, nurse researchers who have been tempted to break new ground have had few examples of appropriate analytical methods for a research design informed by the above perspectives. This article presents a deconstructive/reconstructive secondary analysis of a postmodern feminist ethnography in order to provide an analytical exemplar. In doing so, previous notions of vulnerability as a negative state have been challenged and reconstructed. PMID:15206680

  12. CaMKII inhibitors: from research tools to therapeutic agents

    PubMed Central

    Pellicena, Patricia; Schulman, Howard

    2014-01-01

    The cardiac field has benefited from the availability of several CaMKII inhibitors serving as research tools to test putative CaMKII pathways associated with cardiovascular physiology and pathophysiology. Successful demonstrations of its critical pathophysiological roles have elevated CaMKII as a key target in heart failure, arrhythmia, and other forms of heart disease. This has caught the attention of the pharmaceutical industry, which is now racing to develop CaMKII inhibitors as safe and effective therapeutic agents. While the first generation of CaMKII inhibitor development is focused on blocking its activity based on ATP binding to its catalytic site, future inhibitors can also target sites affecting its regulation by Ca2+/CaM or translocation to some of its protein substrates. The recent availability of crystal structures of the kinase in the autoinhibited and activated state, and of the dodecameric holoenzyme, provides insights into the mechanism of action of existing inhibitors. It is also accelerating the design and development of better pharmacological inhibitors. This review examines the structure of the kinase and suggests possible sites for its inhibition. It also analyzes the uses and limitations of current research tools. Development of new inhibitors will enable preclinical proof of concept tests and clinical development of successful lead compounds, as well as improved research tools to more accurately examine and extend knowledge of the role of CaMKII in cardiac health and disease. PMID:24600394

  13. Using the Virtual Solar Observatory as a multifaceted research tool

    NASA Astrophysics Data System (ADS)

    Davey, A. A.

    2008-12-01

    The original premise behind the VxO movement was to provide a common interface to heterogeneous datasets, no matter the physical location of those datasets. This came with the implicit promise that the VxOs would help scientists do science. This promise has been achieved to lesser and greater degrees, and sometimes in ways that we didn't envisage. The current trend in Heliophysics research is one that spans multiple instruments and often multiple physical realms, from the Sun to the Earth. The VxOs are again helping scientists do science by developing tools that allow VxOs to cross their specific physical regime and access and retrieve data from other VxOs serving their own particular regime. VxOs are also moving forward in providing research tools that allow large-scale statistical studies. Using the VSO as an example, I will demonstrate how VxOs continue to fulfil their original premise in new and innovative ways, provide tools for science research, and adapt to the challenges presented to us by a mission such as SDO.

  14. FOSS Tools for Research Infrastructures - A Success Story?

    NASA Astrophysics Data System (ADS)

    Stender, V.; Schroeder, M.; Wächter, J.

    2015-12-01

    Established initiatives and mandated organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. The basic idea behind these infrastructures is the provision of services supporting scientists to search, visualize and access data, to collaborate and exchange information, as well as to publish data and other results. The management of research data in particular is gaining more and more importance. In geosciences, these developments have to be merged with the enhanced data management approaches of Spatial Data Infrastructures (SDI). The Centre for GeoInformationTechnology (CeGIT) at the GFZ German Research Centre for Geosciences has the objective to establish concepts and standards of SDIs as an integral part of research infrastructure architectures. In different projects, solutions to manage research data for land and water management or environmental monitoring have been developed based on a framework consisting of Free and Open Source Software (FOSS) components. The framework provides basic components supporting the import and storage of data, discovery and visualization, as well as data documentation (metadata). In our contribution, we present our data management solutions developed in three projects, Central Asian Water (CAWa), Sustainable Management of River Oases (SuMaRiO) and Terrestrial Environmental Observatories (TERENO), where FOSS components build the backbone of the data management platform. The multiple use and validation of tools helped to establish a standardized architectural blueprint serving as a contribution to Research Infrastructures. We examine the question of whether FOSS tools are really a sustainable choice and whether the increased maintenance efforts are justified. Finally, it should help answer the question of whether the use of FOSS for Research Infrastructures is a

  15. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based Graduate-Level Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Toh, Chee-Seng

    2007-01-01

    A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.

  16. The use of analytical surface tools in the fundamental study of wear. [atomic nature of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    Various techniques and surface tools available for the study of the atomic nature of the wear of materials are reviewed. These include chemical etching, X-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  17. The use of analytical surface tools in the fundamental study of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    This paper reviews the various techniques and surface tools available for the study of the atomic nature of the wear of materials. These include chemical etching, X-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  18. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    NASA Astrophysics Data System (ADS)

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-09-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  19. Simulated Interactive Research Experiments as Educational Tools for Advanced Science.

    PubMed

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  20. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    PubMed Central

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  1. Infrared Spectroscopy as a Versatile Analytical Tool for the Quantitative Determination of Antioxidants in Agricultural Products, Foods and Plants

    PubMed Central

    Cozzolino, Daniel

    2015-01-01

    Spectroscopic methods provide very useful qualitative and quantitative information about the biochemistry and chemistry of antioxidants. Near infrared (NIR) and mid infrared (MIR) spectroscopy are considered powerful, fast, accurate and non-destructive analytical tools that can replace traditional chemical analysis. In recent years, several reports can be found in the literature demonstrating the usefulness of these methods in the analysis of antioxidants in different organic matrices. This article reviews recent applications of infrared (NIR and MIR) spectroscopy in the analysis of antioxidant compounds in a wide range of samples such as agricultural products, foods and plants. PMID:26783838
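
    A hedged sketch of the kind of quantitative calibration such spectroscopic methods depend on: least squares linking absorbance at one wavelength to reference concentrations (Beer-Lambert). All data points are invented.

```python
# Hedged sketch of the kind of calibration behind quantitative NIR/MIR
# assays: least squares linking absorbance at one wavelength to reference
# concentrations (Beer-Lambert). All data points are invented.
def fit_line(x, y):
    """Slope and intercept of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

absorbance = [0.10, 0.20, 0.30, 0.40]      # at one NIR wavelength
conc = [5.0, 10.0, 15.0, 20.0]             # mg/L from a reference assay
a, b = fit_line(absorbance, conc)
print(round(a * 0.25 + b, 2))              # predict an unknown -> 12.5
```

Real chemometric calibrations use many wavelengths (e.g. partial least squares) rather than one, but the principle of regressing reference assays onto spectra is the same.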

  2. INTRODUCTION TO ATTILA (ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS): AN ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, wat...

  3. Analytical Tools To Distinguish the Effects of Localization Error, Confinement, and Medium Elasticity on the Velocity Autocorrelation Function

    PubMed Central

    Weber, Stephanie C.; Thompson, Michael A.; Moerner, W.E.; Spakowitz, Andrew J.; Theriot, Julie A.

    2012-01-01

    Single particle tracking is a powerful technique for investigating the dynamic behavior of biological molecules. However, many of the analytical tools are prone to generate results that can lead to mistaken interpretations of the underlying transport process. Here, we explore the effects of localization error and confinement on the velocity autocorrelation function, Cυ. We show that calculation of Cυ across a range of discretizations can distinguish the effects of localization error, confinement, and medium elasticity. Thus, under certain regimes, Cυ can be used as a diagnostic tool to identify the underlying mechanism of anomalous diffusion. Finally, we apply our analysis to experimental data sets of chromosomal loci and RNA-protein particles in Escherichia coli. PMID:22713559
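
    The diagnostic described can be sketched directly: velocities defined over a discretization delta, then correlated at lag tau. The trajectory below is an invented confinement toy, not the paper's data.

```python
# Minimal sketch (invented trajectory, not the paper's data) of the
# diagnostic described: velocities defined over a discretization delta,
# then correlated at lag tau.
def velocity_autocorr(x, delta, tau):
    """C_v(tau) for velocities v_i = (x[i+delta] - x[i]) / delta."""
    v = [(x[i + delta] - x[i]) / delta for i in range(len(x) - delta)]
    pairs = [(v[i], v[i + tau]) for i in range(len(v) - tau)]
    return sum(a * b for a, b in pairs) / len(pairs)

# a particle bouncing between two walls: pure confinement produces a
# strong negative C_v at short lags, one signature the paper separates
# from localization error and medium elasticity
x = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0]
print(velocity_autocorr(x, delta=1, tau=1))   # -> -1.0
```

Repeating the calculation across several values of delta is what lets the three candidate mechanisms be distinguished.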

  4. The Research Tools of the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Hanisch, Robert J.; Berriman, G. B.; Lazio, T. J.; Project, VAO

    2013-01-01

    Astronomy is being transformed by the vast quantities of data, models, and simulations that are becoming available to astronomers at an ever-accelerating rate. The U.S. Virtual Astronomical Observatory (VAO) has been funded to provide an operational facility that is intended to be a resource for discovery and access of data, and to provide science services that use these data. Over the course of the past year, the VAO has been developing and releasing for community use five science tools: 1) "Iris", for dynamically building and analyzing spectral energy distributions, 2) a web-based data discovery tool that allows astronomers to identify and retrieve catalog, image, and spectral data on sources of interest, 3) a scalable cross-comparison service that allows astronomers to conduct pair-wise positional matches between very large catalogs stored remotely as well as between remote and local catalogs, 4) time series tools that allow astronomers to compute periodograms of the public data held at the NASA Star and Exoplanet Database (NStED) and the Harvard Time Series Center, and 5) a VO-aware release of the Image Reduction and Analysis Facility (IRAF) that provides transparent access to VO-available data collections and is SAMP-enabled, so that IRAF users can easily use tools such as Aladin and Topcat in conjunction with IRAF tasks. Additional VAO services will be built to make it easy for researchers to provide access to their data in VO-compliant ways, to build VO-enabled custom applications in Python, and to respond generally to the growing size and complexity of astronomy data. Acknowledgements: The Virtual Astronomical Observatory (VAO) is managed by the VAO, LLC, a non-profit company established as a partnership of the Associated Universities, Inc. and the Association of Universities for Research in Astronomy, Inc. The VAO is sponsored by the National Science Foundation and the National Aeronautics and Space Administration.
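
    The periodogram computation behind time series tools of this kind can be sketched with the classic Lomb-Scargle estimator for unevenly sampled data; the observations below are synthetic, not archive holdings.

```python
import math

# Hedged sketch of the periodogram computation behind time series tools
# like those described: the classic Lomb-Scargle estimator for unevenly
# sampled data. The observations are synthetic, not archive holdings.
def lomb_scargle(t, y, omega):
    """Normalized Lomb-Scargle power at angular frequency omega."""
    n = len(y)
    ybar = sum(y) / n
    var = sum((yi - ybar) ** 2 for yi in y) / (n - 1)
    tau = math.atan2(sum(math.sin(2 * omega * ti) for ti in t),
                     sum(math.cos(2 * omega * ti) for ti in t)) / (2 * omega)
    c = sum((yi - ybar) * math.cos(omega * (ti - tau)) for ti, yi in zip(t, y))
    s = sum((yi - ybar) * math.sin(omega * (ti - tau)) for ti, yi in zip(t, y))
    cc = sum(math.cos(omega * (ti - tau)) ** 2 for ti in t)
    ss = sum(math.sin(omega * (ti - tau)) ** 2 for ti in t)
    return (c * c / cc + s * s / ss) / (2 * var)

t = [0.0, 0.7, 1.1, 2.3, 3.1, 4.0, 4.8, 5.9, 6.5, 7.7]  # uneven sampling
y = [math.sin(2 * math.pi * ti / 2.5) for ti in t]       # true period 2.5
peak = lomb_scargle(t, y, 2 * math.pi / 2.5)
off = lomb_scargle(t, y, 2 * math.pi / 7.0)
print(peak > off)   # -> True
```

Unlike an FFT, this estimator needs no interpolation onto a regular grid, which is why it suits irregularly sampled archival light curves.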

  5. [EpiInfo as a research and teaching tool in epidemiology and statistics: strengths and weaknesses].

    PubMed

    Mannocci, Alice; Bontempi, Claudio; Giraldi, Guglielmo; Chiaradia, Giacomina; de Waure, Chiara; Sferrazza, Antonella; Ricciardi, Walter; Boccia, Antonio; La Torre, Giuseppe

    2012-01-01

    EpiInfo is free software developed in 1988 by the Centers for Disease Control and Prevention (CDC) in Atlanta to facilitate field epidemiological investigations and statistical analysis. The aim of this study was to assess whether the software represents, in the Italian biomedical field, an effective analytical research tool and a practical and simple epidemiology and biostatistics teaching tool. A questionnaire consisting of 20 multiple-choice and open questions was administered to 300 healthcare workers, including doctors, biologists, nurses, medical students and interns, at the end of a CME course in epidemiology and biostatistics. Sixty-four percent of participants were aged between 26 and 45 years, 52% were women and 73% were unmarried. Results show that women are more likely than men to utilize EpiInfo in their research activities (p = 0.023), as are individuals aged 26-45 years compared with the older and younger age groups (p = 0.023), and unmarried participants compared with married ones (p = 0.010). Thirty-one percent of respondents consider EpiInfo to be more than adequate for analysis of their research data and 52% consider it sufficiently so. The inclusion of an EpiInfo course in statistics and epidemiology modules facilitates the understanding of theoretical concepts and allows researchers to more easily perform some of their clinical/epidemiological research activities. PMID:22507994
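
    P-values like those reported typically come from simple tests of association. A sketch of a Pearson chi-square test on a 2x2 table, with invented counts rather than the survey's data:

```python
import math

# Sketch of the kind of association test behind the reported p-values;
# the 2x2 counts (EpiInfo use by sex) are invented, not the survey's data.
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic and p-value (1 df) for a 2x2 table."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2.0))  # exact for 1 degree of freedom
    return chi2, p

#            use   no use
# women       60       40
# men         45       55
chi2, p = chi2_2x2(60, 40, 45, 55)
print(round(chi2, 3), round(p, 3))   # -> 4.511 0.034
```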

  6. Applying stable isotopes to examine food-web structure: an overview of analytical tools.

    PubMed

    Layman, Craig A; Araujo, Marcio S; Boucek, Ross; Hammerschlag-Peyer, Caroline M; Harrison, Elizabeth; Jud, Zachary R; Matich, Philip; Rosenblatt, Adam E; Vaudo, Jeremy J; Yeager, Lauren A; Post, David M; Bearhop, Stuart

    2012-08-01

    Stable isotope analysis has emerged as one of the primary means for examining the structure and dynamics of food webs, and numerous analytical approaches are now commonly used in the field. Techniques range from simple, qualitative inferences based on the isotopic niche, to Bayesian mixing models that can be used to characterize food-web structure at multiple hierarchical levels. We provide a comprehensive review of these techniques, and thus a single reference source to help identify the most useful approaches to apply to a given data set. We structure the review around four general questions: (1) what is the trophic position of an organism in a food web?; (2) which resource pools support consumers?; (3) what additional information does relative position of consumers in isotopic space reveal about food-web structure?; and (4) what is the degree of trophic variability at the intrapopulation level? For each general question, we detail different approaches that have been applied, discussing the strengths and weaknesses of each. We conclude with a set of suggestions that transcend individual analytical approaches, and provide guidance for future applications in the field. PMID:22051097
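
    Question (2), which resource pools support consumers, reduces in its simplest form to a two-source, one-isotope mixing model. The delta values below are invented for illustration.

```python
# Toy two-source, one-isotope mixing model, the simplest of the approaches
# reviewed: the fraction of a consumer's diet drawn from source 1 after a
# trophic discrimination correction. All delta values are invented.
def source1_fraction(d_consumer, d_source1, d_source2, trophic_shift=0.0):
    corrected = d_consumer - trophic_shift
    return (corrected - d_source2) / (d_source1 - d_source2)

# delta13C (per mil): e.g. seagrass-derived vs phytoplankton-derived carbon
f = source1_fraction(d_consumer=-16.0, d_source1=-12.0, d_source2=-22.0,
                     trophic_shift=0.5)
print(round(f, 2))   # -> 0.55
```

With more sources than isotopes the system is underdetermined, which is what motivates the Bayesian mixing models the review goes on to discuss.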

  7. Designing and implementing full immersion simulation as a research tool.

    PubMed

    Munroe, Belinda; Buckley, Thomas; Curtis, Kate; Morris, Richard

    2016-05-01

    Simulation is a valuable research tool used to evaluate the clinical performance of devices, people and systems. The simulated setting may address concerns unique to complex clinical environments such as the Emergency Department, which make the conduct of research challenging. There is limited evidence available to inform the development of simulated clinical scenarios for the purpose of evaluating practice in research studies, with the majority of literature focused on designing simulated clinical scenarios for education and training. Distinct differences exist in scenario design when implemented in education compared with use in clinical research studies. Simulated scenarios used to assess practice in clinical research must not comprise any purposeful or planned teaching and must be developed with a high degree of validity and reliability. A new scenario design template was devised to develop two standardised simulated clinical scenarios for the evaluation of a new assessment framework for emergency nurses. The scenario development and validation processes undertaken are described and provide an evidence-informed guide to scenario development for future clinical research studies. PMID:26917415

  8. Natural Language Thesaurus: A Survey of Student Research Skills and Research Tool Preferences

    ERIC Educational Resources Information Center

    Redfern, Victoria

    2004-01-01

    This paper reports the results of a University of Canberra Library survey of student research knowledge, skills, tools and resources. Students are experiencing difficulties interrogating databases, the internet and library catalogues because of the lack of consistency in terminology and various methods of interrogation. This research was an…

  9. New Tools for New Research in Psychiatry: A Scalable and Customizable Platform to Empower Data Driven Smartphone Research

    PubMed Central

    Torous, John; Kiang, Mathew V; Lorme, Jeanette

    2016-01-01

    Background: A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-quality data.
    Objective: Our aim is not the creation of yet another app per se but rather the establishment of a platform to collect research-quality smartphone raw sensor and usage pattern data. Our ultimate goal is to develop statistical, mathematical, and computational methodology to enable us and others to extract biomedical and clinical insights from smartphone data.
    Methods: We report on the development and early testing of Beiwe, a research platform featuring a study portal, smartphone app, database, and data modeling and analysis tools designed and developed specifically for transparent, customizable, and reproducible biomedical research use, in particular for the study of psychiatric and neurological disorders. We also outline a proposed study using the platform for patients with schizophrenia.
    Results: We demonstrate the passive data capabilities of the Beiwe platform and early results of its analytical capabilities.
    Conclusions: Smartphone sensors and phone usage patterns, when coupled with appropriate statistical learning tools, are able to capture various social and behavioral manifestations of illnesses, in naturalistic settings, as lived and experienced by patients. The ubiquity of smartphones makes this type of moment-by-moment quantification of disease phenotypes highly scalable and, when integrated within a transparent research platform, presents tremendous opportunities for research, discovery, and patient health. PMID:27150677

  10. Tissue fluid pressures - From basic research tools to clinical applications

    NASA Technical Reports Server (NTRS)

    Hargens, Alan R.; Akeson, Wayne H.; Mubarak, Scott J.; Owen, Charles A.; Gershuni, David H.

    1989-01-01

    This paper describes clinical applications of two basic research tools developed and refined in the past 20 years: the wick catheter (for measuring tissue fluid pressure) and the colloid osmometer (for measuring osmotic pressure). Applications of the osmometer include estimations of the reduced osmotic pressure of sickle-cell hemoglobin with deoxygenation, and of reduced swelling pressure of human nucleus pulposus with hydration or upon action of certain enzymes. Clinical uses of the wick-catheter technique include an improvement of diagnosis and treatment of acute and chronic compartment syndromes, the elucidation of the tissue pressure thresholds for neuromuscular dysfunction, and the development of a better tourniquet for orthopedics.

  11. Vaccinia Virus: A Tool for Research and Vaccine Development

    NASA Astrophysics Data System (ADS)

    Moss, Bernard

    1991-06-01

    Vaccinia virus is no longer needed for smallpox immunization, but now serves as a useful vector for expressing genes within the cytoplasm of eukaryotic cells. As a research tool, recombinant vaccinia viruses are used to synthesize biologically active proteins and analyze structure-function relations, determine the targets of humoral- and cell-mediated immunity, and investigate the immune responses needed for protection against specific infectious diseases. When more data on safety and efficacy are available, recombinant vaccinia and related poxviruses may be candidates for live vaccines and for cancer immunotherapy.

  12. Electrochemical treatment of olive mill wastewater: treatment extent and effluent phenolic compounds monitoring using some uncommon analytical tools.

    PubMed

    Belaid, Chokri; Khadraoui, Moncef; Mseddii, Salma; Kallel, Monem; Elleuch, Boubaker; Fauvarque, Jean François

    2013-01-01

    Problems related to industrial effluents can be divided into two parts: (1) their toxicity, associated with their chemical content, which should be removed before the wastewater is discharged into the receiving medium; and (2) the difficulty of characterising and monitoring the pollution, caused by the complexity of these matrices. This investigation deals with both aspects: an electrochemical treatment of olive mill wastewater (OMW) on platinized expanded titanium electrodes in a modified Grignard reactor for toxicity removal, and the exploration of some specific analytical tools to monitor the elimination of effluent phenolic compounds. The results showed that electrochemical oxidation is able to remove or mitigate the OMW pollution. Indeed, 87% of the OMW colour was removed and all aromatic compounds disappeared from the solution by anodic oxidation. Moreover, the chemical oxygen demand (COD) and the total organic carbon (TOC) were reduced by 55%. On the other hand, UV-visible spectrophotometry, gas chromatography/mass spectrometry, cyclic voltammetry and 13C nuclear magnetic resonance (NMR) showed that the treatment efficiently eliminates phenolic compounds from OMW. It was concluded that electrochemical oxidation in a modified Grignard reactor is a promising process for the destruction of all phenolic compounds present in OMW. Among the monitoring tools applied, cyclic voltammetry and 13C NMR are introduced here for the first time to follow the progress of OMW treatment, and they gave close insight into the disappearance of polyphenols. PMID:23586318

  13. ELISA and GC-MS as Teaching Tools in the Undergraduate Environmental Analytical Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Wilson, Ruth I.; Mathers, Dan T.; Mabury, Scott A.; Jorgensen, Greg M.

    2000-12-01

    An undergraduate experiment for the analysis of potential water pollutants is described. Students are exposed to two complementary techniques, ELISA and GC-MS, for the analysis of a water sample containing atrazine, desethylatrazine, and simazine. Atrazine was chosen as the target analyte because of its wide usage in North America and its utility for students to predict environmental degradation products. The water sample is concentrated using solid-phase extraction for GC-MS, or diluted and analyzed using a competitive ELISA test kit for atrazine. The nature of the water sample is such that students generally find that ELISA gives an artificially high value for the concentration of atrazine. Students gain an appreciation for problems associated with measuring pollutants in the aqueous environment: sensitivity, accuracy, precision, and ease of analysis. This undergraduate laboratory provides an opportunity for students to learn several new analysis and sample preparation techniques and to critically evaluate these methods in terms of when they are most useful.
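The artificially high ELISA reading the students observe follows from cross-reactivity: a competitive atrazine ELISA also responds, partially, to structurally similar triazines such as desethylatrazine and simazine. A sketch with hypothetical cross-reactivity factors (illustrative values, not the certified figures of any particular kit) shows the effect:

```python
def apparent_elisa_conc(concs, cross_reactivity):
    """Apparent atrazine concentration (ug/L) reported by a competitive
    ELISA when cross-reacting triazines are present.

    cross_reactivity maps each analyte to the assay response relative
    to atrazine (1.0 = responds equally, 0.0 = no response).
    """
    return sum(concs[a] * cross_reactivity.get(a, 0.0) for a in concs)

true_conc = {"atrazine": 1.0, "desethylatrazine": 0.5, "simazine": 0.5}
# hypothetical cross-reactivities for a generic triazine kit
xr = {"atrazine": 1.00, "desethylatrazine": 0.25, "simazine": 0.60}
print(apparent_elisa_conc(true_conc, xr))  # ≈ 1.425 ug/L vs 1.0 true atrazine
```

GC-MS, by contrast, separates and quantifies each compound individually, which is why the two techniques disagree on this sample.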

  14. Towards a minimally invasive sampling tool for high resolution tissue analytical mapping

    NASA Astrophysics Data System (ADS)

    Gottardi, R.

    2015-09-01

    Multiple spatial mapping techniques of biological tissues have been proposed over the years, but all present limitations either in terms of resolution, analytical capacity or invasiveness. Ren et al (2015 Nanotechnology 26 284001) propose in their most recent work the use of a picosecond infrared laser (PIRL) under conditions of ultrafast desorption by impulsive vibrational excitation (DIVE) to extract small amounts of cellular and molecular components, conserving their viability, structure and activity. The PIRL DIVE technique would then work as a nanobiopsy with minimal damage to the surrounding tissues, which could potentially be applied for high resolution local structural characterization of tissues in health and disease with the spatial limit determined by the laser focus.

  15. Analytical aerodynamic model of a high alpha research vehicle wind-tunnel model

    NASA Technical Reports Server (NTRS)

    Cao, Jichang; Garrett, Frederick, Jr.; Hoffman, Eric; Stalford, Harold

    1990-01-01

    A 6-DOF analytical aerodynamic model of a high alpha research vehicle is derived. The derivation is based on wind-tunnel model data valid in the altitude-Mach flight envelope centered at 15,000 ft altitude and Mach 0.6, with Mach numbers ranging from 0.3 to 0.9. The analytical models of the aerodynamic coefficients are nonlinear functions of alpha with all control variables and other states fixed. Interpolation is required between the parameterized nonlinear functions. The lift and pitching-moment coefficients have unsteady-flow parts due to the time rate of change of angle of attack (alpha dot). The analytical models are plotted and compared with their corresponding wind-tunnel data. Piloted simulated maneuvers of the wind-tunnel model are used to evaluate the analytical model. The maneuvers considered are pitch-ups, 360-degree loaded and unloaded rolls, turn reversals, split S's, and level turns. The evaluation finds that (1) the analytical model is a good representation at Mach 0.6, (2) the longitudinal part is good for the Mach range 0.3 to 0.9, and (3) the lateral part is good for Mach numbers between 0.6 and 0.9. The computer simulations show that the storage requirement of the analytical model is about one-tenth that of the wind-tunnel model, and it runs twice as fast.
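The interpolation between alpha-parameterized coefficient models stored at discrete Mach breakpoints can be sketched as follows. The coefficient curves here are made-up stand-ins for illustration, not the actual wind-tunnel fits of the abstract:

```python
def interp_coefficient(alpha, mach, mach_grid, coeff_funcs):
    """Linearly interpolate an aerodynamic coefficient between
    nonlinear-in-alpha models tabulated at discrete Mach numbers.

    mach_grid: sorted Mach breakpoints; coeff_funcs: one callable
    (coefficient as a function of alpha, in degrees) per breakpoint.
    """
    for i in range(len(mach_grid) - 1):
        m0, m1 = mach_grid[i], mach_grid[i + 1]
        if m0 <= mach <= m1:
            c0, c1 = coeff_funcs[i](alpha), coeff_funcs[i + 1](alpha)
            w = (mach - m0) / (m1 - m0)  # linear blend weight
            return (1 - w) * c0 + w * c1
    raise ValueError("Mach outside tabulated range")

# hypothetical lift-coefficient curves at Mach 0.3, 0.6, 0.9
cl_funcs = [lambda a: 0.08 * a, lambda a: 0.09 * a, lambda a: 0.07 * a]
print(interp_coefficient(10.0, 0.45, [0.3, 0.6, 0.9], cl_funcs))
```

Storing a small set of fitted functions and interpolating between them is what gives the analytical model its roughly tenfold storage advantage over the raw wind-tunnel tables.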

  16. Tools for Large-Scale Data Analytic Examination of Relational and Epistemic Networks in Engineering Education

    ERIC Educational Resources Information Center

    Madhavan, Krishna; Johri, Aditya; Xian, Hanjun; Wang, G. Alan; Liu, Xiaomo

    2014-01-01

    The proliferation of digital information technologies and related infrastructure has given rise to novel ways of capturing, storing and analyzing data. In this paper, we describe the research and development of an information system called Interactive Knowledge Networks for Engineering Education Research (iKNEER). This system utilizes a framework…

  17. Integrative Mixed Methods Data Analytic Strategies in Research on School Success in Challenging Circumstances

    ERIC Educational Resources Information Center

    Jang, Eunice E.; McDougall, Douglas E.; Pollon, Dawn; Herbert, Monique; Russell, Pia

    2008-01-01

    There are both conceptual and practical challenges in dealing with data from mixed methods research studies. There is a need for discussion about various integrative strategies for mixed methods data analyses. This article illustrates integrative analytic strategies for a mixed methods study focusing on improving urban schools facing challenging…

  18. The Effects of Incentives on Workplace Performance: A Meta-Analytic Review of Research Studies

    ERIC Educational Resources Information Center

    Condly, Steven J.; Clark, Richard E.; Stolovitch, Harold D.

    2003-01-01

    A meta-analytic review of all adequately designed field and laboratory research on the use of incentives to motivate performance is reported. Of approximately 600 studies, 45 qualified. The overall average effect of all incentive programs in all work settings and on all work tasks was a 22% gain in performance. Team-directed incentives had a…

  19. Island Explorations: Discovering Effects of Environmental Research-Based Lab Activities on Analytical Chemistry Students

    ERIC Educational Resources Information Center

    Tomasik, Janice Hall; LeCaptain, Dale; Murphy, Sarah; Martin, Mary; Knight, Rachel M.; Harke, Maureen A.; Burke, Ryan; Beck, Kara; Acevedo-Polakovich, I. David

    2014-01-01

    Motivating students in analytical chemistry can be challenging, in part because of the complexity and breadth of topics involved. Some methods that help encourage students and convey real-world relevancy of the material include incorporating environmental issues, research-based lab experiments, and service learning projects. In this paper, we…

  20. Instruments Used in Doctoral Dissertations in Educational Sciences in Turkey: Quality of Research and Analytical Errors

    ERIC Educational Resources Information Center

    Karadag, Engin

    2011-01-01

    The aim of this study was to define the level of quality and types of analytical errors for measurement instruments used [i.e., interview forms, achievement tests and scales] in doctoral dissertations produced in educational sciences in Turkey. The study was designed to determine the levels of factors concerning quality in research methods and the…

  1. Evaluating the Effectiveness of Premarital Prevention Programs: A Meta-Analytic Review of Outcome Research.

    ERIC Educational Resources Information Center

    Carroll, Jason S.; Doherty, William J.

    2003-01-01

    Presents a comprehensive, meta-analytic review and critical evaluation of outcome research pertaining to the effectiveness of premarital prevention programs. Findings suggest that premarital prevention programs are generally effective in producing immediate and short-term gains in interpersonal skills and overall relationship quality. (Contains 67…

  2. Mineotaur: a tool for high-content microscopy screen sharing and visual analytics.

    PubMed

    Antal, Bálint; Chessel, Anatole; Carazo Salas, Rafael E

    2015-01-01

    High-throughput/high-content microscopy-based screens are powerful tools for functional genomics, yielding intracellular information down to the level of single cells for thousands of genotypic conditions. However, accessing their data requires specialized knowledge, and most often the data are no longer analyzed after initial publication. We describe Mineotaur ( http://www.mineotaur.org ), an open-source, downloadable web application that allows easy online sharing and interactive visualisation of large screen datasets, facilitating their dissemination and further analysis, and enhancing their impact. PMID:26679168

  3. Web-based analytical tools for the exploration of spatial data

    NASA Astrophysics Data System (ADS)

    Anselin, Luc; Kim, Yong Wook; Syabri, Ibnu

    This paper deals with the extension of internet-based geographic information systems with functionality for exploratory spatial data analysis (ESDA). The specific focus is on methods to identify and visualize outliers in maps of rates or proportions. Three sets of methods are included: extreme value maps, smoothed rate maps and the Moran scatterplot. The implementation is carried out by means of a collection of Java classes that extend the Geotools open source mapping software toolkit. The web-based spatial analysis tools are illustrated with applications to the study of homicide rates and cancer rates in U.S. counties.

  4. ARM Climate Research Facility: Outreach Tools and Strategies

    NASA Astrophysics Data System (ADS)

    Roeder, L.; Jundt, R.

    2009-12-01

    Sponsored by the Department of Energy, the ARM Climate Research Facility is a global scientific user facility for the study of climate change. To publicize progress and achievements and to reach new users, the ACRF uses a variety of Web 2.0 tools and strategies that build off of the program’s comprehensive and well established News Center (www.arm.gov/news). These strategies include: an RSS subscription service for specific news categories; an email “newsletter” distribution to the user community that compiles the latest News Center updates into a short summary with links; and a Facebook page that pulls information from the News Center and links to relevant information in other online venues, including those of our collaborators. The ACRF also interacts with users through field campaign blogs, like Discovery Channel’s EarthLive, to share research experiences from the field. Increasingly, field campaign Wikis are established to help ACRF researchers collaborate during the planning and implementation phases of their field studies and include easy to use logs and image libraries to help record the campaigns. This vital reference information is used in developing outreach material that is shared in highlights, news, and Facebook. Other Web 2.0 tools that ACRF uses include Google Maps to help users visualize facility locations and aircraft flight patterns. Easy-to-use comment boxes are also available on many of the data-related web pages on www.arm.gov to encourage feedback. To provide additional opportunities for increased interaction with the public and user community, future Web 2.0 plans under consideration for ACRF include: evaluating field campaigns for Twitter and microblogging opportunities, adding public discussion forums to research highlight web pages, moving existing photos into albums on FlickR or Facebook, and building online video archives through YouTube.

  5. Visualising the past: potential applications of Geospatial tools to paleoclimate research

    NASA Astrophysics Data System (ADS)

    Cook, A.; Turney, C. S.

    2012-12-01

    Recent advances in geospatial data acquisition, analysis and web-based data sharing offer new possibilities for understanding and visualising past modes of change. The availability, accessibility and cost-effectiveness of data is better than ever. Researchers can access remotely sensed data including terrain models; use secondary data from large consolidated repositories; make more accurate field measurements and combine data from disparate sources to form a single asset. An increase in the quantity and consistency of data is coupled with subtle yet significant improvements to the way in which geospatial systems manage data interoperability, topological and textual integrity, resulting in more stable analytical and modelling environments. Essentially, researchers now have greater control and more confidence in analytical tools and outputs. Web-based data sharing is growing rapidly, enabling researchers to publish and consume data directly into their spatial systems through OGC-compliant Web Map Services (WMS), Web Feature Services (WFS) and Web Coverage Services (WCS). This has been implemented at institutional, organisational and project scale around the globe. Some institutions have gone one step further and established Spatial Data Infrastructures (SDI) based on Federated Data Structures where the participating data owners retain control over who has access to what. It is important that advances in knowledge are transferred to audiences outside the scientific community in a way that is interesting and meaningful. The visualisation of paleodata through multi-media offers significant opportunities to highlight the parallels and distinctions between past climate dynamics and the challenges of today and tomorrow. Here we present an assessment of key innovations that demonstrate how Geospatial tools can be applied to palaeo-research and used to communicate the results to a diverse array of audiences in the digital age.

  6. An analytical method on the surface residual stress for the cutting tool orientation

    NASA Astrophysics Data System (ADS)

    Li, Yueen; Zhao, Jun; Wang, Wei

    2009-12-01

    In this experiment, residual stress was measured for eight cutting-tool orientations when machining H13 die steel on a high-speed machining (HSM) center. The measured data show that the residual stress varies periodically with the rake angle (β) and side rake angle (θ), and further study finds that the cutting-tool orientation is closely related to the residual stress. Since the machined surface residual stress originates from the cutting force and the axial force, a simple tool-workpiece force model can be derived, and from it a residual stress model with which the magnitude of the residual stress can be calculated. As almost all of the measured residual stresses are compressive, their magnitude and direction can be determined from the input data for H13 on the HSM center. The residual stress model is therefore key to the theoretical optimization of the rake angle (β) and side rake angle (θ), and with it more of the cutting mechanism can be explained.

  7. An analytical method on the surface residual stress for the cutting tool orientation

    NASA Astrophysics Data System (ADS)

    Li, Yueen; Zhao, Jun; Wang, Wei

    2010-03-01

    In this experiment, residual stress was measured for eight cutting-tool orientations when machining H13 die steel on a high-speed machining (HSM) center. The measured data show that the residual stress varies periodically with the rake angle (β) and side rake angle (θ), and further study finds that the cutting-tool orientation is closely related to the residual stress. Since the machined surface residual stress originates from the cutting force and the axial force, a simple tool-workpiece force model can be derived, and from it a residual stress model with which the magnitude of the residual stress can be calculated. As almost all of the measured residual stresses are compressive, their magnitude and direction can be determined from the input data for H13 on the HSM center. The residual stress model is therefore key to the theoretical optimization of the rake angle (β) and side rake angle (θ), and with it more of the cutting mechanism can be explained.

  8. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for the homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry for monitoring the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess the homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for the development and production of pharmaceutical suspensions in the future. PMID:21412782
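Extracting a characteristic homogenization time from ultrasound velocity readings can be sketched as follows, assuming a first-order (exponential) approach of the velocity to its steady value. The exponential form is an assumption made here for illustration; the paper fits its own heuristic model:

```python
import math

def characteristic_time(times, velocities, v_inf):
    """Estimate a characteristic time tau from ultrasound velocity
    readings, assuming v(t) = v_inf + (v0 - v_inf) * exp(-t / tau),
    where v_inf is the steady (fully homogenized) velocity.

    Uses a log-linear least-squares fit:
    ln|v - v_inf| = ln|v0 - v_inf| - t / tau.
    """
    xs = times
    ys = [math.log(abs(v - v_inf)) for v in velocities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -1.0 / slope

# synthetic readings (m/s) generated with tau = 120 s
tau = 120.0
ts = [0, 60, 120, 240, 480]
vs = [1500.0 + 20.0 * math.exp(-t / tau) for t in ts]
print(round(characteristic_time(ts, vs, 1500.0), 1))  # → 120.0
```

With real, noisy data one would fit all three parameters (v0, v_inf, tau) by nonlinear least squares rather than fixing v_inf in advance.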

  9. Tools for the Quantitative Analysis of Sedimentation Boundaries Detected by Fluorescence Optical Analytical Ultracentrifugation

    PubMed Central

    Zhao, Huaying; Casillas, Ernesto; Shroff, Hari; Patterson, George H.; Schuck, Peter

    2013-01-01

    Fluorescence optical detection in sedimentation velocity analytical ultracentrifugation allows the study of macromolecules at nanomolar concentrations and below. This has significant promise, for example, for the study of systems of high-affinity protein interactions. Here we describe adaptations of the direct boundary modeling analysis approach implemented in the software SEDFIT that were developed to accommodate unique characteristics of the confocal fluorescence detection system. These include spatial gradients of signal intensity due to scanner movements out of the plane of rotation, temporal intensity drifts due to instability of the laser and fluorophores, and masking of the finite excitation and detection cone by the sample holder. In an extensive series of experiments with enhanced green fluorescent protein ranging from low nanomolar to low micromolar concentrations, we show that the experimental data provide sufficient information to determine the parameters required for first-order approximation of the impact of these effects on the recorded data. Systematic deviations of fluorescence optical sedimentation velocity data analyzed using conventional sedimentation models developed for absorbance and interference optics are largely removed after these adaptations, resulting in excellent fits that highlight the high precision of fluorescence sedimentation velocity data, thus allowing a more detailed quantitative interpretation of the signal boundaries that is otherwise not possible for this system. PMID:24204779
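The first-order corrections described (a radial intensity gradient from scanner movement out of the plane of rotation, and a temporal drift from laser/fluorophore instability) might be sketched as a multiplicative signal model. The form and parameter names here are illustrative assumptions, not SEDFIT's actual implementation:

```python
def correct_fluorescence_signal(raw, r, t, a1, b1, r0=6.5):
    """First-order correction of a fluorescence sedimentation velocity
    reading: divide out a linear radial intensity gradient (slope a1,
    referenced to radius r0 in cm) and a linear temporal drift (rate b1
    per second). All parameters here are hypothetical."""
    return raw / ((1.0 + a1 * (r - r0)) * (1.0 + b1 * t))

# recover a true signal of 0.8 from a reading distorted by both effects
true_signal = 0.8
raw = true_signal * (1.0 + 0.02 * (6.9 - 6.5)) * (1.0 - 0.001 * 100.0)
print(round(correct_fluorescence_signal(raw, 6.9, 100.0, 0.02, -0.001), 3))
```

In practice the gradient and drift parameters are not known a priori; the paper's point is that the sedimentation data themselves carry enough information to estimate them during boundary modeling.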

  10. NASA Human Research Wiki - An Online Collaboration Tool

    NASA Technical Reports Server (NTRS)

    Barr, Y. R.; Rasbury, J.; Johnson, J.; Barsten, K.; Saile, L.; Watkins, S. D.

    2011-01-01

    In preparation for exploration-class missions, the Exploration Medical Capability (ExMC) element of NASA's Human Research Program (HRP) has compiled a large evidence base, which previously was available only to persons within the NASA community. The evidence base is comprised of several types of data, for example: information on more than 80 medical conditions which could occur during space flight, derived from several sources (including data on incidence and potential outcomes of these medical conditions, as captured in the Integrated Medical Model's Clinical Finding Forms). In addition, approximately 35 gap reports are included in the evidence base, identifying current understanding of the medical challenges for exploration, as well as any gaps in knowledge and/or technology that would need to be addressed in order to provide adequate medical support for these novel missions. In an effort to make the ExMC information available to the general public and increase collaboration with subject matter experts within and outside of NASA, ExMC has developed an online collaboration tool, very similar to a wiki, titled the NASA Human Research Wiki. The platform chosen for this data sharing, and the potential collaboration it could generate, is a MediaWiki-based application that would house the evidence, allow "read only" access to all visitors to the website, and editorial access to credentialed subject matter experts who have been approved by the Wiki's editorial board. Although traditional wikis allow users to edit information in real time, the NASA Human Research Wiki includes a peer review process to ensure quality and validity of information. The wiki is also intended to be a pathfinder project for other HRP elements that may want to use this type of web-based tool. The wiki website will be released with a subset of the data described and will continue to be populated throughout the year.

  11. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    NASA Astrophysics Data System (ADS)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ2 with respect to α , and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
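The idea that noise-fitting and information-fitting regimes can be identified from the behavior of χ² with respect to α can be illustrated with a toy crossover detector. Locating the crossover at the point of maximum curvature of log χ² vs log α is a simplification made here for illustration, not the paper's actual consistency condition:

```python
import math

def pick_alpha(alphas, chi2s):
    """Pick the entropy weight at the crossover between the
    noise-fitting plateau (small alpha, flat chi^2) and the
    information-fitting regime (large alpha, rising chi^2), located
    here as the point of maximum curvature of log(chi^2) vs log(alpha).
    """
    la = [math.log10(a) for a in alphas]
    lc = [math.log10(c) for c in chi2s]
    best_i, best_curv = None, float("-inf")
    for i in range(1, len(la) - 1):
        curv = lc[i + 1] - 2 * lc[i] + lc[i - 1]  # second difference
        if curv > best_curv:
            best_curv, best_i = curv, i
    return alphas[best_i]

# synthetic curve: chi^2 flat at 1 below alpha = 1, rising as alpha^2 above
alphas = [10 ** k for k in range(-3, 4)]
chi2s = [max(1.0, a * a) for a in alphas]
print(pick_alpha(alphas, chi2s))  # → 1
```

A production implementation would evaluate χ²(α) on a dense grid and combine this kind of diagnostic with the reliability checks the paper describes.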

  12. Big Data & Learning Analytics: A Potential Way to Optimize eLearning Technological Tools

    ERIC Educational Resources Information Center

    García, Olga Arranz; Secades, Vidal Alonso

    2013-01-01

    In the information age, one of the most influential institutions is education. The recent emergence of MOOCS [Massively Open Online Courses] is a sample of the new expectations that are offered to university students. Basing decisions on data and evidence seems obvious, and indeed, research indicates that data-driven decision-making improves…

  13. Critical Race Theory and Interest Convergence as Analytic Tools in Teacher Education Policies and Practices

    ERIC Educational Resources Information Center

    Milner, H. Richard, IV

    2008-01-01

    In "The Report of the AERA Panel on Research and Teacher Education," Cochran-Smith and Zeichner's (2005) review of studies in the field of teacher education revealed that many studies lacked theoretical and conceptual grounding. The author argues that Derrick Bell's (1980) interest convergence, a principle of critical race theory, can be used as…

  14. Exploring positioning as an analytical tool for understanding becoming mathematics teachers' identities

    NASA Astrophysics Data System (ADS)

    Skog, Kicki; Andersson, Annica

    2015-03-01

    The aim of this article is to explore how a sociopolitical analysis can contribute to a deeper understanding of critical aspects for becoming primary mathematics teachers' identities during teacher education. The question we ask is the following: How may power relations in university settings affect becoming mathematics teachers' subject positioning? We elaborate on the elusive and interrelated concepts of identity, positioning and power, seen as dynamic and changeable. As these concepts represent three interconnected parts of research analysis in an on-going larger project data from different sources will be used in this illustration. In this paper, we clarify the theoretical stance, ground the concepts historically and strive to connect them to research analysis. In this way, we show that power relations and subject positioning in social settings are critical aspects and need to be taken seriously into account if we aim at understanding becoming teachers' identities.

  15. Interactive Publication: The document as a research tool

    PubMed Central

    Thoma, George R.; Ford, Glenn; Antani, Sameer; Demner-Fushman, Dina; Chung, Michael; Simpson, Matthew

    2010-01-01

    The increasing prevalence of multimedia and research data generated by scientific work affords an opportunity to reformulate the idea of a scientific article from the traditional static document, or even one with links to supplemental material in remote databases, to a self-contained, multimedia-rich interactive publication. This paper describes our concept of such a document, and the design of tools for authoring (Forge) and visualization/analysis (Panorama). They are platform-independent applications written in Java, and developed in Eclipse using its Rich Client Platform (RCP) framework. Both applications operate on PDF files with links to XML files that define the media type, location, and action to be performed. We also briefly cite the challenges posed by the potentially large size of interactive publications, the need for evaluating their value to improved comprehension and learning, and the need for their long-term preservation by the National Library of Medicine and other libraries. PMID:20657757

  16. Interactive Publication: The document as a research tool.

    PubMed

    Thoma, George R; Ford, Glenn; Antani, Sameer; Demner-Fushman, Dina; Chung, Michael; Simpson, Matthew

    2010-07-01

    The increasing prevalence of multimedia and research data generated by scientific work affords an opportunity to reformulate the idea of a scientific article from the traditional static document, or even one with links to supplemental material in remote databases, to a self-contained, multimedia-rich interactive publication. This paper describes our concept of such a document, and the design of tools for authoring (Forge) and visualization/analysis (Panorama). They are platform-independent applications written in Java, and developed in Eclipse using its Rich Client Platform (RCP) framework. Both applications operate on PDF files with links to XML files that define the media type, location, and action to be performed. We also briefly cite the challenges posed by the potentially large size of interactive publications, the need for evaluating their value to improved comprehension and learning, and the need for their long-term preservation by the National Library of Medicine and other libraries. PMID:20657757

  17. Conservation of Mass: An Important Tool in Renal Research.

    PubMed

    Sargent, John A

    2016-05-01

    The dialytic treatment of end-stage renal disease (ESRD) patients is based on control of solute concentrations and management of fluid volume. The application of the principle of conservation of mass, or mass balance, is fundamental to the study of such treatment and can be extended to chronic kidney disease (CKD) in general. This review discusses the development and use of mass conservation and transport concepts, incorporated into mathematical models. These concepts, which can be applied to a wide range of solutes of interest, represent a powerful tool for quantitatively guided studies of dialysis issues now and in the future. Incorporating these quantitative concepts into future investigations is key to achieving positive control of known solutes, to analyzing such studies, to relating future research to the known results of prior studies, and to understanding the obligatory physiological perturbations that result from dialysis therapy. PMID:26278776
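The simplest mass-balance model of the kind the review discusses is the single-pool urea kinetic model, in which a well-mixed volume V obeys V·dC/dt = G − K·C during dialysis. A textbook sketch (not a clinically validated implementation; parameter values are illustrative):

```python
import math

def intradialytic_urea(c0, K, V, G, t_end, dt=0.1):
    """Single-pool urea mass balance during dialysis,
        V * dC/dt = G - K * C,
    integrated with forward Euler. c0 is the starting concentration
    (mg/L), K the dialyzer clearance (L/min), V the urea distribution
    volume (L), G the urea generation rate (mg/min), times in minutes.
    """
    c = c0
    for _ in range(int(round(t_end / dt))):
        c += dt * (G - K * c) / V
    return c

# 4-hour treatment with negligible generation: expect ~ c0 * exp(-K*t/V)
c = intradialytic_urea(c0=100.0, K=0.25, V=40.0, G=0.0, t_end=240.0)
print(round(c, 1), round(100.0 * math.exp(-0.25 * 240.0 / 40.0), 1))
```

With G = 0 the model has the closed-form solution C(t) = C0·exp(−Kt/V), which the numerical result reproduces; the dimensionless exponent Kt/V is the familiar dialysis adequacy measure.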

  18. Comprehensive analytical strategy for biomarker identification based on liquid chromatography coupled to mass spectrometry and new candidate confirmation tools.

    PubMed

    Mohamed, Rayane; Varesio, Emmanuel; Ivosev, Gordana; Burton, Lyle; Bonner, Ron; Hopfgartner, Gérard

    2009-09-15

    A comprehensive analytical LC-MS(/MS) platform for low-molecular-weight biomarkers in biological fluids is described. Two complementary retention mechanisms were used in HPLC by optimizing the chromatographic conditions for a reversed-phase column and a hydrophilic interaction chromatography column. LC separation was coupled to mass spectrometry using electrospray ionization operating in positive polarity mode. This strategy enabled us to correctly retain and separate hydrophobic as well as polar analytes. For that purpose, artificial model study samples were generated with a mixture of 38 well-characterized compounds likely to be present in biofluids. The set of compounds was used as a standard aqueous mixture or was spiked into urine at different concentration levels to investigate the capability of the LC-MS(/MS) platform to detect variations across biological samples. Unsupervised data analysis by principal component analysis was performed, followed by principal component variable grouping to find correlated variables. This tool allowed us to distinguish three main groups of variables: (a) background ions (found in all types of samples), (b) ions distinguishing urine samples from aqueous standard and blank samples, and (c) ions related to the spiked compounds. Interpretation of these groups allowed us to identify and eliminate isotopes, adducts, fragments, etc., and to generate a reduced list of m/z candidates. This list was then submitted to the prototype MZSearcher software tool, which simultaneously searches several lists of potential metabolites extracted from metabolomics databases (e.g., KEGG, HMDB) to propose biomarker candidates. Structural confirmation of these candidates was done off-line by fraction collection followed by nanoelectrospray infusion to provide high-quality MS/MS data for spectral database queries. PMID:19702294
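    The unsupervised PCA and variable-grouping step described above can be sketched in a few lines. The toy feature matrix below is invented purely for illustration (real LC-MS data would have thousands of m/z features), but it shows how an isotope or adduct appears as a variable perfectly correlated with its parent ion:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical feature matrix: 6 samples x 5 m/z features.
    # Features 0-1 co-vary (e.g. an ion and its isotope); 2-4 are independent.
    base = rng.normal(size=(6, 1))
    X = np.hstack([base, base * 0.5, rng.normal(size=(6, 3))])

    # PCA via SVD on the mean-centered matrix
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U * s                      # sample coordinates on the PCs
    explained = s**2 / np.sum(s**2)     # variance fraction per component

    # Group correlated variables: isotopes/adducts track their parent ion
    corr = np.corrcoef(X, rowvar=False)
    print(round(corr[0, 1], 2))         # features 0 and 1 move together
    ```

    Grouping variables by correlation in this way is what lets redundant isotope, adduct, and fragment signals be collapsed into a shorter candidate list before database searching.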

  19. Application of metabonomic analytical techniques in the modernization and toxicology research of traditional Chinese medicine

    PubMed Central

    Lao, Yong-Min; Jiang, Jian-Guo; Yan, Lu

    2009-01-01

    In recent years, a wide range of metabonomic analytical techniques has come into use in modern research on traditional Chinese medicine (TCM). At the same time, the international community has attached increasing importance to TCM toxicity problems, and many studies have been implemented to investigate the toxicity mechanisms of TCM. Among these, many metabonomic-based methods have been used to facilitate TCM toxicity investigation. At present, the most prevalent methods for TCM toxicity research are single-analysis techniques using only one analytical means, including nuclear magnetic resonance (NMR), gas chromatography-mass spectrometry (GC-MS), and liquid chromatography-mass spectrometry (LC-MS). With these techniques, favourable outcomes have been gained in studies of toxic reactions to TCM, such as assay of target organs, establishment of action patterns, elucidation of action mechanisms, and exploration of the material foundation of action. However, every analytical technique has its advantages and drawbacks, and no single technique is universally applicable. Multi-technique approaches can partially overcome the shortcomings of single techniques: combining GC-MS and LC-MS metabolic profiling has unravelled the pathological outcomes of aristolochic acid-induced nephrotoxicity, which could not be achieved by a single technique. It is believed that with the further development of metabonomic analytical techniques, especially multi-technique approaches, metabonomics will greatly promote TCM toxicity research and benefit the modernization of TCM by extending the application of modern means to TCM safety assessment, assisting the formulation of TCM safety norms, and establishing international standard indicators. PMID:19508399

  20. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay

    PubMed Central

    Pohanka, Miroslav

    2015-01-01

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite their widespread availability, practical uses of smartphones in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, with the standard Ellman’s assay as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue, and the coloration was photographed using the phone’s integrated camera. An RGB color model was analyzed and color values for the individual channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman’s assay. The smartphone assay proved reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.), and the results indicate that it is of practical applicability. PMID:26110404
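    The RGB analysis described here amounts to averaging each color channel over the photographed strip and relating the channel intensities to enzyme activity. A minimal sketch, with a synthetic array standing in for the actual photo (the function name and all pixel values are hypothetical):

    ```python
    import numpy as np

    def mean_channels(rgb_image):
        """Mean intensity per RGB channel over a region of interest.
        rgb_image: H x W x 3 array of 0-255 values, as a decoded
        smartphone photo would provide."""
        return rgb_image.reshape(-1, 3).mean(axis=0)

    # Hypothetical 4x4 'photo' of a strip turning indigo blue:
    # low red/green and high blue signal more indigo, hence more BChE activity.
    strip = np.zeros((4, 4, 3), dtype=np.uint8)
    strip[..., 2] = 200          # strong blue channel
    strip[..., 0] = 30           # weak red
    r, g, b = mean_channels(strip)
    print(r, g, b)               # 30.0 0.0 200.0
    ```

    In practice a calibration curve fitted against a reference method (here, Ellman's assay) would map a chosen channel value to enzyme activity.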

  1. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory, since they allow sensitive measurements of pollutants to be carried out with easily available instrumentation. A large number of such procedures involve miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or even none at all. Since minimal amounts of reagents are involved and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful for both inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since that instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, with no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  2. The management and exploitation of naturally light-emitting bacteria as a flexible analytical tool: A tutorial.

    PubMed

    Bolelli, L; Ferri, E N; Girotti, S

    2016-08-31

    Conventional detection of toxic contaminants on surfaces, in food, and in the environment takes time. Current analytical approaches to chemical detection can be of limited utility due to long detection times, high costs, and the need for a laboratory and trained personnel. A non-specific but easy, rapid, and inexpensive screening test can be useful to quickly classify a specimen as toxic or non-toxic, so that prompt appropriate measures can be taken exactly where required. Bioluminescent bacteria-based tests meet all these characteristics. Bioluminescence methods are extremely attractive because of their high sensitivity, speed, ease of implementation, and statistical significance. They are usually sensitive enough to detect the majority of pollutants toxic to humans and mammals. This tutorial provides practical guidelines for isolating, cultivating, and exploiting marine bioluminescent bacteria as a simple and versatile analytical tool. Although mostly applied to aqueous-phase samples and organic extracts, the test can also be conducted directly on soil and sediment samples so as to reflect the true toxicity due to the bioavailable fraction. Because tests can be performed with freeze-dried cell preparations, they could make a major contribution to field screening activity. They can be easily conducted in a mobile environmental laboratory and may be adaptable to miniaturized field instruments and field test kits. PMID:27506340

  4. Rethinking the Role of Information Technology-Based Research Tools in Students' Development of Scientific Literacy

    ERIC Educational Resources Information Center

    van Eijck, Michiel; Roth, Wolff-Michael

    2007-01-01

    Given the central place IT-based research tools take in scientific research, the marginal role such tools currently play in science curricula is dissatisfying from the perspective of making students scientifically literate. To appropriately frame the role of IT-based research tools in science curricula, we propose a framework that is developed to…

  5. The GATO gene annotation tool for research laboratories.

    PubMed

    Fujita, A; Massirer, K B; Durham, A M; Ferreira, C E; Sogayar, M C

    2005-11-01

    Large-scale genome projects have generated a rapidly increasing number of DNA sequences. Therefore, development of computational methods to rapidly analyze these sequences is essential for progress in genomic research. Here we present an automatic annotation system for preliminary analysis of DNA sequences. The gene annotation tool (GATO) is a bioinformatics pipeline designed to facilitate routine functional annotation and easy access to annotated genes. It was designed in view of the frequent need of genomic researchers to access data pertaining to a common set of genes. In the GATO system, annotation is generated by querying several Web-accessible resources, and the information is stored in a local database, which keeps a record of all previous annotation results. GATO may be accessed from anywhere through the Internet, or may be run locally if a large number of sequences are to be annotated. It is implemented in PHP and Perl and may be run on any suitable Web server. Installation and use of annotation systems usually require experience and are time consuming, but GATO is simple and practical, allowing anyone with basic informatics skills to use it without special training. GATO can be downloaded at [http://mariwork.iq.usp.br/gato/]. The minimum free disk space required is 2 MB. PMID:16258624

  6. Proteomic analysis of synovial fluid as an analytical tool to detect candidate biomarkers for knee osteoarthritis

    PubMed Central

    Liao, Weixiong; Li, Zhongli; Zhang, Hao; Li, Ji; Wang, Ketao; Yang, Yimeng

    2015-01-01

    We conducted research to detect the proteomic profiles in synovial fluid (SF) from knee osteoarthritis (OA) patients to better understand the pathogenesis and aetiology of OA. Our long-term goal is to identify reliable candidate biomarkers for OA in SF. The SF proteins obtained from 10 knee OA patients and 10 non-OA patients (9 of whom were patients with a meniscus injury in the knee; 1 had a discoid meniscus in the knee, and all exhibited intact articular cartilage) were separated by two-dimensional electrophoresis (2-DE). The repeatability of the obtained protein spots regarding their intensity was tested via triplicate 2-DE of selected samples. The observed protein expression patterns were subjected to statistical analysis, and differentially expressed protein spots were identified via matrix-assisted laser desorption/ionisation-time of flight/time of flight mass spectrometry (MALDI-TOF/TOF MS). Our analyses showed low intrasample variability and clear intersample variation. Among the protein spots observed on the gels, there were 29 significant differences, of which 22 corresponded to upregulation and 7 to downregulation in the OA group. One of the upregulated protein spots was confirmed to be haptoglobin by mass spectrometry, and the levels of haptoglobin in SF are positively correlated with the severity of OA (r = 0.89, P < 0.001). This study showed that 2-DE could be used under standard conditions to screen SF samples and identify a small subset of proteins in SF that are potential markers associated with OA. Spots of interest identified by mass spectrometry, such as haptoglobin, may be associated with OA severity. PMID:26617706

  7. High-resolution entrainment mapping of gastric pacing: a new analytical tool.

    PubMed

    O'Grady, Gregory; Du, Peng; Lammers, Wim J E P; Egbuji, John U; Mithraratne, Pulasthi; Chen, Jiande D Z; Cheng, Leo K; Windsor, John A; Pullan, Andrew J

    2010-02-01

    Gastric pacing has been investigated as a potential treatment for gastroparesis. New pacing protocols are required to improve symptom and motility outcomes; however, research progress has been constrained by a limited understanding of the effects of electrical stimulation on slow-wave activity. This study introduces high-resolution (HR) "entrainment mapping" for the analysis of gastric pacing and presents four demonstrations. Gastric pacing was initiated in a porcine model (typical amplitude 4 mA, pulse width 400 ms, period 17 s). Entrainment mapping was performed using flexible multielectrode arrays (

  8. Informetric Theories and Methods for Exploring the Internet: An Analytical Survey of Recent Research Literature.

    ERIC Educational Resources Information Center

    Bar-Ilan, Judit; Peritz, Bluma C.

    2002-01-01

    Presents a selective review of research based on the Internet, using bibliometric and informetric methods and tools. Highlights include data collection methods on the Internet, including surveys, logging, and search engines; and informetric analysis, including citation analysis and content analysis. (Contains 78 references.) (Author/LRW)

  9. IT Tools for Teachers and Scientists, Created by Undergraduate Researchers

    NASA Astrophysics Data System (ADS)

    Millar, A. Z.; Perry, S.

    2007-12-01

    Interns in the Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) program conduct computer science research for the benefit of earthquake scientists and have created products in growing use within the SCEC education and research communities. SCEC/UseIT comprises some twenty undergraduates who combine their varied talents and academic backgrounds to achieve a Grand Challenge that is formulated around the needs of SCEC scientists and educators and that reflects the value SCEC places on the integration of computer science and the geosciences. In meeting the challenge, students learn to work on multidisciplinary teams and to tackle complex problems with no guaranteed solutions. Meanwhile, their efforts bring fresh perspectives and insight to the professionals with whom they collaborate, and consistently produce innovative, useful tools for research and education. The 2007 Grand Challenge was to design and prototype serious games to communicate important earthquake science concepts. Interns organized themselves into four game teams (the Educational Game, the Training Game, the Mitigation Game, and the Decision-Making Game) and created four diverse games, with topics ranging from elementary plate tectonics to earthquake risk mitigation and intended players ranging from elementary students to city planners. The games were designed to be versatile, to accommodate variation in the knowledge base of the player, and extensible, to accommodate future additions. The games are played in a web browser or from within SCEC-VDO (Virtual Display of Objects). SCEC-VDO, also engineered by UseIT interns, is a 4D, interactive visualization software package that enables integration and exploration of datasets and models such as faults, earthquake hypocenters and ruptures, digital elevation models, satellite imagery, global isochrons, and earthquake prediction schemes.
SCEC-VDO enables the user to create animated movies during a session, and is now part

  10. Stem diameter variations as a versatile research tool in ecophysiology.

    PubMed

    De Swaef, Tom; De Schepper, Veerle; Vandegehuchte, Maurits W; Steppe, Kathy

    2015-10-01

    High-resolution stem diameter variations (SDV) are widely recognized as a useful drought stress indicator and have therefore been used in many irrigation scheduling studies. More recently, SDV have been used in combination with other plant measurements and biophysical modelling to study fundamental mechanisms underlying whole-plant functioning and growth. The present review aims to scrutinize the important insights emerging from these more recent SDV applications to identify trends in ongoing fundamental research. The main mechanism underlying SDV is variation in water content in stem tissues, originating from reversible shrinkage and swelling of dead and living tissues, and irreversible growth. The contribution of different stem tissues to the overall SDV signal is currently under debate and shows variation with species and plant age, but can be investigated by combining SDV with state-of-the-art technology like magnetic resonance imaging. Various physiological mechanisms, such as water and carbon transport, and mechanical properties influence the SDV pattern, making it an extensive source of information on dynamic plant behaviour. To unravel these dynamics and to extract information on plant physiology or plant biophysics from SDV, mechanistic modelling has proved to be valuable. Biophysical models integrate different mechanisms underlying SDV, and help us to explain the resulting SDV signal. Using an elementary modelling approach, we demonstrate the application of SDV as a tool to examine plant water relations, plant hydraulics, plant carbon relations, plant nutrition, freezing effects, plant phenology and dendroclimatology. In the ever-expanding SDV knowledge base we identified two principal research tracks. First, in detailed short-term experiments, SDV measurements are combined with other plant measurements and modelling to discover patterns in phloem turgor, phloem osmotic concentrations, root pressure and plant endogenous control. 
Second, long-term SDV time

  11. RTNS-II - a fusion materials research tool

    NASA Astrophysics Data System (ADS)

    Logan, C. M.; Heikkinen, D. W.

    1982-09-01

    Rotating Target Neutron Source-II (RTNS-II) is a national facility for fusion materials research. It contains two 14 MeV neutron sources. Deuterons are accelerated to ~400 keV and transported to a rotating titanium tritide target. Present source strength is greater than 1 × 10¹³ n/s and source diameter is 1 cm FWHM. An air-levitated vacuum seal permits rotation of the target at 5000 rpm with negligible impact on accelerator vacuum system gas load. Targets are cooled by chilled water flowing through internal channels in a copper alloy substrate. Substrates are produced by solid-state diffusion bonding of two sheets, one containing etched cooling channels. An electroforming process is being developed which will reduce substrate cost and improve reliability. Titanium tritide coating thickness is ~10 μm, giving an initial tritium inventory for the present 23 cm diameter targets of 3.7 × 10⁷ MBq. Operating interval between target changes is typically about 80 h. Thirteen laboratories and universities have participated in the experimental program at RTNS-II. Most measurements have been directed at understanding defect production and low-dose damage microstructure. The principal diagnostic tools have been cryogenic resistivity measurements, mechanical properties assessment and transmission electron microscopy. Some engineering tests have been conducted in support of near-term magnetic confinement experiments and of reactor materials which will see small lifetime doses.

  12. Microgravity as a research tool to improve US agriculture

    NASA Astrophysics Data System (ADS)

    Bula, R. J.; Stankovic, Bratislav

    2000-01-01

    Crop production and utilization are undergoing significant modifications and improvements that emanate from adaptation of recently developed plant biotechnologies. Several innovative technologies will impact US agriculture in the next century. One of these is the transfer of desirable genes from organisms to economically important crop species in a way that cannot be accomplished with traditional plant breeding techniques. Such plant genetic engineering offers opportunities to improve crop species for a number of characteristics as well as use as source materials for specific medical and industrial applications. Although plant genetic engineering is having an impact on development of new crop cultivars, several major constraints limit the application of this technology to selected crop species and genotypes. Consequently, gene transfer systems that overcome these constraints would greatly enhance development of new crop materials. If results of a recent gene transfer experiment conducted in microgravity during a Space Shuttle mission are confirmed, and with the availability of the International Space Station as a permanent space facility, commercial plant transformation activity in microgravity could become a new research tool to improve US agriculture.

  13. The capsicum transcriptome DB: a "hot" tool for genomic research.

    PubMed

    Góngora-Castillo, Elsa; Fajardo-Jaime, Rubén; Fernández-Cortes, Araceli; Jofre-Garfias, Alba E; Lozoya-Gloria, Edmundo; Martínez, Octavio; Ochoa-Alejo, Neftalí; Rivera-Bustamante, Rafael

    2012-01-01

    Chili pepper (Capsicum annuum) is an economically important crop with no publicly available genome sequence. We describe a genomic resource to facilitate Capsicum annuum research. A collection of Expressed Sequence Tags (ESTs) derived from five C. annuum organs (root, stem, leaf, flower and fruit) was sequenced using the Sanger method, and multiple leaf transcriptomes were deeply sampled using GS-pyrosequencing. A hybrid assembly of 1,324,516 raw reads yielded 32,314 high-quality contigs, as validated by coverage and identity analysis against existing pepper sequences. Overall, 75.5% of the contigs had significant sequence similarity to entries in nucleic acid and protein databases; 23% of the sequences have not been previously reported for C. annuum and expand the sequence resources for this species. A MySQL database and a user-friendly Web interface were constructed with search tools that permit queries of the ESTs, including sequence, functional annotation, Gene Ontology classification, metabolic pathways, and assembly information. The Capsicum Transcriptome DB is freely available from http://www.bioingenios.ira.cinvestav.mx:81/Joomla/ PMID:22359434

  14. Analytical Ultracentrifugation and Its Role in Development and Research of Therapeutical Proteins.

    PubMed

    Liu, Jun; Yadav, Sandeep; Andya, James; Demeule, Barthélemy; Shire, Steven J

    2015-01-01

    The historical contributions of analytical ultracentrifugation (AUC) to modern biology and biotechnology drug development and research are discussed. AUC developed by Svedberg was used to show that proteins are actually large defined molecular entities and also provided the first experimental verification for the semiconservative replication model for DNA initially proposed by Watson and Crick. This chapter reviews the use of AUC to investigate molecular weight of recombinant-DNA-produced proteins, complex formation of antibodies, intermolecular interactions in dilute and high concentration protein solution, and their impact on physical properties such as solution viscosity. Recent studies using a "competitive binding" analysis by AUC have been useful in critically evaluating the design and interpretation of surface plasmon resonance measurements and are discussed. The future of this technology is also discussed including prospects for a new higher precision analytical ultracentrifuge. PMID:26412663

  15. Concept Mapping as a Research Tool to Evaluate Conceptual Change Related to Instructional Methods

    ERIC Educational Resources Information Center

    Miller, Kevin J.; Koury, Kevin A.; Fitzgerald, Gail E.; Hollingsead, Candice; Mitchem, Katherine J.; Tsai, Hui-Hsien; Park, Meeaeng Ko

    2009-01-01

    Concept maps are commonly used in a variety of educational settings as a learning aid or instructional tool. Additionally, their potential as a research tool has been recognized. This article defines features of concept maps, describes the use of pre- and postconcept maps as a research tool, and offers a protocol for employing concept maps as an…

  16. Analytic element ground water modeling as a research program (1980 to 2006).

    PubMed

    Kraemer, Stephen R

    2007-01-01

    Scientists and engineers who use the analytic element method (AEM) for solving problems of regional ground water flow may be considered a community, and this community can be studied from the perspective of history and philosophy of science. Applying the methods of the Hungarian philosopher of science Imre Lakatos (1922 to 1974), the AEM "research program" is distinguished by its hard core (theoretical basis), protective belt (auxiliary assumptions), and heuristic (problem solving machinery). AEM has emerged relatively recently in the scientific literature and has a relatively modest number of developers and practitioners compared to the more established finite-element and finite-difference methods. Nonetheless, there is evidence to support the assertion that the AEM research program remains in a progressive phase. The evidence includes an expanding publication record, a growing research strand following Professor Otto Strack's book Groundwater Mechanics (1989), the continued placement of AEM researchers in academia, and the further development of innovative analytical solutions and computational solvers/models. PMID:17600570
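    The problem-solving heuristic of the AEM research program rests on linear superposition of closed-form solutions ("elements"). A minimal sketch, assuming the classic textbook pair of elements, uniform flow plus a single well in 2-D, expressed as a complex potential (all parameter values and the sign convention below are hypothetical illustrations, not a reproduction of any particular solver):

    ```python
    import cmath

    def complex_potential(z, q_well, z_well, u_uniform):
        """Complex discharge potential for uniform flow plus a single well,
        two analytic elements superposed linearly:
        Omega(z) = -U*z + (Q / (2*pi)) * ln(z - z_well)."""
        return -u_uniform * z + (q_well / (2 * cmath.pi)) * cmath.log(z - z_well)

    # Hypothetical elements: uniform flow U = 1 m^2/d, well Q = 100 m^3/d at origin
    omega = complex_potential(complex(50, 10), 100.0, 0j, 1.0)
    phi, psi = omega.real, omega.imag   # discharge potential and stream function
    ```

    Because each element is analytic, adding more wells, line-sinks, or inhomogeneities is just adding more terms to the sum, which is the extensibility that the "hard core" of the AEM program exploits.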

  17. Tools for Linking Research and Practice in the Helping Professions: Research Abstract Worksheets and Personal Reviews of the Literature.

    ERIC Educational Resources Information Center

    Burlingame, Martin

    This document is comprised of four chapters that show how to use research-abstract worksheets and personal reviews of the literature as tools for linking research and practice in the helping professions. The research tools help to condense lengthy reports, place them into a consistent format, and actively involve the information seeker. Chapter 1…

  18. Researcher Effects on Mortality Salience Research: A Meta-Analytic Moderator Analysis

    ERIC Educational Resources Information Center

    Yen, Chih-Long; Cheng, Chung-Ping

    2013-01-01

    A recent meta-analysis of 164 terror management theory (TMT) papers indicated that mortality salience (MS) yields substantial effects (r = 0.35) on worldview and self-esteem-related dependent variables (B. L. Burke, A. Martens, & E. H. Faucher, 2010). This study reanalyzed the data to explore the researcher effects of TMT. By cluster-analyzing…

  19. Concept Maps as a Research and Evaluation Tool To Assess Conceptual Change in Quantum Physics.

    ERIC Educational Resources Information Center

    Sen, Ahmet Ilhan

    2002-01-01

    Informs teachers about using concept maps as a learning tool and alternative assessment tools in education. Presents research results of how students might use concept maps to communicate their cognitive structure. (Author/KHR)

  20. Soft x-ray microscopy - a powerful analytical tool to image magnetism down to fundamental length and time scales

    SciTech Connect

    Fischer, Peter

    2008-08-01

    The magnetic properties of low-dimensional solid-state matter are of the utmost interest both scientifically and technologically. In addition to the charge of the electron, which is the basis for current electronics, taking the spin degree of freedom into account opens a new avenue for future spintronics applications. Progress towards a better physical understanding of the mechanisms and principles involved, as well as potential applications of nanomagnetic devices, can only be achieved with advanced analytical tools. Soft X-ray microscopy, providing a spatial resolution approaching 10 nm, a time resolution currently in the sub-ns regime, and inherent elemental sensitivity, is a very promising technique for this purpose. This article reviews recent achievements of magnetic soft X-ray microscopy through selected examples of spin-torque phenomena, stochastic behavior on the nanoscale, and spin dynamics in magnetic nanopatterns. The future potential with regard to addressing fundamental magnetic length and time scales, e.g., imaging femtosecond spin dynamics at upcoming X-ray sources, is also pointed out.

  1. Extensions of the Johnson-Neyman Technique to Linear Models With Curvilinear Effects: Derivations and Analytical Tools.

    PubMed

    Miller, Jason W; Stromeyer, William R; Schwieterman, Matthew A

    2013-03-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way interactions in several types of linear models, this method has not been extended to include quadratic terms or more complicated models involving quadratic terms and interactions. Curvilinear relations of this type are incorporated in several theories in the social sciences. This article extends the J-N method to such linear models along with presenting freely available online tools that implement this technique as well as the traditional pick-a-point approach. Algebraic and graphical representations of the proposed J-N extension are provided. An example is presented to illustrate the use of these tools and the interpretation of findings. Issues of reliability as well as "spurious moderator" effects are discussed along with recommendations for future research. PMID:26741727
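    For the simplest case the abstract builds on, a two-way interaction model y = b0 + b1*x + b2*z + b3*x*z, the J-N boundaries come from setting the simple slope's t-statistic equal to the critical value, which yields a quadratic in z. A sketch under that assumption (the coefficient estimates and covariance values below are hypothetical):

    ```python
    import math

    def jn_boundaries(b1, b3, v11, v13, v33, t_crit):
        """Johnson-Neyman boundaries for the simple slope (b1 + b3*z) of a
        focal predictor x in y = b0 + b1*x + b2*z + b3*x*z.  Solves
        (b1 + b3*z)^2 = t^2 * (v11 + 2*z*v13 + z^2*v33) for z, where
        v11, v13, v33 are elements of the coefficient covariance matrix."""
        a = b3**2 - t_crit**2 * v33
        b = 2 * (b1 * b3 - t_crit**2 * v13)
        c = b1**2 - t_crit**2 * v11
        disc = b**2 - 4 * a * c
        if disc < 0:
            return None                  # no real boundaries in z
        root = math.sqrt(disc)
        return sorted(((-b - root) / (2 * a), (-b + root) / (2 * a)))

    # Hypothetical estimates: b1 = 0.5, b3 = 0.8, with small sampling variances
    lo, hi = jn_boundaries(0.5, 0.8, v11=0.04, v13=0.0, v33=0.01, t_crit=1.96)
    ```

    The article's extension to quadratic terms follows the same logic, but the simple slope becomes a function of both the moderator and the focal predictor, so the regions of significance are no longer a single interval in z.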

  2. Research subjects for analytical estimation of core degradation at Fukushima-Daiichi nuclear power plant

    SciTech Connect

    Nagase, F.; Ishikawa, J.; Kurata, M.; Yoshida, H.; Kaji, Y.; Shibamoto, Y.; Amaya, M.; Okumura, K.; Katsuyama, J.

    2013-07-01

    Estimation of the accident progression and of conditions inside the reactor pressure vessels (RPV) and primary containment vessels (PCV) is required for the appropriate conduct of decommissioning at the Fukushima-Daiichi NPP. For that, it is necessary to obtain additional experimental data and revised models so that estimations with computer codes achieve increased accuracy. The Japan Atomic Energy Agency (JAEA) has selected phenomena to be reviewed and developed, considering previously obtained information, conditions specific to the Fukushima-Daiichi accident, and recent progress in experimental and analytical technologies. As a result, research and development items have been identified in terms of thermal-hydraulic behavior in the RPV and PCV, progression of fuel bundle degradation, failure of the RPV lower head, and analysis of the accident. This paper introduces the selected phenomena to be reviewed and developed, the research plans, and recent results from JAEA's corresponding research programs. (authors)

  3. Dynamic Visual Acuity: a Functionally Relevant Research Tool

    NASA Technical Reports Server (NTRS)

    Peters, Brian T.; Brady, Rachel A.; Miller, Chris A.; Mulavara, Ajitkumar P.; Wood, Scott J.; Cohen, Helen S.; Bloomberg, Jacob J.

    2010-01-01

    Coordinated movements between the eyes and head are required to maintain a stable retinal image during head and body motion. The vestibulo-ocular reflex (VOR) plays a significant role in this gaze control system that functions well for most daily activities. However, certain environmental conditions or interruptions in normal VOR function can lead to inadequate ocular compensation, resulting in oscillopsia, or blurred vision. It is therefore possible to use acuity to determine when the environmental conditions, VOR function, or the combination of the two is not conducive to maintaining clear vision. Over several years we have designed and tested several tests of dynamic visual acuity (DVA). Early tests used the difference between standing and walking acuity to assess decrements in the gaze stabilization system after spaceflight. Supporting ground-based studies measured the responses from patients with bilateral vestibular dysfunction and explored the effects of visual target viewing distance and gait cycle events on walking acuity. Results from these studies show that DVA is affected by spaceflight, is degraded in patients with vestibular dysfunction, changes with target distance, and is not consistent across the gait cycle. We have recently expanded our research to include studies in which seated subjects are translated or rotated passively. Preliminary results from this work indicate that gaze stabilization ability may differ between similar active and passive conditions, may change with age, and can be affected by the location of the visual target with respect to the axis of motion. Use of DVA as a diagnostic tool is becoming more popular, but the functional nature of the acuity outcome measure also makes it ideal for identifying conditions that could lead to degraded vision. By doing so, steps can be taken to alter the problematic environments to improve the man-machine interface and optimize performance.

  4. Applied Analytical Combustion/emissions Research at the NASA Lewis Research Center - a Progress Report

    NASA Technical Reports Server (NTRS)

    Deur, J. M.; Kundu, K. P.; Nguyen, H. L.

    1992-01-01

    Emissions of pollutants from future commercial transports are a significant concern. As a result, the Lewis Research Center (LeRC) is investigating various low emission combustor technologies. As part of this effort, a combustor analysis code development program was pursued to guide the combustor design process, to identify concepts having the greatest promise, and to optimize them at the lowest cost in the minimum time.

  5. Applied analytical combustion/emissions research at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Deur, J. M.; Kundu, K. P.; Nguyen, H. L.

    1992-01-01

    Emissions of pollutants from future commercial transports are a significant concern. As a result, the Lewis Research Center (LeRC) is investigating various low emission combustor technologies. As part of this effort, a combustor analysis code development program was pursued to guide the combustor design process, to identify concepts having the greatest promise, and to optimize them at the lowest cost in the minimum time.

  6. Giving Raw Data a Chance to Talk: A Demonstration of Exploratory Visual Analytics with a Pediatric Research Database Using Microsoft Live Labs Pivot to Promote Cohort Discovery, Research, and Quality Assessment

    PubMed Central

    Viangteeravat, Teeradache; Nagisetty, Naga Satya V. Rao

    2014-01-01

    Secondary use of large and open data sets provides researchers with an opportunity to address high-impact questions that would otherwise be prohibitively expensive and time consuming to study. Despite the availability of data, generating hypotheses from huge data sets is often challenging, and the lack of complex analysis of data might lead to weak hypotheses. To overcome these issues and to assist researchers in building hypotheses from raw data, we are working on a visual and analytical platform called PRD Pivot. PRD Pivot is a de-identified pediatric research database designed to make secondary use of rich data sources, such as the electronic health record (EHR). The development of visual analytics using Microsoft Live Labs Pivot makes the process of data elaboration, information gathering, knowledge generation, and complex information exploration transparent to tool users and provides researchers with the ability to sort and filter by various criteria, which can lead to strong, novel hypotheses. PMID:24808811

  7. The MOOC and Learning Analytics Innovation Cycle (MOLAC): A Reflective Summary of Ongoing Research and Its Challenges

    ERIC Educational Resources Information Center

    Drachsler, H.; Kalz, M.

    2016-01-01

    The article deals with the interplay between learning analytics and massive open online courses (MOOCs) and provides a conceptual framework to situate ongoing research in the MOOC and learning analytics innovation cycle (MOLAC framework). The MOLAC framework is organized on three levels: On the micro-level, the data collection and analytics…

  8. Empirical-Analytical Methodological Research in Environmental Education: Response To a Negative Trend in Methodological and Ideological Discussions.

    ERIC Educational Resources Information Center

    Connell, Sharon

    1997-01-01

    Explores the current status of the empirical-analytical methodology and its "positivist" ideologies in environmental education research through the critical analysis of three criticisms outlined in an article by Robottom and Hart. Suggests that the criticisms misrepresent empirical-analytical methodology in their dismissal of it as behaviorist…

  9. Feasibility model of a high reliability five-year tape transport. Volume 3: Appendices. [detailed drawing and analytical tools used in analyses

    NASA Technical Reports Server (NTRS)

    Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.

    1973-01-01

    Detailed drawings of the five year tape transport are presented. Analytical tools used in the various analyses are described. These analyses include: tape guidance, tape stress over crowned rollers, tape pack stress program, response (computer) program, and control system electronics description.

  10. Computer as Research Tools 4. Use Your PC More Effectively

    NASA Astrophysics Data System (ADS)

    Baba, Hajime

    This article describes useful tools available on personal computers. Electronic dictionaries, a full-text search system, simple usage of a preprint server, and a numeric computation language for applications in engineering and science are introduced.

  11. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.

  12. MEETING TODAY'S EMERGING CONTAMINANTS WITH TOMORROW'S RESEARCH TOOL

    EPA Science Inventory

    This presentation will explore the many facets of research and development for emerging contaminants within the USEPA's National Exposure Research Laboratories (Athens, Cincinnati, Las Vegas, and Research Triangle Park).

  13. Research and learning opportunities in a reactor-based nuclear analytical laboratory

    SciTech Connect

    Robinson, L. (Chemical and Analytical Sciences Div.); Brown, D.H.

    1994-10-01

    Although considered by many to be a mature science, neutron activation analysis (NAA) continues to be a valuable tool in trace-element research applications. Examples of the applicability of NAA can be found in a variety of areas including archaeology, environmental science, epidemiology, forensic science, and material science to name a few. Each stage of NAA provides opportunities to share numerous practical and fundamental scientific principles with high school teachers and students. This paper will present an overview of these opportunities and give a specific example from collaboration with a high school teacher whose research involved the automation of a gamma-ray spectroscopy counting system using a laboratory robot.

  14. Analytical combustion/emissions research related to the NASA high-speed research program

    NASA Technical Reports Server (NTRS)

    Nguyen, H. Lee

    1991-01-01

    Increasing the pressure and temperature of the engines of new generation supersonic airliners increases the emissions of nitrogen oxides to a level that would have an adverse impact on the Earth's protective ozone layer. In the process of implementing low emissions combustor technologies, NASA Lewis Research Center has pursued a combustion analysis program to guide combustor design processes, to identify potential concepts of greatest promise, and to optimize them at low cost, with short turn-around time. The approach is to upgrade and apply advanced computer programs for gas turbine applications. Efforts have been made to improve the code capabilities of modeling the physics. Test cases and experiments are used for code validation. To provide insight into the combustion process and combustor design, two-dimensional and three-dimensional codes such as KIVA-II and LeRC 3D have been used. These codes are operational and calculations have been performed to guide low emissions combustion experiments.

  15. Big data, advanced analytics and the future of comparative effectiveness research.

    PubMed

    Berger, Marc L; Doban, Vitalii

    2014-03-01

    The intense competition that accompanied the growth of internet-based companies ushered in the era of 'big data' characterized by major innovations in processing of very large amounts of data and the application of advanced analytics including data mining and machine learning. Healthcare is on the cusp of its own era of big data, catalyzed by the changing regulatory and competitive environments, fueled by growing adoption of electronic health records, as well as efforts to integrate medical claims, electronic health records and other novel data sources. Applying the lessons from big data pioneers will require healthcare and life science organizations to make investments in new hardware and software, as well as in individuals with different skills. For life science companies, this will impact the entire pharmaceutical value chain from early research to postcommercialization support. More generally, this will revolutionize comparative effectiveness research. PMID:24645690

  16. Analytical and physical modeling program for the NASA Lewis Research Center's Altitude Wind Tunnel (AWT)

    NASA Technical Reports Server (NTRS)

    Abbott, J. M.; Deidrich, J. H.; Groeneweg, J. F.; Povinelli, L. A.; Reid, L.; Reinmann, J. J.; Szuch, J. R.

    1985-01-01

    An effort is currently underway at the NASA Lewis Research Center to rehabilitate and extend the capabilities of the Altitude Wind Tunnel (AWT). This extended capability will include a maximum test section Mach number of about 0.9 at an altitude of 55,000 ft and a -20 F stagnation temperature (octagonal test section, 20 ft across the flats). In addition, the AWT will include an icing and acoustic research capability. In order to ensure a technically sound design, an AWT modeling program (both analytical and physical) was initiated to provide essential input to the AWT final design process. This paper describes the modeling program, including the rationale and criteria used in program definition, and presents some early program results.

  17. Justice at the millennium: a meta-analytic review of 25 years of organizational justice research.

    PubMed

    Colquitt, J A; Conlon, D E; Wesson, M J; Porter, C O; Ng, K Y

    2001-06-01

    The field of organizational justice continues to be marked by several important research questions, including the size of relationships among justice dimensions, the relative importance of different justice criteria, and the unique effects of justice dimensions on key outcomes. To address such questions, the authors conducted a meta-analytic review of 183 justice studies. The results suggest that although different justice dimensions are moderately to highly related, they contribute incremental variance explained in fairness perceptions. The results also illustrate the overall and unique relationships among distributive, procedural, interpersonal, and informational justice and several organizational outcomes (e.g., job satisfaction, organizational commitment, evaluation of authority, organizational citizenship behavior, withdrawal, performance). These findings are reviewed in terms of their implications for future research on organizational justice. PMID:11419803

  18. Data-infilling in daily mean river flow records: first results using a visual analytics tool (gapIT)

    NASA Astrophysics Data System (ADS)

    Giustarini, Laura; Parisot, Olivier; Ghoniem, Mohammad; Trebs, Ivonne; Médoc, Nicolas; Faber, Olivier; Hostache, Renaud; Matgen, Patrick; Otjacques, Benoît

    2015-04-01

    Missing data in river flow records represent a loss of information and a serious drawback in water management. An incomplete time series prevents the computation of hydrological statistics and indicators. Also, records with data gaps are not suitable as input or validation data for hydrological or hydrodynamic modelling. In this work we present a visual analytics tool (gapIT), which supports experts in finding the most adequate data-infilling technique for daily mean river flow records. The tool performs an automated calculation of river flow estimates using different data-infilling techniques. Donor station(s) are automatically selected based on Dynamic Time Warping, geographical proximity and upstream/downstream relationships. For each gap the tool computes several flow estimates through various data-infilling techniques, including interpolation, multiple regression, regression trees and neural networks. The visual application also allows the user to select donor station(s) other than those automatically proposed. The gapIT software was applied to 24 daily time series of river discharge recorded in Luxembourg over the period 01/01/2007 - 31/12/2013. The method was validated by randomly creating artificial gaps of different lengths and positions along the entire records. Using the RMSE and the Nash-Sutcliffe (NS) coefficient as performance measures, the method was evaluated by comparison with the actual measured discharge values. The application of the gapIT software to artificial gaps led to satisfactory results in terms of performance indicators (NS>0.8 for more than half of the artificial gaps). A case-by-case analysis revealed that the limited number of reconstructed record gaps characterized by high RMSE values (NS<0.8) were caused by the temporary unavailability of the most appropriate donor station.
On the other hand, some of the gaps characterized by a high accuracy of the reconstructed record were filled by using the data from
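The regression-based infilling described above, reduced to a single donor station for illustration (station data and helper names below are invented; the NS coefficient is included because it is the validation measure the authors use):

```python
import numpy as np

def fill_gaps_from_donor(target, donor):
    """Fill NaN gaps in a target discharge record by linear regression
    against a donor station, fitted on jointly observed days. A minimal
    single-donor stand-in for the multiple-regression technique above."""
    target = np.asarray(target, dtype=float)
    donor = np.asarray(donor, dtype=float)
    both = ~np.isnan(target) & ~np.isnan(donor)
    slope, intercept = np.polyfit(donor[both], target[both], 1)
    filled = target.copy()
    gaps = np.isnan(target) & ~np.isnan(donor)
    filled[gaps] = slope * donor[gaps] + intercept
    return filled

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)
```

Validation then proceeds as in the paper: punch artificial gaps into a complete record, infill them, and compare the reconstruction against the withheld measurements.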

  19. Accelerator mass spectrometry as a bioanalytical tool for nutritional research

    SciTech Connect

    Vogel, J.S.; Turteltaub, K.W.

    1997-09-01

    Accelerator Mass Spectrometry (AMS) is a mass spectrometric method of detecting long-lived radioisotopes without regard to their decay products or half-life. The technique is normally applied to geochronology, but has recently been developed for bioanalytical tracing. AMS detects isotope concentrations down to parts per quadrillion, quantifying labeled biochemicals to attomole levels in milligram-sized samples. Its advantages over non-isotopic and stable isotope labeling methods are reviewed, and examples of analytical integrity, sensitivity, specificity, and applicability are provided.

  20. "This Ain't the Projects": A Researcher's Reflections on the Local Appropriateness of Our Research Tools

    ERIC Educational Resources Information Center

    Martinez, Danny C.

    2016-01-01

    In this article I examine the ways in which Black and Latina/o urban high school youth pressed me to reflexively examine my positionality and that of my research tools during a year-long ethnographic study documenting their communicative repertoires. I reflect on youth comments on my researcher tools, as well as myself, in order to wrestle with…

  1. [Research on infrared safety protection system for machine tool].

    PubMed

    Zhang, Shuan-Ji; Zhang, Zhi-Ling; Yan, Hui-Ying; Wang, Song-De

    2008-04-01

    In order to ensure personal safety and prevent injury accidents during machine tool operation, an infrared machine tool safety system was designed with an infrared transmitting-receiving module, a memory self-locking relay and a voice recording-playing module. When the operator does not enter the danger area, the system has no response. Once the operator's body, in whole or in part, enters the danger area and blocks the infrared beam, the system alarms and outputs a control signal to the machine tool's executive element, at the same time bringing the machine tool to an emergency stop to prevent equipment damage and personal injury. The system has a modular framework and many advantages, including safety, reliability, general applicability, circuit simplicity, maintenance convenience, low power consumption, low cost, working stability, easy debugging, vibration resistance and interference resistance. It is suitable for installation and use in different machine tools such as punch presses, plastic injection machines, numerically controlled machine tools, plate cutting machines, pipe bending machines, hydraulic presses, etc. PMID:18619302

  2. Structural and compositional changes of dissolved organic matter upon solid-phase extraction tracked by multiple analytical tools.

    PubMed

    Chen, Meilian; Kim, Sunghwan; Park, Jae-Eun; Jung, Heon-Jae; Hur, Jin

    2016-09-01

    Although PPL-based solid-phase extraction (SPE) has been widely used before dissolved organic matter (DOM) analyses via advanced measurements such as ultrahigh resolution Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR-MS), much is still unknown about the structural and compositional changes in the DOM pool caused by SPE. In this study, selected DOM from various sources were tested to elucidate the differences before and after SPE, utilizing multiple analytical tools including fluorescence spectroscopy, FT-ICR-MS, and size exclusion chromatography with organic carbon detection (SEC-OCD). Changes in specific UV absorbance indicated a decrease in aromaticity after SPE, suggesting a preferential exclusion of aromatic DOM structures, which was also confirmed by the substantial reduction of fluorescent DOM (FDOM). Furthermore, SEC-OCD results exhibited very low recoveries (1-9%) for the biopolymer fraction, implying that PPL sorbents need to be used cautiously in SPE when treating high molecular weight compounds (i.e., polysaccharides, proteins, and amino sugars). A careful examination via FT-ICR-MS revealed that the formulas lost in SPE may all be DOM source-dependent. Nevertheless, the dominant missing compound groups were identified to be the tannins group with high O/C ratios (>0.7), lignins/carboxyl-rich alicyclic molecules (CRAM), aliphatics with high H/C (>1.5), and heteroatomic formulas, all dominated by pseudo-analogous molecular formula families differing by methylene (-CH2) units. Our findings shed new light on potential changes in the compound composition and molecular weight of DOM upon SPE, implying that precaution is needed in data interpretation. Graphical Abstract: Tracking the characteristics of DOM from various origins upon PPL-based SPE utilizing EEM-PARAFAC, SEC-OCD, and FT-ICR-MS. PMID:27387996

  3. 'Model' or 'tool'? New definitions for translational research.

    PubMed

    Sive, Hazel

    2011-03-01

    The term 'model' often describes non-human biological systems that are used to obtain a better understanding of human disorders. According to the most stringent definition, an animal 'model' would display exactly the same phenotype as seen in the relevant human disorder; however, this precise correspondence is often not present. In this Editorial, I propose the alternative, broader term 'tool' to describe a biological system that does not obviously (or precisely) recapitulate a human disorder, but that nonetheless provides useful insight into the etiology or treatment of that disorder. Applying the term 'tool' to biological systems used in disease-related studies will help to identify those systems that can most effectively address mechanisms underlying human disease. Conversely, differentiating 'models' from 'tools' will help to define more clearly the limitations of biological systems used in preclinical analyses. PMID:21357758

  4. Nearly arc-length tool path generation and tool radius compensation algorithm research in FTS turning

    NASA Astrophysics Data System (ADS)

    Zhao, Minghui; Zhao, Xuesen; Li, Zengqiang; Sun, Tao

    2014-08-01

    In generating non-rotationally symmetric microstructured surfaces by turning with a Fast Tool Servo (FTS), non-uniform distribution of the interpolation data points leads to long processing cycles and poor surface quality. To improve this situation, a nearly arc-length tool path generation algorithm is proposed, which generates tool tip trajectory points at nearly equal arc lengths instead of following the traditional equal-angle interpolation rule, and which adds tool radius compensation. All the interpolation points are equidistant in the radial direction because of the constant feed speed of the X slider; the high-frequency tool radius compensation components lie in both the X and Z directions, which makes the X slider difficult to drive according to the input commands due to its large mass. The Newton iterative method is used to calculate the coordinate of the neighboring contour tangent point, with the interpolation point's X position as the initial value; in this way the new Z coordinate is obtained and the high-frequency motion component in the X direction is decomposed into the Z direction. Taking as a test case a typical microstructure with a 4 μm PV value, composed of two 70 μm wavelength sine waves, the maximum profile error at an angle of fifteen degrees is less than 0.01 μm when turning with a diamond tool of large (80 μm) radius. A sinusoidal grid was machined successfully on an ultra-precision lathe; the wavelength is 70.2278 μm and the Ra value is 22.81 nm, evaluated from data points generated by filtering out the first five harmonics.
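The tangent-point search described above can be sketched with Newton's method on a single-harmonic profile. The parameters and function names below are illustrative (one sine wave of reduced amplitude standing in for the paper's two-sine test surface; units in mm), not the authors' implementation:

```python
import math

# Illustrative profile: one 70 um wavelength sine wave, 1 um amplitude.
A, LAM, R_TOOL = 1e-3, 70e-3, 80e-3   # amplitude, wavelength, tool radius (mm)

def f(x):   return A * math.sin(2 * math.pi * x / LAM)
def fp(x):  return A * (2 * math.pi / LAM) * math.cos(2 * math.pi * x / LAM)
def fpp(x): return -A * (2 * math.pi / LAM) ** 2 * math.sin(2 * math.pi * x / LAM)

def tangent_point(X, tol=1e-12, max_iter=50):
    """Newton iteration for the contour point x whose tool-radius offset
    lands at slider position X; X itself is the initial value, as in the
    algorithm described above."""
    x = X
    for _ in range(max_iter):
        s = math.sqrt(1.0 + fp(x) ** 2)
        g = x - R_TOOL * fp(x) / s - X         # residual of the tangency condition
        dg = 1.0 - R_TOOL * fpp(x) / s ** 3    # its derivative in x
        step = g / dg
        x -= step
        if abs(step) < tol:
            break
    return x

def compensated_z(X):
    """Z command at slider position X: contour height plus the Z component
    of the radial offset, so the high-frequency X component moves into Z."""
    x = tangent_point(X)
    return f(x) + R_TOOL / math.sqrt(1.0 + fp(x) ** 2) - R_TOOL
```

By construction, the tool center sits at distance `R_TOOL` from the contact point along the contour normal, which makes the compensation easy to verify numerically.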

  5. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    NASA Astrophysics Data System (ADS)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action - to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how can the teaching of climate change be done in such a way as to ascribe agency - a willingness to act - to students? Framing - as both a theory and an analytic method - has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it is possibly problematic if it were the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse - global scale, scientific statistics and facts, and impact on the Earth's systems - are not likely to inspire action-taking. 
This

  6. Typology of Analytical Errors in Qualitative Educational Research: An Analysis of the 2003-2007 Education Science Dissertations in Turkey

    ERIC Educational Resources Information Center

    Karadag, Engin

    2010-01-01

    In this research, an attempt was made to identify the quality of the qualitative research designs used and the analytical mistakes made in doctoral dissertations carried out in the field of education science in Turkey. A case study design was applied, using qualitative research techniques. The universe…

  7. Improving Students' Understanding of Quantum Measurement. II. Development of Research-Based Learning Tools

    ERIC Educational Resources Information Center

    Zhu, Guangtian; Singh, Chandralekha

    2012-01-01

    We describe the development and implementation of research-based learning tools such as the Quantum Interactive Learning Tutorials and peer-instruction tools to reduce students' common difficulties with issues related to measurement in quantum mechanics. A preliminary evaluation shows that these learning tools are effective in improving students'…

  8. Isotope pattern deconvolution as rising tool for isotope tracer studies in environmental research

    NASA Astrophysics Data System (ADS)

    Irrgeher, Johanna; Zitek, Andreas; Prohaska, Thomas

    2014-05-01

    During the last decade, stable isotope tracers have emerged as a versatile tool in ecological research. Besides 'intrinsic' isotope tracers arising from the natural variation of isotopes, the intentional introduction of 'extrinsic' enriched stable isotope tracers into biological systems has gained significant interest. Here, the induced change in the natural isotopic composition of an element allows, among other things, for studying the fate and fluxes of metals, trace elements and species in organisms, or provides an intrinsic marker or tag for particular biological samples. Owing to the vast potential of this methodology, the number of publications dealing with applications of isotope (double) spikes as tracers to address research questions in 'real world systems' is constantly increasing. However, some isotope systems, like the natural Sr isotopic system, although potentially very powerful for this type of application, are still rarely used, mainly because their adequate measurement/determination poses major analytical challenges, as e.g. Sr is present in significant amounts in natural samples. In addition, biological systems are subject to complex processes such as metabolism, adsorption/desorption or oxidation/reduction. As a consequence, classic evaluation approaches such as the isotope dilution mass spectrometry equation are often not applicable because of the unknown amount of tracer finally present in the sample. Isotope pattern deconvolution (IPD), based on multiple linear regression, serves as a simplified alternative data processing strategy to double spike isotope dilution calculations. The outstanding advantage of this mathematical tool lies in the possibility of deconvolving the isotope pattern in a spiked sample without knowing the quantities of enriched isotope tracer incorporated into the natural sample matrix, as well as the degree of impurities and species interconversion (e.g. from sample preparation). Here, the potential of IPD for environmental tracer
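In its simplest form, IPD writes the measured isotope abundance vector as a linear combination of endmember patterns and solves the overdetermined system by least squares. A minimal sketch with invented four-isotope patterns (not real Sr data):

```python
import numpy as np

def deconvolve(measured, patterns):
    """Isotope pattern deconvolution by least squares: express the measured
    abundance vector as a linear combination of endmember patterns (one
    column per source) and return the fraction contributed by each source."""
    A = np.column_stack(patterns)
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(measured, dtype=float), rcond=None)
    return coeffs / coeffs.sum()   # normalize to molar fractions

# Invented 4-isotope abundance patterns: a natural source and an enriched spike.
natural = np.array([0.10, 0.70, 0.15, 0.05])
spike = np.array([0.01, 0.02, 0.02, 0.95])
mixture = 0.8 * natural + 0.2 * spike
fractions = deconvolve(mixture, [natural, spike])
```

Note that, unlike the isotope dilution equation, nothing here requires knowing the absolute amount of spike in the sample: only the endmember patterns and the measured mixture enter the regression.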

  9. Specially Made for Science: Researchers Develop Online Tools For Collaborations

    ERIC Educational Resources Information Center

    Guterman, Lila

    2008-01-01

    Blogs, wikis, and social-networking sites such as Facebook may get media buzz these days, but for scientists, engineers, and doctors, they are not even on the radar. The most effective tools of the Internet for such people tend to be efforts more narrowly aimed at their needs, such as software that helps geneticists replicate one another's…

  10. Exploiting the Brachypodium Tool Box in cereal and grass research

    Technology Transfer Automated Retrieval System (TEKTRAN)

    It is now a decade since Brachypodium distachyon was suggested as a model species for temperate grasses and cereals. Since then transformation protocols, large expressed sequence tag (EST) populations, tools for forward and reverse genetic screens, highly refined cytogenetic probes, germplasm coll...

  11. Preservice Teachers as Researchers: Using Ethnographic Tools To Interpret Practice.

    ERIC Educational Resources Information Center

    Christensen, Lois McFadyen

    The structures of meaning preservice teachers perceived and interpreted as a result of field placements in a methods course and through the use of ethnographic tools were studied in an ethnographic design. The study involved 11 preservice teachers. It described how they shaped each other's thinking about teaching and it examined how ethnographic…

  12. PARAMO: A Parallel Predictive Modeling Platform for Healthcare Analytic Research using Electronic Health Records

    PubMed Central

    Ng, Kenney; Ghoting, Amol; Steinhubl, Steven R.; Stewart, Walter F.; Malin, Bradley; Sun, Jimeng

    2014-01-01

    Objective Healthcare analytics research increasingly involves the construction of predictive models for disease targets across varying patient cohorts using electronic health records (EHRs). To facilitate this process, it is critical to support a pipeline of tasks: 1) cohort construction, 2) feature construction, 3) cross-validation, 4) feature selection, and 5) classification. To develop an appropriate model, it is necessary to compare and refine models derived from a diversity of cohorts, patient-specific features, and statistical frameworks. The goal of this work is to develop and evaluate a predictive modeling platform that can be used to simplify and expedite this process for health data. Methods To support this goal, we developed a PARAllel predictive MOdeling (PARAMO) platform which 1) constructs a dependency graph of tasks from specifications of predictive modeling pipelines, 2) schedules the tasks in a topological ordering of the graph, and 3) executes those tasks in parallel. We implemented this platform using Map-Reduce to enable independent tasks to run in parallel in a cluster computing environment. Different task scheduling preferences are also supported. Results We assess the performance of PARAMO on various workloads using three datasets derived from the EHR systems in place at Geisinger Health System and Vanderbilt University Medical Center and an anonymous longitudinal claims database. We demonstrate significant gains in computational efficiency against a standard approach. In particular, PARAMO can build 800 different models on a 300,000-patient data set in 3 hours in parallel compared to 9 days if running sequentially. Conclusion This work demonstrates that an efficient parallel predictive modeling platform can be developed for EHR data. This platform can facilitate large-scale modeling endeavors and speed up the research workflow and reuse of health information. This platform is only a first step and provides the foundation for our ultimate
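
The scheduling scheme described above (dependency graph, topological ordering, parallel execution of ready tasks) can be sketched with Python's standard-library `graphlib`. The task names are hypothetical and this is not the actual Map-Reduce implementation; it only shows how independent pipeline stages fall into batches that could run concurrently.

```python
from graphlib import TopologicalSorter

# Hypothetical dependency graph for a predictive modeling pipeline:
# each task maps to the set of tasks it depends on.
pipeline = {
    "cohort": set(),
    "demographics": {"cohort"},
    "labs": {"cohort"},
    "meds": {"cohort"},
    "merge_features": {"demographics", "labs", "meds"},
    "train_model": {"merge_features"},
}

ts = TopologicalSorter(pipeline)
ts.prepare()

# Tasks whose dependencies are all satisfied form a batch that could
# run in parallel (e.g. as independent cluster jobs).
batches = []
while ts.is_active():
    ready = list(ts.get_ready())
    batches.append(ready)
    ts.done(*ready)

print(batches)
```

Here the three feature-construction tasks land in the same batch, which is exactly the kind of independence a parallel platform exploits.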

  13. Tools for Monitoring Social Media: A Marketing Research Project

    ERIC Educational Resources Information Center

    Veeck, Ann; Hoger, Beth

    2014-01-01

    Knowledge of how to effectively monitor social media is an increasingly valued marketing research skill. This study tests an approach for adding social media content to an undergraduate marketing research class team project. The revised project maintains the expected objectives and parameters of a traditional research project, while integrating…

  14. "Mythbusters": A Tool for Teaching Research Methods in Psychology

    ERIC Educational Resources Information Center

    Burkley, Edward; Burkley, Melissa

    2009-01-01

    "Mythbusters" uses multiple research methods to test interesting topics, offering research methods students an entertaining review of course material. To test the effectiveness of "Mythbusters" clips in a psychology research methods course, we systematically selected and showed 4 clips. Students answered questions about the clips, offered their…

  15. Analytical Validation of AmpliChip p53 Research Test for Archival Human Ovarian FFPE Sections.

    PubMed

    Marton, Matthew J; McNamara, Andrew R; Nikoloff, D Michele; Nakao, Aki; Cheng, Jonathan

    2015-01-01

    The p53 tumor suppressor gene (TP53) is reported to be mutated in nearly half of all tumors and plays a central role in genome integrity. Detection of mutations in p53 can be accomplished by many assays, including the AmpliChip p53 Research Test. The AmpliChip p53 Research Test has been successfully used to determine p53 status in hematologic malignancies and fresh-frozen solid tissues, but there are few reports of using the assay with formalin-fixed, paraffin-embedded (FFPE) tissue. The objective of this study was to characterize the analytical performance of the AmpliChip p53 Research Test in detecting p53 mutations in genomic DNA isolated from archival FFPE human ovarian tumor tissues. Method correlation with sequencing showed 96% mutation-wise agreement and 99% chip-wise agreement. We furthermore observed 100% agreement (113/113) for the most prevalent TP53 mutations. Workflow reproducibility was 96.8% across 8 samples, with 2 operators, 2 reagent lots, and 2 instruments. Section-to-section reproducibility was 100% for each sample across a 60 μm region of the FFPE block from ovarian tumors. These data indicate that the AmpliChip p53 Research Test is an accurate and reproducible method for detecting mutations in TP53 from archival FFPE human ovarian specimens. PMID:26125596

  16. Mixed frequency-/time-domain coherent multidimensional spectroscopy: research tool or potential analytical method?

    PubMed

    Pakoulev, Andrei V; Rickard, Mark A; Kornau, Kathryn M; Mathew, Nathan A; Yurs, Lena A; Block, Stephen B; Wright, John C

    2009-09-15

    Coherent multidimensional spectroscopy (CMDS) is now the optical analogue of nuclear magnetic resonance (NMR). Just as NMR heteronuclear multiple-quantum coherence (HMQC) methods rely on multiple quantum coherences, achieving widespread application requires that CMDS also excites multiple quantum coherences over a wide range of quantum state energies. This Account focuses on frequency-domain CMDS because these methods tune the excitation frequencies to resonance with the desired quantum states and can form multiple quantum coherences between states with very different energies. CMDS methods use multiple excitation pulses to excite multiple quantum states within their dephasing time, so their quantum mechanical phase is maintained. Coherences formed from pairs of the excited states emit coherent beams of light. The temporal ordering of the excitation pulses defines a sequence of coherences that can result in zero, single, double, or higher order coherences as required for multiple quantum coherence CMDS. Defining the temporal ordering and the excitation frequencies and spectrally resolving the output frequency also defines a particular temporal pathway for the coherences, just as an NMR pulse sequence defines an NMR method. Two dimensional contour plots through this multidimensional parameter space allow visualization of the state energies and dynamics. This Account uses nickel and rhodium chelates as models for understanding mixed frequency-/time-domain CMDS. Mixed frequency-/time-domain methods use excitation pulse widths that are comparable to the dephasing times, so multidimensional spectra are obtained by scanning the excitation frequencies, while the coherence and population dynamics are obtained by scanning the time delays. Changing the time delays changes the peaks in the 2D excitation spectra depending upon whether the pulse sequence excites zero, single, or double quantum coherences. 
In addition, peaks split as a result of the frequency-domain manifestation of quantum beating. Similarly, changing the excitation and monochromator frequencies changes the dependence on the excitation delay times depending upon whether the frequencies match the resonances involved in the different time-ordered pathways. Contour plots that change a time delay and frequency visualize the temporal changes of specific spectral features. Frequency-domain methods are resonant with specific states, so the sequence of coherences and populations is defined. Coherence transfer, however, can cause output beams at unexpected frequencies. Coherence transfer occurs when the thermal bath induces a coherence between two states (a and g) to evolve to a new coherence (b and g). Since the two coherences have different frequencies and since there are different time orderings for the occurrence of coherence transfer, the delay time dependence develops modulations that depend on the coherences' frequency difference. Higher order coherences can also be generated by raising the excitation intensities. New features appear in the 2D spectra and dynamic Stark splittings occur. These effects will form the basis for the higher order multiple quantum coherence methods and also provide a method for probing molecular potential energy surfaces. PMID:19445479

  17. Research Tool Patents--Rumours of their Death are Greatly Exaggerated

    ERIC Educational Resources Information Center

    Carroll, Peter G.; Roberts, John S.

    2006-01-01

    Using a patented drug during clinical trials is not infringement [35 U.S.C. 271(e)(1)]. Merck v Integra enlarged this "safe harbour" to accommodate preclinical use of drugs and patented "research tools" if "reasonably related" to FDA approval. The decision allowed lower courts, should they wish, to find any use of a research tool, except for…

  18. Big Data analytics and cognitive computing - future opportunities for astronomical research

    NASA Astrophysics Data System (ADS)

    Garrett, M. A.

    2014-10-01

    The days of the lone astronomer with his optical telescope and photographic plates are long gone: Astronomy in 2025 will not only be multi-wavelength, but multi-messenger, and dominated by huge data sets and matching data rates. Catalogues listing detailed properties of billions of objects will in themselves require a new industrial-scale approach to scientific discovery, requiring the latest techniques of advanced data analytics and an early engagement with the first generation of cognitive computing systems. Astronomers have the opportunity to be early adopters of these new technologies and methodologies - the impact can be profound and highly beneficial to effecting rapid progress in the field. Areas such as SETI research might favourably benefit from cognitive intelligence that does not rely on human bias and preconceptions.

  19. Methodological Challenges in Research on Sexual Risk Behavior: I. Item Content, Scaling, and Data Analytical Options

    PubMed Central

    Schroder, Kerstin E. E.; Carey, Michael P.; Vanable, Peter A.

    2008-01-01

    Investigation of sexual behavior involves many challenges, including how to assess sexual behavior and how to analyze the resulting data. Sexual behavior can be assessed using absolute frequency measures (also known as “counts”) or with relative frequency measures (e.g., rating scales ranging from “never” to “always”). We discuss these two assessment approaches in the context of research on HIV risk behavior. We conclude that these two approaches yield non-redundant information and, more importantly, that only data yielding information about the absolute frequency of risk behavior have the potential to serve as valid indicators of HIV contraction risk. However, analyses of count data may be challenging due to non-normal distributions with many outliers. Therefore, we identify new and powerful data analytical solutions that have been developed recently to analyze count data, and discuss limitations of a commonly applied method (viz., ANCOVA using baseline scores as covariates). PMID:14534027
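
A quick illustration of why count data resist ANCOVA-style analysis, as the abstract argues: in a skewed sample with outliers (the numbers below are invented), the variance far exceeds the mean, violating the normality and equal-variance assumptions and motivating dedicated count models such as negative binomial or zero-inflated regression.

```python
import statistics

# Hypothetical counts of risk behaviors reported by 20 participants:
# many zeros plus a few extreme outliers, a shape typical of
# sexual risk behavior data.
counts = [0, 0, 0, 0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 4, 5, 6, 8, 12, 30, 60]

mean = statistics.mean(counts)
var = statistics.pvariance(counts)

# A Poisson model assumes variance ≈ mean; here the variance exceeds
# the mean many times over (overdispersion), so ANCOVA on raw scores
# would be driven almost entirely by the outliers.
print(f"mean={mean:.1f} variance={var:.1f} ratio={var / mean:.1f}")
```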

  20. Automating the Analytical Laboratories Section, Lewis Research Center, National Aeronautics and Space Administration: A feasibility study

    NASA Technical Reports Server (NTRS)

    Boyle, W. G.; Barton, G. W.

    1979-01-01

    The feasibility of computerized automation of the Analytical Laboratories Section at NASA's Lewis Research Center was considered. Since that laboratory's duties are not routine, the automation goals were set with that in mind. Four instruments were selected as the most likely automation candidates: an atomic absorption spectrophotometer, an emission spectrometer, an X-ray fluorescence spectrometer, and an X-ray diffraction unit. Two options for computer automation were described: a time-shared central computer and a system with microcomputers for each instrument connected to a central computer. A third option, presented for future planning, expands the microcomputer version. Costs and benefits for each option were considered. It was concluded that the microcomputer version best fits the goals and duties of the laboratory and that such an automated system is needed to meet the laboratory's future requirements.

  1. EXAMPLES OF THE ROLE OF ANALYTICAL CHEMISTRY IN ENVIRONMENTAL RISK MANAGEMENT RESEARCH

    EPA Science Inventory

    Analytical chemistry is an important tier of environmental protection and has been traditionally linked to compliance and/or exposure monitoring activities for environmental contaminants. The adoption of the risk management paradigm has led to special challenges for analytical ch...

  2. Somatic Sensitivity and Reflexivity as Validity Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Green, Jill

    2015-01-01

    Validity is a key concept in qualitative educational research. Yet, it is often not addressed in methodological writing about dance. This essay explores validity in a postmodern world of diverse approaches to scholarship, by looking at the changing face of validity in educational qualitative research and at how new understandings of the concept…

  3. Recent and Potential Application of Engineering Tools to Educational Research.

    ERIC Educational Resources Information Center

    Taft, Martin I.

    This paper presents a summary of some recent engineering research in education and identifies some research areas with high payoff potential. The underlying assumption is that a school is a system with a set of subsystems which is potentially susceptible to analysis, design, and eventually some sort of optimization. This assumption leads to the…

  4. SMART II : the spot market agent research tool version 2.0.

    SciTech Connect

    North, M. J. N.

    2000-12-14

    Argonne National Laboratory (ANL) has worked closely with Western Area Power Administration (Western) over many years to develop a variety of electric power marketing and transmission system models that are being used for ongoing system planning and operation as well as analytic studies. Western markets and delivers reliable, cost-based electric power from 56 power plants to millions of consumers in 15 states. The Spot Market Agent Research Tool Version 2.0 (SMART II) is an investigative system that partially implements some important components of several existing ANL linear programming models, including some used by Western. SMART II does not implement a complete model of the Western utility system, but it does include several salient features of this network for exploratory purposes. SMART II uses a Swarm agent-based framework. SMART II agents model bulk electric power transaction dynamics with recognition for marginal costs as well as transmission and generation constraints. SMART II uses a sparse graph of nodes and links to model the electric power spot market. The nodes represent power generators and consumers with distinct marginal decision curves and varying investment capital as well as individual learning parameters. The links represent transmission lines with individual capacities taken from a range of central distribution, outlying distribution, and feeder line types. The application of SMART II to electric power systems studies has produced useful results different from those often found using more traditional techniques. Use of the advanced features offered by the Swarm modeling environment simplified the creation of the SMART II model.
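
The marginal-cost transaction dynamics that SMART II models agent by agent can be illustrated, in greatly simplified form, by a merit-order dispatch sketch. The generator names and numbers below are invented; the real model adds transmission constraints, investment capital, and Swarm-based agent learning.

```python
# Toy merit-order dispatch: generators are dispatched in order of
# marginal cost until demand is met, and the last dispatched unit
# sets the uniform clearing price. All figures are illustrative.
offers = [  # (generator, capacity in MW, marginal cost in $/MWh)
    ("hydro", 50, 5.0),
    ("coal", 100, 20.0),
    ("gas", 80, 35.0),
]
demand = 120  # MW

dispatch = {}
remaining = demand
clearing_price = None
for name, capacity, cost in sorted(offers, key=lambda o: o[2]):
    take = min(capacity, remaining)
    if take > 0:
        dispatch[name] = take
        clearing_price = cost  # marginal unit sets the spot price
    remaining -= take

print(dispatch, clearing_price)  # → {'hydro': 50, 'coal': 70} 20.0
```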

  5. Applying Web-Based Tools for Research, Engineering, and Operations

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    2011-01-01

    Personnel in the NASA Glenn Research Center Network and Architectures branch have performed a variety of research related to space-based sensor webs, network centric operations, security and delay tolerant networking (DTN). Quality documentation and communications, real-time monitoring and information dissemination are critical in order to perform quality research while maintaining low cost and utilizing multiple remote systems. This has been accomplished using a variety of Internet technologies often operating simultaneously. This paper describes important features of various technologies and provides a number of real-world examples of how combining Internet technologies can enable a virtual team to act efficiently as one unit to perform advanced research in operational systems. Finally, real and potential abuses of power and manipulation of information and information access is addressed.

  6. Education of research ethics for clinical investigators with Moodle tool

    PubMed Central

    2013-01-01

    Background In clinical research, scientific, legal, and ethical aspects are all important. It is well known that clinical investigators at university hospitals have to undertake their PhD studies alongside their daily work, and reconciling work and study can be challenging. The aim of this project was to create a web-based course in clinical research bioethics (5 credits) and to examine whether the method is suitable for teaching bioethics. The course comprised six modules: an initial examination (to assess knowledge in bioethics), information on research legislation, obtaining permissions from authorities, writing an essay on research ethics, preparing one's own study protocol, and a final exam. All assignments were designed to support students in reflecting on their learning in relation to their own research. Methods 57 PhD students (medical, nursing, and dental sciences) enrolled and 46 completed the course. Course evaluation was done using a questionnaire. The response rate was 78%. Data were analyzed using quantitative methods and qualitative content analysis. Results The course was viewed as useful and technically easy to complete. Students were pleased with the guidance offered. Personal feedback from teachers about students' own performance was seen as advantageous and helped them appreciate how these aspects could be applied to their own studies. The course was also considered valuable for future research projects. Conclusions Ethical issues and the legislation governing clinical research can be understood more easily when students can reflect on the principles in the context of their own research project. A web-based teaching environment is a feasible learning method for clinical investigators. PMID:24330709

  7. Empirical-Analytical Methodological Research in Environmental Education: Response to a Negative Trend in Methodological and Ideological Discussions

    ERIC Educational Resources Information Center

    Connell, Sharon

    2006-01-01

    The purpose of this paper is to contribute to methodological discourse about research approaches to environmental education. More specifically, the paper explores the current status of the "empirical-analytical methodology" and its "positivist" (traditional- and post-positivist) ideologies, in environmental education research through the critical…

  8. Examination of the capability of the laser-induced breakdown spectroscopy (LIBS) technique as the emerging laser-based analytical tool for analyzing trace elements in coal

    NASA Astrophysics Data System (ADS)

    Idris, N.; Ramli, M.; Mahidin; Hedwig, R.; Lie, Z. S.; Kurniawan, K. H.

    2014-09-01

    Owing to its advantages over conventional analytical tools, the laser-induced breakdown spectroscopy (LIBS) technique is emerging as a promising analytical method. The technique is based on the use of optical emission from a laser-induced plasma for spectrochemical analysis of the constituents and content of the sampled object. The capability of the technique is examined here in the analysis of trace elements in coal. Coal is a difficult sample to analyze because of its complex chemical composition and physical properties. Coal inherently contains trace elements, including heavy metals, so its mining, beneficiation, and utilization pose hazards to the environment and to human beings. The LIBS apparatus used was composed of a laser system (Nd-YAG: Quanta Ray; LAB SERIES; 1,064 nm; 500 mJ; 8 ns) and an optical detector (McPherson model 2061; 1,000 mm focal length; f/8.6 Czerny-Turner) equipped with an Andor I*Star intensified CCD of 1024×256 pixels. The laser beam was focused onto the coal sample with a +250 mm focusing lens. The plasma emission was collected by a fiber optic and sent to the spectrograph. The coal samples were taken from the Province of Aceh. Several trace elements, including the heavy metals As, Mn, and Pb, were clearly observed, demonstrating the suitability of the LIBS technique for analyzing trace elements in coal.

  9. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    PubMed

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system for investigating gut-brain communication, for example in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. PMID:27098025
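
As a flavor of the notebook-based analyses described, here is a minimal threshold-crossing spike detector in pure Python. The signal and threshold are invented for illustration; real spike sorting, as in the paper, adds filtering, waveform alignment, clustering, and statistical validation.

```python
# Minimal threshold-crossing spike detector of the kind used as a
# first step in spike sorting; data and threshold are illustrative,
# not taken from the musk shrew recordings.
signal = [0.1, 0.2, 0.1, 1.5, 2.8, 1.2, 0.1, 0.0, 1.9, 2.5, 0.3, 0.1]
threshold = 1.0

# A spike is counted at each upward crossing of the threshold, which
# avoids double-counting samples within the same spike waveform.
spike_times = [
    i for i in range(1, len(signal))
    if signal[i] >= threshold and signal[i - 1] < threshold
]

print(spike_times)  # → [3, 8]
```

In a Jupyter notebook, the detected indices would typically be plotted over the raw trace so the documentation, code, and figure travel together in one sharable file.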

  10. Spatial and Temporal Oxygen Dynamics in Macrofaunal Burrows in Sediments: A Review of Analytical Tools and Observational Evidence

    PubMed Central

    Satoh, Hisashi; Okabe, Satoshi

    2013-01-01

    The availability of benthic O2 plays a crucial role in benthic microbial communities and regulates many important biogeochemical processes. Burrowing activities of macrobenthos in the sediment significantly affect O2 distribution and its spatial and temporal dynamics in burrows, followed by alterations of sediment microbiology. Consequently, numerous research groups have investigated O2 dynamics in macrofaunal burrows. The introduction of powerful tools, such as microsensors and planar optodes, to sediment analysis has greatly enhanced our ability to measure O2 dynamics in burrows at high spatial and temporal resolution with minimal disturbance of the physical structure of the sediment. In this review, we summarize recent studies of O2-concentration measurements in burrows with O2 microsensors and O2 planar optodes. This manuscript mainly focuses on the fundamentals of O2 microsensors and O2 planar optodes, and their application in the direct measurement of the spatial and temporal dynamics of O2 concentrations in burrows, which have not previously been reviewed, and will be a useful supplement to recent literature reviews on O2 dynamics in macrofaunal burrows. PMID:23594972

  11. The airborne infrared scanner as a geophysical research tool

    USGS Publications Warehouse

    Friedman, Jules D.

    1970-01-01

    The infrared scanner is proving to be an effective anomaly-mapping tool, albeit one which depicts surface emission directly and heat mass transfer from depth only indirectly, and at a threshold level 50 to 100 times the normal conductive heat flow of the earth. Moreover, successive terrain observations are affected by time-dependent variables such as the diurnal and seasonal warming and cooling cycle of a point on the earth's surface. In planning precise airborne surveys of radiant flux from the earth's surface, account must be taken of background noise created by variations in micrometeorological factors and the emissivity of surface materials, as well as the diurnal temperature cycle. The effect of the diurnal cycle may be minimized by planning predawn aerial surveys. In fact, the diurnal change is very small for most water bodies, and the emissivity of water is close to unity (e ≈ 1), so minimal background noise is characteristic of scanner records of calm water surfaces.

  12. Intellectual Property: a powerful tool to develop biotech research.

    PubMed

    Giugni, Diego; Giugni, Valter

    2010-09-01

    Today biotechnology is perhaps the most important technology field because of its strong health and food implications. However, owing to the nature of the technology, huge investments are needed to sustain experimentation costs, and investors consequently aim to safeguard their investments as much as possible. Intellectual Property, and in particular patents, has been demonstrated to constitute a powerful tool to help them. Moreover, patents represent an extremely important means of disclosing biotechnology inventions. Patentable biotechnology inventions involve products such as nucleotide and amino acid sequences, microorganisms, processes or methods for modifying such products, uses for the manufacture of medicaments, etc. There are several ways to protect inventions, but all follow the three main patentability requirements: novelty, inventive step, and industrial application. PMID:21255349

  13. Temporal perception in visual processing as a research tool

    PubMed Central

    Zhou, Bin; Zhang, Ting; Mao, Lihua

    2015-01-01

    Accumulated evidence has shown that subjective time in the sub-second range can be altered by different factors; some are related to stimulus features such as luminance contrast and spatial frequency, others to processes like perceptual grouping and contextual modulation. These findings indicate that temporal perception uses neural signals involved in non-temporal feature processing and that perceptual organization plays an important role in shaping the experience of elapsed time. We suggest that the temporal representation of objects can be treated as a feature of objects. This new concept implies that psychological time can serve, like “reaction time (RT),” as a tool to study the principles of neural coding in the perception of objects. Whereas “RT” usually reflects the state of transient signals crossing decision thresholds, “apparent time” in addition reveals the dynamics of sustained signals, thus providing information complementary to what has been obtained from “RT” studies. PMID:25964774

  14. Intellectual Property: a powerful tool to develop biotech research

    PubMed Central

    Giugni, Diego; Giugni, Valter

    2010-01-01

    Summary Today biotechnology is perhaps the most important technology field because of its strong health and food implications. However, owing to the nature of the technology, huge investments are needed to sustain experimentation costs, and investors consequently aim to safeguard their investments as much as possible. Intellectual Property, and in particular patents, has been demonstrated to constitute a powerful tool to help them. Moreover, patents represent an extremely important means of disclosing biotechnology inventions. Patentable biotechnology inventions involve products such as nucleotide and amino acid sequences, microorganisms, processes or methods for modifying such products, uses for the manufacture of medicaments, etc. There are several ways to protect inventions, but all follow the three main patentability requirements: novelty, inventive step, and industrial application. PMID:21255349

  15. Catalogue of space objects and events as a powerful tool for scientific researches on space debris

    NASA Astrophysics Data System (ADS)

    Agapov, V.; Stepanyants, V.; Tuchin, A.; Khutorovsky, Z.

    Extensive work on developing and maintaining the Catalogue of scientific information on space objects and events is continuing at the Keldysh Institute of Applied Mathematics, in cooperation with the Russian company "Space Information Analytical Systems" (KIA Systems). A powerful software tool has been developed, comprising: an informational core (a relational database in an RDBMS Oracle 8i environment) with special tools for automatic initial processing and systematization of data; a software complex for orbital modeling and maintenance of the dynamical catalogue of space objects and events; and special information-analytical software. The informational core covers the wide spectrum of data needed for the following purposes: full-scale, high-quality modeling of object motion in near-Earth space (orbital and measurement data, solar flux and geomagnetic indices, Earth rotation parameters, etc.); determination of the parameters of various events (launches, manoeuvres, fragmentations, etc.); analysis of space debris sources; studying long-term orbital evolution (over several years or tens of years); and others. The database stores a huge volume of data, including: optical measurements; TLEs; information about all space launches since 1957; information about space missions and programs; manoeuvres; fragmentations; launch sequences for typical orbital insertions; various characteristics of orbital objects (payloads, stages, fragments); officially released UN and ITU registration data; and others. By now the informational core holds records for more than 28,000 orbital objects (both catalogued and not), all orbital launch attempts since 04.10.1957 (including failed ones), more than 30 million records of orbital information (TLEs, state vectors, polynomial data), more than 200,000 optical measurements (normal places) for GEO-region objects, calculated data on more than 14 million close approaches that have taken place during the last five years, and other data. The software complex for orbital

  16. Ready Reference Tools: EBSCO Topic Search and SIRS Researcher.

    ERIC Educational Resources Information Center

    Goins, Sharon; Dayment, Lu

    1998-01-01

    Discussion of ready reference and current events collections in high school libraries focuses on a comparison of two CD-ROM services, EBSCO Topic Search and the SIRS Researcher. Considers licensing; access; search strategies; viewing articles; currency; printing; added value features; and advantages of CD-ROMs. (LRW)

  17. The Portable Usability Testing Lab: A Flexible Research Tool.

    ERIC Educational Resources Information Center

    Hale, Michael E.; And Others

    A group of faculty at the University of Georgia obtained funding for a research and development facility called the Learning and Performance Support Laboratory (LPSL). One of the LPSL's primary needs was obtaining a portable usability lab for software testing, so the facility obtained the "Luggage Lab 2000." The lab is transportable to any site…

  18. New research and tools lead to improved earthquake alerting protocols

    USGS Publications Warehouse

    Wald, David J.

    2009-01-01

    What’s the best way to get alerted about the occurrence and potential impact of an earthquake? The answer to that question has changed dramatically of late, in part due to improvements in earthquake science, and in part by the implementation of new research in the delivery of earthquake information

  19. Reimagining Science Education and Pedagogical Tools: Blending Research with Teaching

    ERIC Educational Resources Information Center

    McLaughlin, Jacqueline S.

    2010-01-01

    The future of higher education in the sciences will be marked by programs that link skilled educators and research scientists from around the world with teachers for professional development and with students for high-impact learning--either virtually or physically in the field. These programs will use technology where possible to build new and…

  20. Administrative Data Linkage as a Tool for Child Maltreatment Research

    ERIC Educational Resources Information Center

    Brownell, Marni D.; Jutte, Douglas P.

    2013-01-01

    Linking administrative data records for the same individuals across services and over time offers a powerful, population-wide resource for child maltreatment research that can be used to identify risk and protective factors and to examine outcomes. Multistage de-identification processes have been developed to protect privacy and maintain…

  1. Miniature spinning as a tool for ginning research

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The cotton gin must balance efficient processing and cleaning with adversely affecting the quality of lint through damage and/or failure to remove sufficient material. Substantial research is conducted on all aspects of the cotton gin; however it is difficult to gauge the effect on fiber quality wi...

  2. Friending Adolescents on Social Networking Websites: A Feasible Research Tool

    PubMed Central

    Brockman, Libby N.; Christakis, Dimitri A.; Moreno, Megan A.

    2014-01-01

    Objective Social networking sites (SNSs) are increasingly used for research. This paper reports on two studies examining the feasibility of friending adolescents on SNSs for research purposes. Methods Study 1 took place on www.MySpace.com where public profiles belonging to 18-year-old adolescents received a friend request from an unknown physician. Study 2 took place on www.Facebook.com where college freshmen from two US universities, enrolled in an ongoing research study, received a friend request from a known researcher’s profile. Acceptance and retention rates of friend requests were calculated for both studies. Results Study 1: 127 participants received a friend request; participants were 18 years old, 62.2% male and 51.8% Caucasian. 49.6% accepted the friend request. After 9 months, 76% maintained the online friendship, 12.7% defriended the study profile and 11% deactivated their profile. Study 2: 338 participants received a friend request; participants were 18 years old, 56.5% female and 75.1% Caucasian. 99.7% accepted the friend request. Over 12 months, 3.3% defriended the study profile and 4.1% deactivated their profile. These actions were often temporary; the overall 12-month friendship retention rate was 96.1%. Conclusion Friending adolescents on SNSs is feasible and friending adolescents from a familiar profile may be more effective for maintaining online friendship with research participants over time. PMID:25485226

  3. Online Tools Allow Distant Students to Collaborate on Research Projects

    ERIC Educational Resources Information Center

    T.H.E. Journal, 2005

    2005-01-01

    The Wesleyan Academy and Moravian School in St. Thomas, Virgin Islands, recently joined forces with Evergreen Elementary in Fort Lewis, Wash., to collaborate on a research project using My eCoach Online (http://myecoach.com) as the primary medium to share information, post ideas and findings, and develop inquiry projects on 10 topics about water.…

  4. ``Tools for Astrometry": A Windows-based Research Tool for Asteroid Discovery and Measurement

    NASA Astrophysics Data System (ADS)

    Snyder, G. A.; Marschall, L. A.; Good, R. F.; Hayden, M. B.; Cooper, P. R.

    1998-12-01

    We have developed a Windows-based interactive digital astrometry package with a simple, ergonomic interface, designed for the discovery, measurement, and recording of asteroid positions by individual observers. The software, "Tools For Astrometry", will handle FITS and SBIG format images up to 2048 x 2048 (or larger, depending on RAM), and provides features for blinking images or subframes of images, and for measurement of positions and magnitudes against both the HST Guide Star Catalog and the USNO SA-1 catalog. In addition, the program can calculate ephemerides from element tables, including the Lowell Asteroid Database available online; can generate charts of star fields showing the motion of asteroids from the ephemeris superimposed against the background star field; can project the motions of measured asteroids ahead several days using linear interpolation for purposes of reacquisition; and can calculate projected baselines for asteroid parallax measurements. Images, charts, and tables of ephemerides can be printed as well as displayed, and reports can be generated in the standard format of the IAU Minor Planet Center. The software is designed ergonomically, and one can go from raw images to a completed astrometric report in a matter of minutes. The software is an extension of software developed for introductory astronomy laboratories by Project CLEA, which is supported by grants from Gettysburg College and the National Science Foundation.
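
    The reacquisition projection mentioned above amounts to linearly extrapolating an asteroid's apparent motion from two measured positions. A minimal sketch of the idea (the function name and the numbers are illustrative, not Project CLEA code):

```python
def project_position(t1, ra1, dec1, t2, ra2, dec2, t):
    """Linearly extrapolate RA/Dec (degrees) measured at times t1 and t2
    (days) to a later time t, assuming uniform apparent motion."""
    f = (t - t1) / (t2 - t1)
    return ra1 + f * (ra2 - ra1), dec1 + f * (dec2 - dec1)

# An asteroid moving +0.5 deg/day in RA and -0.1 deg/day in Dec between
# two nights, projected 3 days past the first measurement:
ra, dec = project_position(0.0, 150.0, 10.0, 1.0, 150.5, 9.9, 3.0)
print(ra, dec)  # 151.5 9.7 (approximately)
```

    Over a few days this straight-line approximation is usually adequate to point the telescope for reacquisition; over longer arcs a proper ephemeris from orbital elements is needed.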

  5. Developing a Research Tool to Gauge Student Metacognition

    NASA Astrophysics Data System (ADS)

    McInerny, Alistair; Boudreaux, Andrew; Rishal, Sepideh; Clare, Kelci

    2012-10-01

    Metacognition refers to the family of thought processes and skills used to evaluate and manage learning. A research and curriculum development project underway at Western Washington University uses introductory physics labs as a context to promote students' abilities to learn and apply metacognitive skills. A required ``narrative reflection'' has been incorporated as a weekly end-of-lab assignment. The goal of the narrative reflection is to encourage and support student metacognition while generating written artifacts that can be used by researchers to study metacognition in action. We have developed a Reflective Thinking Rubric (RTR) to analyze scanned narrative reflections. The RTR codes student writing for Metacognitive Elements, identifiable steps or aspects of metacognitive thinking at a variety of levels of sophistication. We hope to use the RTR to monitor the effect of weekly reflection on metacognitive ability and to search for correlations between metacognitive ability and conceptual understanding.

  6. CAMS as a tool for human factors research in spaceflight

    NASA Astrophysics Data System (ADS)

    Sauer, Juergen

    2004-01-01

    The paper reviews a number of research studies that were carried out with a PC-based task environment called Cabin Air Management System (CAMS) simulating the operation of a spacecraft's life support system. As CAMS was a multiple task environment, it allowed the measurement of performance at different levels. Four task components of different priority were embedded in the task environment: diagnosis and repair of system faults, maintaining atmospheric parameters in a safe state, acknowledgement of system alarms (reaction time), and keeping a record of critical system resources (prospective memory). Furthermore, the task environment permitted the examination of different task management strategies and changes in crew member state (fatigue, anxiety, mental effort). A major goal of the research programme was to examine how crew members adapted to various forms of sub-optimal working conditions, such as isolation and confinement, sleep deprivation and noise. None of the studies provided evidence for decrements in primary task performance. However, the results showed a number of adaptive responses of crew members to adjust to the different sub-optimal working conditions. There was evidence for adjustments in information sampling strategies (usually reductions in sampling frequency) as a result of unfavourable working conditions. The results also showed selected decrements in secondary task performance. Prospective memory seemed to be somewhat more vulnerable to sub-optimal working conditions than performance on the reaction time task. Finally, suggestions are made for future research with the CAMS environment.

  7. Modelling as an indispensible research tool in the information society.

    NASA Astrophysics Data System (ADS)

    Bouma, Johan

    2016-04-01

    Science and society would be well advised to develop a different relationship as the information revolution penetrates all aspects of modern life. Rather than producing clear answers to clear questions in a top-down manner, land-use issues related to the UN Sustainable Development Goals (SDGs) present "wicked" problems involving different, strongly opinionated stakeholders with conflicting ideas and interests, and risk-averse politicians. The Dutch government has invited its citizens to develop a "science agenda" defining future research needs, implicitly suggesting that the research community is unable to do so. Time, therefore, for a pro-active approach to more convincingly define our "societal license to research". For soil science this could imply a focus on the SDGs, considering soils as living, characteristically different, dynamic bodies in a landscape, to be mapped in ways that allow generation of suitable modelling data. Models allow a dynamic characterization of water and nutrient regimes and plant growth in soils for both actual and future conditions, reflecting e.g. effects of climate or land-use change or alternative management practices. Engaging modern stakeholders in a bottom-up manner implies continuous involvement and "joint learning" from project initiation to completion, where modelling results act as building blocks to explore alternative scenarios. Modern techniques allow very rapid calculations and innovative visualization. Everything is possible, but only modelling can articulate the economic, social and environmental consequences of each scenario, demonstrating in a pro-active manner the crucial and indispensable role of research. But choices are to be made by stakeholders and reluctant policy makers, and certainly not by scientists, who should carefully guard their independence. Only clear results in the end are convincing proof of the impact of science, requiring therefore continued involvement of scientists up to the very end of projects. To

  8. Incorporating ethical principles into clinical research protocols: a tool for protocol writers and ethics committees

    PubMed Central

    Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E

    2016-01-01

    A novel Protocol Ethics Tool Kit (‘Ethics Tool Kit’) has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. PMID:26811365

  9. Incorporating ethical principles into clinical research protocols: a tool for protocol writers and ethics committees.

    PubMed

    Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E

    2016-04-01

    A novel Protocol Ethics Tool Kit ('Ethics Tool Kit') has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. PMID:26811365

  10. Electromagnetic Levitation: A Useful Tool in Microgravity Research

    NASA Technical Reports Server (NTRS)

    Szekely, Julian; Schwartz, Elliot; Hyers, Robert

    1995-01-01

    Electromagnetic levitation is one area of the electromagnetic processing of materials that has uses for both fundamental research and practical applications. This technique was successfully used on the Space Shuttle Columbia during the Spacelab IML-2 mission in July 1994 as a platform for accurately measuring the surface tensions of liquid metals and alloys. In this article, we discuss the key transport phenomena associated with electromagnetic levitation, the fundamental relationships associated with thermophysical property measurement that can be made using this technique, reasons for working in microgravity, and some of the results obtained from the microgravity experiments.

  11. NASA Global Hawk: A New Tool for Earth Science Research

    NASA Technical Reports Server (NTRS)

    Hall, Phill

    2009-01-01

    This slide presentation reviews the Global Hawk, a unmanned aerial vehicle (UAV) that NASA plans to use for Earth Sciences research. The Global Hawk is the world's first fully autonomous high-altitude, long-endurance aircraft, and is capable of conducting long duration missions. Plans are being made for the use of the aircraft on missions in the Arctic, Pacific and Western Atlantic Oceans. There are slides showing the Global Hawk Operations Center (GHOC), Flight Control and Air Traffic Control Communications Architecture, and Payload Integration and Accommodations on the Global Hawk. The first science campaign, planned for a study of the Pacific Ocean, is reviewed.

  12. The NASA Human Research Wiki - An Online Collaboration Tool

    NASA Technical Reports Server (NTRS)

    Barr, Yael; Rasbury, Jack; Johnson, Jordan; Barstend, Kristina; Saile, Lynn; Watkins, Sharmi

    2012-01-01

    The Exploration Medical Capability (ExMC) element is one of six elements of the Human Research Program (HRP). ExMC is charged with decreasing the risk of the "inability to adequately recognize or treat an ill or injured crew member" for exploration-class missions. In preparation for such missions, ExMC has compiled a large evidence base, previously available only to persons within the NASA community, and has developed the "NASA Human Research Wiki" in an effort to make the ExMC information available to the general public and increase collaboration within and outside of NASA. The ExMC evidence base comprises several types of data, including: (1) information on more than 80 medical conditions that could occur during space flight, derived from several sources and including data on incidence and potential outcomes, as captured in the Integrated Medical Model's (IMM) Clinical Finding Forms (CliFFs); and (2) approximately 25 gap reports, which identify any "gaps" in knowledge and/or technology that would need to be addressed in order to provide adequate medical support for these novel missions.

  13. Positioning Mentoring as a Coach Development Tool: Recommendations for Future Practice and Research

    ERIC Educational Resources Information Center

    McQuade, Sarah; Davis, Louise; Nash, Christine

    2015-01-01

    Current thinking in coach education advocates mentoring as a development tool to connect theory and practice. However, little empirical evidence exists to evaluate the effectiveness of mentoring as a coach development tool. Business, education, and nursing precede the coaching industry in their mentoring practice, and research findings offered in…

  14. Searching for New Directions: Developing MA Action Research Project as a Tool for Teaching

    ERIC Educational Resources Information Center

    Lee, Young Ah; Wang, Ye

    2012-01-01

    Action research has been recognized as a useful professional development tool for teaching, but for inservice teachers, conducting action research can be challenging. Their learning about action research can be influenced by social situations--whether in an MA (Master of Arts) program or other professional development. The purpose of this…

  15. Guiding Independence: Developing a Research Tool to Support Student Decision Making in Selecting Online Information Sources

    ERIC Educational Resources Information Center

    Baildon, Rindi; Baildon, Mark

    2008-01-01

    The development and use of a research tool to guide fourth-grade students' use of information sources during a research project is described in this article. Over a period of five weeks, 21 fourth-grade students in an international school in Singapore participated in a study investigating the extent to which the use of a "research resource guide"…

  16. Conceptualising the Use of Facebook in Ethnographic Research: As Tool, as Data and as Context

    ERIC Educational Resources Information Center

    Baker, Sally

    2013-01-01

    This article proposes a three-part conceptualisation of the use of Facebook in ethnographic research: as a tool, as data and as context. Longitudinal research with young adults at a time of significant change provides many challenges for the ethnographic researcher, such as maintaining channels of communication and high rates of participant…

  17. Fuzzy Analytic Hierarchy Process-based Chinese Resident Best Fitness Behavior Method Research

    PubMed Central

    Wang, Dapeng; Zhang, Lan

    2015-01-01

    With the explosive development of the Chinese economy, science, and technology, people's pursuit of health has become more and more intense, and Chinese residents' sports fitness activities have developed rapidly. However, fitness activities differ in popularity and in their effects on the body's energy consumption. On this basis, the paper studies fitness behaviors and derives an exercise guide to Chinese residents' sports fitness behaviors, which provides guidance for implementing the national fitness plan and making Chinese residents' fitness more scientific. Starting from the perspective of energy consumption, the paper mainly adopts an empirical method: it determines the energy consumption of Chinese residents' favorite fitness activities by observing the energy consumption of various fitness behaviors, and applies the fuzzy analytic hierarchy process to evaluate seven fitness activities: bicycle riding, shadowboxing, swimming, rope skipping, jogging, running, and aerobics. By calculating the memberships of the fuzzy rating model and comparing their sizes, it identifies the fitness behaviors that are more helpful to residents' health, more effective, and more popular. It concludes that swimming is the best exercise mode, with the highest membership; the memberships of running, rope skipping, and shadowboxing are also relatively high. Residents should combine several of these fitness activities according to their physical and living conditions, the better to achieve the purpose of fitness exercise. PMID:26981163

  18. Fuzzy Analytic Hierarchy Process-based Chinese Resident Best Fitness Behavior Method Research.

    PubMed

    Wang, Dapeng; Zhang, Lan

    2015-01-01

    With the explosive development of the Chinese economy, science, and technology, people's pursuit of health has become more and more intense, and Chinese residents' sports fitness activities have developed rapidly. However, fitness activities differ in popularity and in their effects on the body's energy consumption. On this basis, the paper studies fitness behaviors and derives an exercise guide to Chinese residents' sports fitness behaviors, which provides guidance for implementing the national fitness plan and making Chinese residents' fitness more scientific. Starting from the perspective of energy consumption, the paper mainly adopts an empirical method: it determines the energy consumption of Chinese residents' favorite fitness activities by observing the energy consumption of various fitness behaviors, and applies the fuzzy analytic hierarchy process to evaluate seven fitness activities: bicycle riding, shadowboxing, swimming, rope skipping, jogging, running, and aerobics. By calculating the memberships of the fuzzy rating model and comparing their sizes, it identifies the fitness behaviors that are more helpful to residents' health, more effective, and more popular. It concludes that swimming is the best exercise mode, with the highest membership; the memberships of running, rope skipping, and shadowboxing are also relatively high. Residents should combine several of these fitness activities according to their physical and living conditions, the better to achieve the purpose of fitness exercise. PMID:26981163
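
    The membership comparison described in the two records above can be illustrated with a toy weighted-aggregation model. This is a hypothetical, crisp stand-in for the paper's fuzzy analytic hierarchy process; the criteria, weights, and scores below are invented for illustration and are not the paper's data.

```python
# Hypothetical ranking of fitness activities by an aggregate membership
# score. All criteria, weights, and scores are invented for illustration.
weights = {"energy_use": 0.5, "popularity": 0.3, "accessibility": 0.2}
scores = {
    "swimming":      {"energy_use": 0.95, "popularity": 0.85, "accessibility": 0.7},
    "running":       {"energy_use": 0.80, "popularity": 0.90, "accessibility": 0.9},
    "rope_skipping": {"energy_use": 0.70, "popularity": 0.60, "accessibility": 0.8},
}

def membership(activity):
    # Weighted sum of criterion scores: a crisp stand-in for the fuzzy
    # aggregation and membership comparison used in the paper.
    return sum(weights[c] * scores[activity][c] for c in weights)

ranked = sorted(scores, key=membership, reverse=True)
print(ranked[0])  # swimming
```

    A full fuzzy AHP would instead derive the weights from pairwise comparisons expressed as fuzzy numbers and defuzzify before ranking; the weighted sum above only conveys the ranking step.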

  19. Experimental and Analytical Research on Resonance Phenomena of Vibrating Head with MRE Regulating Element

    NASA Astrophysics Data System (ADS)

    Miedzińska, D.; Gieleta, R.; Osiński, J.

    2015-02-01

    A vibratory pile hammer (VPH) is a mechanical device used to drive steel piles as well as tube piles into soil to provide foundation support for buildings or other structures. In order to increase the stability and the efficiency of the VPH work in the over-resonance frequency, a new VPH construction was developed at the Military University of Technology. The new VPH contains a system of counter-rotating eccentric weights, powered by hydraulic motors, and designed in such a way that horizontal vibrations cancel out, while vertical vibrations are transmitted into the pile. This system is suspended in the static parts by the adaptive variable stiffness pillows based on a smart material, magnetorheological elastomer (MRE), whose rheological and mechanical properties can be reversibly and rapidly controlled by an external magnetic field. The work presented in the paper is a part of the modified VPH construction design process. It concerns the experimental research on the vibrations during the piling process and the analytical analyses of the gained signal. The results will be applied in the VPH control system.

  20. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....120 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What analytical...

  1. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ....120 Public Contracts and Property Management Federal Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What analytical...

  2. Industrial Analytical Chemistry: The Eyes, Ears, and Handmaiden to Research and Development.

    ERIC Educational Resources Information Center

    Thorpe, Thomas M.

    1986-01-01

    Addresses these three questions: (1) What are the roles of analytical chemists in industry? (2) What training is needed to fill assignments which make up these roles? and (3) What are some of the major challenges facing analytical chemists during the next five years? Includes information about a workshop for students. (JN)

  3. Nucleic Acid Aptamers: Research Tools in Disease Diagnostics and Therapeutics

    PubMed Central

    Yadava, Pramod K.

    2014-01-01

    Aptamers are short sequences of nucleic acid (DNA or RNA) or peptide molecules which adopt a conformation and bind cognate ligands with high affinity and specificity in a manner akin to antibody-antigen interactions. It has been globally acknowledged that aptamers promise a plethora of diagnostic and therapeutic applications. Although use of nucleic acid aptamers as targeted therapeutics or mediators of targeted drug delivery is a relatively new avenue of research, one aptamer-based drug “Macugen” is FDA approved and a series of aptamer-based drugs are in clinical pipelines. The present review discusses the aspects of design, unique properties, applications, and development of different aptamers to aid in cancer diagnosis, prevention, and/or treatment under defined conditions. PMID:25050359

  4. Electrostatic Levitation: A Tool to Support Materials Research in Microgravity

    NASA Technical Reports Server (NTRS)

    Rogers, Jan; SanSoucie, Mike

    2012-01-01

    Containerless processing represents an important topic for materials research in microgravity. Levitated specimens are free from contact with a container, which permits studies of deeply undercooled melts, and high-temperature, highly reactive materials. Containerless processing provides data for studies of thermophysical properties, phase equilibria, metastable state formation, microstructure formation, undercooling, and nucleation. The European Space Agency (ESA) and the German Aerospace Center (DLR) jointly developed an electromagnetic levitator facility (MSL-EML) for containerless materials processing in space. The electrostatic levitator (ESL) facility at the Marshall Space Flight Center provides support for the development of containerless processing studies for the ISS. Apparatus and techniques have been developed to use the ESL to provide data for phase diagram determination, creep resistance, emissivity, specific heat, density/thermal expansion, viscosity, surface tension and triggered nucleation of melts. The capabilities and results from selected ESL-based characterization studies performed at NASA's Marshall Space Flight Center will be presented.

  5. The Spallation Neutron Source: A powerful tool for materials research

    SciTech Connect

    Mason, Thom; Anderson, Ian S; Ankner, John Francis; Egami, Takeshi; Ekkebus, Allen E; Herwig, Kenneth W; Hodges, Jason P; Horak, Charlie M; Horton, Linda L; Klose, Frank Richard; Mesecar, Andrew D.; Myles, Dean A A; Ohl, M.; Zhao, Jinkui

    2006-01-01

    When completed in 2006, the Spallation Neutron Source (SNS) will use an accelerator to produce the most intense beams of pulsed neutrons in the world. This unique facility is being built by a collaboration of six US Department of Energy laboratories and will serve a diverse community of users drawn from academia, industry, and government labs. The project continues on schedule and within budget, with commissioning and installation of all systems going well. Installation of 14 state-of-the-art instruments is under way, and design work is being completed for several others. These new instruments will enable inelastic and elastic-scattering measurements across a broad range of science such as condensed-matter physics, chemistry, engineering materials, biology, and beyond. Neutron Science at SNS will be complemented by research opportunities at several other facilities under way at Oak Ridge National Laboratory.

  6. Cell stretching devices as research tools: engineering and biological considerations.

    PubMed

    Kamble, Harshad; Barton, Matthew J; Jun, Myeongjun; Park, Sungsu; Nguyen, Nam-Trung

    2016-08-16

    Cells within the human body are subjected to continuous, cyclic mechanical strain caused by various organ functions, movement, and growth. Cells are well known to have the ability to sense and respond to mechanical stimuli. This process is referred to as mechanotransduction. A better understanding of mechanotransduction is of great interest to clinicians and scientists alike to improve clinical diagnosis and understanding of medical pathology. However, the complexity involved in in vivo biological systems creates a need for better in vitro technologies, which can closely mimic the cells' microenvironment using induced mechanical strain. This technology gap motivates the development of cell stretching devices for better understanding of the cell response to mechanical stimuli. This review focuses on the engineering and biological considerations for the development of such cell stretching devices. The paper discusses different types of stretching concepts, major design consideration and biological aspects of cell stretching and provides a perspective for future development in this research area. PMID:27440436

  7. Conceptual Systems Model as a Tool for Hypothesis Generation and Testing in Ecotoxicological Research

    EPA Science Inventory

    Microarray, proteomic, and metabonomic technologies are becoming increasingly accessible as tools for ecotoxicology research. Effective use of these technologies will depend, at least in part, on the ability to apply these techniques within a paradigm of hypothesis driven researc...

  8. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  9. ANALYTICAL TOOL INTERFACE FOR LANDSCAPE ASSESSMENTS (ATIILA): AN ARCVIEW EXTENSION FOR THE ANALYSIS OF LANDSCAPE PATTERNS, COMPOSITION, AND STRUCTURE

    EPA Science Inventory

    Environmental management practices are trending away from simple, local- scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to impleme...

  10. Advanced imaging microscope tools applied to microgravity research investigations

    NASA Astrophysics Data System (ADS)

    Peterson, L.; Samson, J.; Conrad, D.; Clark, K.

    1998-01-01

    The inability to observe and interact with experiments on orbit has been an impediment for both basic research and commercial ventures using the shuttle. In order to open the frontiers of space, the Center for Microgravity Automation Technology has developed a unique and innovative system for conducting experiments at a distance, the ``Remote Scientist.'' The Remote Scientist extends laboratory automation capability to the microgravity environment. While the Remote Scientist conceptually encompasses a broad spectrum of elements and functionalities, the development approach taken is to:

    • establish a baseline capability that is both flexible and versatile
    • incrementally augment the baseline with additional functions over time.

    Since last year, the application of the Remote Scientist has changed from protein crystal growth to tissue culture, specifically, the development of skeletal muscle under varying levels of tension. This system includes a series of bioreactor chambers that allow for three-dimensional growth of muscle tissue on a membrane suspended between the two ends of a programmable force transducer that can provide automated or investigator-initiated tension on the developing tissue. A microscope objective mounted on a translation carriage allows for high-resolution microscopy along a large area of the tissue. These images will be mosaicked on orbit to detect features and structures that span multiple images. The use of fluorescence and pseudo-confocal microscopy will maximize the observational capabilities of this system. A series of ground-based experiments have been performed to validate the bioreactor, the force transducer, the translation carriage and the image acquisition capabilities of the Remote Scientist:

    • The bioreactor is capable of sustaining three-dimensional tissue culture growth over time.
    • The force transducer can be programmed to provide static tension on cells or to simulate either slow or fast growth of underlying tissues in

  11. Citizen Science as a New Tool in Dog Cognition Research.

    PubMed

    Stewart, Laughlin; MacLean, Evan L; Ivy, David; Woods, Vanessa; Cohen, Eliot; Rodriguez, Kerri; McIntyre, Matthew; Mukherjee, Sayan; Call, Josep; Kaminski, Juliane; Miklósi, Ádám; Wrangham, Richard W; Hare, Brian

    2015-01-01

    Family dogs and dog owners offer a potentially powerful way to conduct citizen science to answer questions about animal behavior that are difficult to answer with more conventional approaches. Here we evaluate the quality of the first data on dog cognition collected by citizen scientists using the Dognition.com website. We conducted analyses to understand if data generated by over 500 citizen scientists replicates internally and in comparison to previously published findings. Half of participants participated for free while the other half paid for access. The website provided each participant a temperament questionnaire and instructions on how to conduct a series of ten cognitive tests. Participation required internet access, a dog and some common household items. Participants could record their responses on any PC, tablet or smartphone from anywhere in the world and data were retained on servers. Results from citizen scientists and their dogs replicated a number of previously described phenomena from conventional lab-based research. There was little evidence that citizen scientists manipulated their results. To illustrate the potential uses of relatively large samples of citizen science data, we then used factor analysis to examine individual differences across the cognitive tasks. The data were best explained by multiple factors in support of the hypothesis that nonhumans, including dogs, can evolve multiple cognitive domains that vary independently. This analysis suggests that in the future, citizen scientists will generate useful datasets that test hypotheses and answer questions as a complement to conventional laboratory techniques used to study dog psychology. PMID:26376443

  12. Citizen Science as a New Tool in Dog Cognition Research

    PubMed Central

    Stewart, Laughlin; MacLean, Evan L.; Ivy, David; Woods, Vanessa; Cohen, Eliot; Rodriguez, Kerri; McIntyre, Matthew; Mukherjee, Sayan; Call, Josep; Kaminski, Juliane; Miklósi, Ádám; Wrangham, Richard W.; Hare, Brian

    2015-01-01

    Family dogs and dog owners offer a potentially powerful way to conduct citizen science to answer questions about animal behavior that are difficult to answer with more conventional approaches. Here we evaluate the quality of the first data on dog cognition collected by citizen scientists using the Dognition.com website. We conducted analyses to understand if data generated by over 500 citizen scientists replicates internally and in comparison to previously published findings. Half of participants participated for free while the other half paid for access. The website provided each participant a temperament questionnaire and instructions on how to conduct a series of ten cognitive tests. Participation required internet access, a dog and some common household items. Participants could record their responses on any PC, tablet or smartphone from anywhere in the world and data were retained on servers. Results from citizen scientists and their dogs replicated a number of previously described phenomena from conventional lab-based research. There was little evidence that citizen scientists manipulated their results. To illustrate the potential uses of relatively large samples of citizen science data, we then used factor analysis to examine individual differences across the cognitive tasks. The data were best explained by multiple factors in support of the hypothesis that nonhumans, including dogs, can evolve multiple cognitive domains that vary independently. This analysis suggests that in the future, citizen scientists will generate useful datasets that test hypotheses and answer questions as a complement to conventional laboratory techniques used to study dog psychology. PMID:26376443
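
    The factor-analytic step described in the two records above can be sketched in a few lines. The example below is purely illustrative: it uses synthetic scores for four invented tasks (not Dognition data) and scikit-learn's `FactorAnalysis`, showing how a two-factor structure of independently varying cognitive domains can be recovered from multi-task data.

```python
# Illustrative only: synthetic scores for four invented tasks, not
# Dognition data. Two latent traits each drive a pair of tasks.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_dogs = 500

social = rng.normal(size=n_dogs)   # hypothetical "social" trait
memory = rng.normal(size=n_dogs)   # hypothetical "memory" trait
scores = np.column_stack([
    social + rng.normal(scale=0.5, size=n_dogs),  # e.g. pointing task
    social + rng.normal(scale=0.5, size=n_dogs),  # e.g. gaze following
    memory + rng.normal(scale=0.5, size=n_dogs),  # e.g. delayed search
    memory + rng.normal(scale=0.5, size=n_dogs),  # e.g. object permanence
])

fa = FactorAnalysis(n_components=2, random_state=0).fit(scores)
loadings = fa.components_   # rows: factors, columns: tasks
print(loadings.shape)       # → (2, 4)
```

    With a large citizen-science sample like this, inspecting which tasks load on which factor is what supports (or undermines) a multiple-domain hypothesis.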

  13. Microwave transmission, a new tool in forest hydrological research

    NASA Astrophysics Data System (ADS)

    Bouten, W.; Swart, P. J. F.; De Water, E.

    1991-04-01

    After several decades of interception studies, there are still considerable gaps in the understanding of wet-canopy evaporation. Model development is being obstructed by the lack of techniques for the measurement of state and rate variables which have to be quantified for model validation. The applicability of microwave attenuation measurements for the determination of canopy wetness is examined. The attenuation caused by a single spruce fir in the laboratory and the vertical attenuation profiles of a Douglas fir stand were measured under dry and wet conditions. The results indicate an instant increase of the attenuation upon wetting and a decrease owing to drip and evaporation after rainfall ceased. From the results, conclusions have been drawn on the design of instrumentation for an optimized measuring system which is suitable for unattended automated scanning of canopy water storage. This system has been calibrated, using vertically integrated microwave attenuation profiles and canopy water budgets from precipitation and throughfall measurements. This system will be used for a forest hydrological study in the framework of the Dutch ACIFORN project, a research project on the effect of atmospheric deposition on Douglas fir vitality.

  14. A Review of Knowledge Gaps and Tools for Orbivirus Research.

    PubMed

    Drolet, Barbara S; van Rijn, Piet; Howerth, Elizabeth W; Beer, Martin; Mertens, Peter P

    2015-06-01

    Although recognized as causing emerging and re-emerging disease outbreaks worldwide since the late 1800s, there has been growing interest in the United States and Europe in recent years in orbiviruses, their insect vectors, and the diseases they cause in domestic livestock and wildlife. This is due, in part, to the emergence of bluetongue (BT) in northern Europe in 2006-2007 resulting in a devastating outbreak, as well as severe BT outbreaks in sheep and epizootic hemorrhagic disease (EHD) outbreaks in deer and cattle in the United States. Of notable concern is the isolation of as many as 10 new BT virus (BTV) serotypes in the United States since 1999 and their associated unknowns, such as route of introduction, virulence to mammals, and indigenous competent vectors. This review, based on a gap analysis workshop composed of international experts on orbiviruses conducted in 2013, gives a global perspective of current basic virological understanding of orbiviruses, with particular attention to BTV and the closely related epizootic hemorrhagic disease virus (EHDV), and identifies a multitude of basic virology research gaps, critical for predicting and preventing outbreaks. PMID:26086555

  15. miRQuest: integration of tools on a Web server for microRNA research.

    PubMed

    Aguiar, R R; Ambrosio, L A; Sepúlveda-Hermosilla, G; Maracaja-Coutinho, V; Paschoal, A R

    2016-01-01

    This report describes miRQuest, a novel middleware available on a Web server that allows the end user to conduct miRNA research in a user-friendly way. Many prediction tools for microRNA (miRNA) identification exist, using different programming languages and methods to accomplish the task, and it is difficult to understand each tool and apply it to the diverse datasets and organisms available for miRNA analysis. miRQuest can easily be used by biologists and researchers with limited experience in bioinformatics. We built it using a middleware architecture on a Web platform for miRNA research that performs two main functions: i) integration of different miRNA prediction tools for miRNA identification in a user-friendly environment; and ii) comparison of these prediction tools. In both cases, the user provides sequences (in FASTA format) as an input set for the analysis and comparisons. All the tools were selected on the basis of a survey of the literature on the available tools for miRNA prediction. Three use cases of the tool are also described, one of which is miRNA identification analysis in 30 different species. Finally, miRQuest seems to be a novel and useful tool; it is freely available for both benchmarking and miRNA identification at http://mirquest.integrativebioinformatics.me/. PMID:27050998
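
    As the abstract notes, miRQuest takes sequences in FASTA format as input. The minimal reader below is a hypothetical illustration of that input format (it is not miRQuest's own code, and the example miRNA names and sequences are invented).

```python
# Minimal FASTA parser sketch -- illustrates the input format described
# in the abstract; not miRQuest code.
def parse_fasta(text):
    """Return a dict mapping sequence IDs to concatenated sequences."""
    records, seq_id, chunks = {}, None, []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith(">"):
            if seq_id is not None:
                records[seq_id] = "".join(chunks)
            # ID is the first whitespace-delimited token after '>'.
            seq_id, chunks = line[1:].split()[0], []
        else:
            chunks.append(line)
    if seq_id is not None:
        records[seq_id] = "".join(chunks)
    return records

example = ">mir-1\nUGGAAUGUAAAGAAGUAUGUAU\n>mir-2\nACUGGCCUACAAAGUCCCAGU\n"
print(parse_fasta(example))
```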

  16. Emerging Imaging Tools for Use with Traumatic Brain Injury Research

    PubMed Central

    Wilde, Elisabeth A.; Tong, Karen A.; Holshouser, Barbara A.

    2012-01-01

    This article identifies emerging neuroimaging measures considered by the inter-agency Pediatric Traumatic Brain Injury (TBI) Neuroimaging Workgroup. This article attempts to address some of the potential uses of more advanced forms of imaging in TBI as well as highlight some of the current considerations and unresolved challenges of using them. We summarize emerging elements likely to gain more widespread use in the coming years, because of 1) their utility in diagnosis, prognosis, and understanding the natural course of degeneration or recovery following TBI, and potential for evaluating treatment strategies; 2) the ability of many centers to acquire these data with scanners and equipment that are readily available in existing clinical and research settings; and 3) advances in software that provide more automated, readily available, and cost-effective analysis methods for large scale data image analysis. These include multi-slice CT, volumetric MRI analysis, susceptibility-weighted imaging (SWI), diffusion tensor imaging (DTI), magnetization transfer imaging (MTI), arterial spin labeling (ASL), functional MRI (fMRI), including resting state and connectivity MRI, MR spectroscopy (MRS), and hyperpolarization scanning. However, we also include brief introductions to other specialized forms of advanced imaging that currently do require specialized equipment, for example, single photon emission computed tomography (SPECT), positron emission tomography (PET), electroencephalography (EEG), and magnetoencephalography (MEG)/magnetic source imaging (MSI). Finally, we identify some of the challenges that users of the emerging imaging CDEs may wish to consider, including quality control, performing multi-site and longitudinal imaging studies, and MR scanning in infants and children. PMID:21787167

  17. Community Coordinated Modeling Center (CCMC): Providing Access to Space Weather Models and Research Support Tools

    NASA Astrophysics Data System (ADS)

    Chulaki, A.; Bakshi, S. S.; Berrios, D.; Hesse, M.; Kuznetsova, M. M.; Lee, H.; MacNeice, P. J.; Mendoza, A. M.; Mullinix, R.; Patel, K. D.; Pulkkinen, A.; Rastaetter, L.; Shim, J.; Taktakishvili, A.; Zheng, Y.

    2011-12-01

    The Community Coordinated Modeling Center at NASA Goddard Space Flight Center provides access to state-of-the-art space weather models to the research community. The majority of the models residing at the CCMC are comprehensive computationally intensive physics-based models. The CCMC also provides free services and tools to assist the research community in analyzing the results from the space weather model simulations. We present an overview of the available tools and services at the CCMC: the Runs-On-Request system, the online visualization, the Kameleon access and interpolation library and the Metrics Challenge tools suite.

  18. Translating the Theoretical into Practical: A Logical Framework of Functional Analytic Psychotherapy Interactions for Research, Training, and Clinical Purposes

    ERIC Educational Resources Information Center

    Weeks, Cristal E.; Kanter, Jonathan W.; Bonow, Jordan T.; Landes, Sara J.; Busch, Andrew M.

    2012-01-01

    Functional analytic psychotherapy (FAP) provides a behavioral analysis of the psychotherapy relationship that directly applies basic research findings to outpatient psychotherapy settings. Specifically, FAP suggests that a therapist's in vivo (i.e., in-session) contingent responding to targeted client behaviors, particularly positive reinforcement…

  19. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    ERIC Educational Resources Information Center

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  20. ResearchIQ: Design of a Semantically Anchored Integrative Query Tool

    PubMed Central

    Lele, Omkar; Raje, Satyajeet; Yen, Po-Yin; Payne, Philip

    2015-01-01

    An important factor influencing the pace of research activity is the ability of researchers to discover and leverage heterogeneous resources. Usually, researcher profiles, laboratory equipment, data samples, clinical trials, and other research resources are stored in heterogeneous datasets in large organizations. Emergent semantic web technologies provide novel approaches to discover, annotate and consequently link such resources. In this manuscript, we describe the design of the Research Integrative Query (ResearchIQ) tool, a semantically anchored resource discovery platform that facilitates semantic discovery of local and publicly available data through a single web portal designed for researchers in the biomedical informatics domain within The Ohio State University. PMID:26306248

  1. The Science of Analytic Reporting

    SciTech Connect

    Chinchor, Nancy; Pike, William A.

    2009-09-23

    The challenge of visually communicating analysis results is central to the ability of visual analytics tools to support decision making and knowledge construction. The benefit of emerging visual methods will be improved through more effective exchange of the insights generated through the use of visual analytics. This paper outlines the major requirements for next-generation reporting systems in terms of eight major research needs: the development of best practices, design automation, visual rhetoric, context and audience, connecting analysis to presentation, evidence and argument, collaborative environments, and interactive and dynamic documents. It also describes an emerging technology called Active Products that introduces new techniques for analytic process capture and dissemination.

  2. Analytical Microscopy

    SciTech Connect

    Not Available

    2006-06-01

    In the Analytical Microscopy group, within the National Center for Photovoltaics' Measurements and Characterization Division, we combine two complementary areas of analytical microscopy--electron microscopy and proximal-probe techniques--and use a variety of state-of-the-art imaging and analytical tools. We also design and build custom instrumentation and develop novel techniques that provide unique capabilities for studying materials and devices. In our work, we collaborate with you to solve materials- and device-related R&D problems. This sheet summarizes the uses and features of four major tools: transmission electron microscopy, scanning electron microscopy, the dual-beam focused-ion-beam workstation, and scanning probe microscopy.

  3. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TE) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  4. From research to management: A suite of GIS-based watershed modeling, assessment and planning tools 1889

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Automated Geospatial Watershed Assessment (AGWA) tool is a GIS-based hydrologic modeling tool developed jointly by the U.S. EPA Office of Research and Development, USDA Agricultural Research Service, and University of Arizona. It was initially designed as a research tool for assessing the hydro...

  5. The Notion of the Relationship to Knowledge: A Theoretical Tool for Research in Science Education

    ERIC Educational Resources Information Center

    Pouliot, Chantal; Bader, Barbara; Therriault, Genevieve

    2010-01-01

    This article pursues a dual objective. First, it seeks to present the notion of the relationship to knowledge as a valuable theoretical tool for science education research. Secondly, it aims to illustrate how this notion has been operationalized in recent research conducted in Quebec (Canada) that focuses on teachers' and students' relationship to…

  6. New Tools for New Literacies Research: An Exploration of Usability Testing Software

    ERIC Educational Resources Information Center

    Asselin, Marlene; Moayeri, Maryam

    2010-01-01

    Competency in the new literacies of the Internet is essential for participating in contemporary society. Researchers studying these new literacies are recognizing the limitations of traditional methodological tools and adapting new technologies and new media for use in research. This paper reports our exploration of usability testing software to…

  7. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Navidi, Fatemeh; Beard, Daniel A.; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  8. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  9. Toward a quality guide to facilitate the transference of analytical methods from research to testing laboratories: a case study.

    PubMed

    Bisetty, Krisnha; Gumede, Njabulo Joyfull; Escuder-Gilabert, Laura; Sagrado, Salvador

    2009-01-01

    At present, there is no single viewpoint that defines QA strategies in analytical chemistry. On the other hand, there are no unique protocols defining a set of analytical tasks and decision criteria to be performed during the method development phase (e.g., by a single research laboratory) in order to facilitate the transference to the testing laboratories intending to adapt, validate, and routinely use this method. This study proposes general criteria, a priori valid for any developed method, recommended as a provisional quality guide containing the minimum internal tasks necessary to publish new analytical method results. As an application, the selection of some basic internal quality tasks and the corresponding accepted criteria are adapted to a concrete case study: indirect differential pulse polarographic determination of nitrate in water samples according to European Commission requisites. Extra tasks to be performed by testing laboratories are also outlined. PMID:20166601

  10. The ABCs of incentive-based treatment in health care: a behavior analytic framework to inform research and practice

    PubMed Central

    Meredith, Steven E; Jarvis, Brantley P; Raiff, Bethany R; Rojewski, Alana M; Kurti, Allison; Cassidy, Rachel N; Erb, Philip; Sy, Jolene R; Dallery, Jesse

    2014-01-01

    Behavior plays an important role in health promotion. Exercise, smoking cessation, medication adherence, and other healthy behavior can help prevent, or even treat, some diseases. Consequently, interventions that promote healthy behavior have become increasingly common in health care settings. Many of these interventions award incentives contingent upon preventive health-related behavior. Incentive-based interventions vary considerably along several dimensions, including who is targeted in the intervention, which behavior is targeted, and what type of incentive is used. More research on the quantitative and qualitative features of many of these variables is still needed to inform treatment. However, extensive literature on basic and applied behavior analytic research is currently available to help guide the study and practice of incentive-based treatment in health care. In this integrated review, we discuss how behavior analytic research and theory can help treatment providers design and implement incentive-based interventions that promote healthy behavior. PMID:24672264

  11. High-resolution continuum source electrothermal atomic absorption spectrometry — An analytical and diagnostic tool for trace analysis

    NASA Astrophysics Data System (ADS)

    Welz, Bernhard; Borges, Daniel L. G.; Lepri, Fábio G.; Vale, Maria Goreti R.; Heitmann, Uwe

    2007-09-01

    The literature about applications of high-resolution continuum source atomic absorption spectrometry (HR-CS AAS) with electrothermal atomization is reviewed. The historic development of HR-CS AAS is briefly summarized and the main advantages of this technique, notably the 'visibility' of the spectral environment around the analytical line at high resolution and the unequaled simultaneous background correction, are discussed. Simultaneous multielement CS AAS has been realized only in a very limited number of cases. The direct analysis of solid samples appears to have gained a lot from the special features of HR-CS AAS, and the examples from the literature suggest that calibration can be carried out against aqueous standards. Low-temperature losses of nickel and vanadyl porphyrins could be detected and avoided in the analysis of crude oil due to the superior background correction system. The visibility of the spectral environment around the analytical line revealed that the absorbance signal measured for phosphorus at the 213.6 nm non-resonance line without a modifier is mostly due to the PO molecule, and not to atomic phosphorus. The future possibility of applying high-resolution continuum source molecular absorption for the determination of non-metals is discussed.

  12. TREND: a tool for rapid online research literature analysis and quantification.

    PubMed

    Landers, Richard N

    2008-08-01

    The Research Explicator for oNline Databases (TREND) tool was developed out of a need to quantify large research literatures rapidly and objectively on the basis of online research database output. By parsing such output with TREND, a researcher can in minutes extract the most commonly cited articles, the most frequently published authors, a distribution of publication dates, and a variety of other information from a research literature several thousand articles in size. This tool thus enables an increase in productivity both for researchers venturing into new areas of interest and for advisors and instructors putting together core reading lists. The processing of citations from articles represents a unique challenge, however, because deviations from strict APA formatting cause problems that are sometimes difficult to correct mechanically. A case study of one particularly troublesome citation (Baron & Kenny, 1986) is presented. Usage and implications are discussed. PMID:18697661
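
    The kind of parsing TREND automates can be sketched roughly as follows. The citation strings and the regular expression here are invented for illustration; as the abstract notes, real database output deviates from strict APA formatting and is considerably harder to handle than this sketch suggests.

```python
# Sketch: tally publication years from roughly APA-formatted citations,
# the way a tool like TREND builds a publication-date distribution.
# Citation strings below are invented examples.
import re
from collections import Counter

citations = [
    "Baron, R. M., & Kenny, D. A. (1986). The moderator-mediator ...",
    "Smith, J. (1986). An invented example article. Example Journal, 1, 1-10.",
    "Doe, A. (2001). Another invented example. Example Review, 2, 11-20.",
]

year_pattern = re.compile(r"\((\d{4})\)")   # APA-style "(YYYY)"
years = Counter()
for citation in citations:
    match = year_pattern.search(citation)
    if match:
        years[match.group(1)] += 1

print(years.most_common())  # → [('1986', 2), ('2001', 1)]
```

    A format deviation as small as "(1986)." becoming "1986." defeats this pattern, which illustrates why mechanically correcting non-standard citations (the Baron & Kenny case discussed in the abstract) is the hard part.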

  13. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long-standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  14. Research on the relationship between the working capacity of hard-alloy cutting tools and the fractal dimension of their wear

    NASA Astrophysics Data System (ADS)

    Arefiev, K.; Nesterenko, V.; Daneykina, N.

    2016-06-01

    The paper presents the results of research into the relationship between the wear resistance of K-applicability hard-alloy cutting tools and the fractal dimension of the wear surface that forms on the back side of the cutting edge when processing materials showing high adhesive activity. It has been established that the wear resistance of the tested cutting tool samples increases as the fractal dimension of their wear surface increases.
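
    A common way to estimate the fractal dimension of a wear-surface image is box counting; the sketch below is a generic illustration under that assumption, not the authors' actual procedure. It counts the boxes occupied by nonzero pixels at several scales and fits the slope of log(count) against log(1/size).

```python
# Generic box-counting fractal dimension estimate for a binary image.
# Illustrative sketch only; not the procedure from the paper.
import numpy as np

def box_count_dimension(img, sizes=(2, 4, 8, 16)):
    """Estimate the fractal dimension of the nonzero pixels of `img`."""
    counts = []
    for s in sizes:
        n = 0
        # Count boxes of side s that contain at least one nonzero pixel.
        for i in range(0, img.shape[0], s):
            for j in range(0, img.shape[1], s):
                if img[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    # Dimension = slope of log(count) versus log(1/size).
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a completely filled image is a 2-D set, dimension ≈ 2.
img = np.ones((64, 64))
print(round(box_count_dimension(img), 2))  # → 2.0
```

    For a real wear micrograph the image would first be thresholded to a binary map of the worn region, and the dimension would fall between 1 and 2 depending on how ragged the wear boundary is.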

  15. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Tawakkol, Shereen M.; Farouk, M.; Elaziz, Omar Abd; Hemdan, A.; Shehata, Mostafa A.

    2014-12-01

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to the ratio subtraction method (RSM) for determination of both drugs in commercial dosage form. The second and third methods are multivariate calibration methods, which include Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines, and the standard curves were found to be linear in the ranges of 10-60 and 2-30 for MOX and HCTZ in the EXRSM method, respectively, with a well-accepted mean correlation coefficient for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.

  16. A Tool for Measuring NASA's Aeronautics Research Progress Toward Planned Strategic Community Outcomes

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. For efficiency and speed, the tool takes advantage of a function developed in Excel's Visual Basic for Applications. The strategic planning process for determining the community Outcomes is also briefly discussed. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples of using the tool are also presented.

  17. Interactive Data Visualization for HIV Cohorts: Leveraging Data Exchange Standards to Share and Reuse Research Tools

    PubMed Central

    Blevins, Meridith; Wehbe, Firas H.; Rebeiro, Peter F.; Caro-Vega, Yanink; McGowan, Catherine C.; Shepherd, Bryan E.

    2016-01-01

    Objective To develop and disseminate tools for interactive visualization of HIV cohort data. Design and Methods If a picture is worth a thousand words, then an interactive video, composed of a long string of pictures, can produce an even richer presentation of HIV population dynamics. We developed an HIV cohort data visualization tool using open-source software (R statistical language). The tool requires that the data structure conform to the HIV Cohort Data Exchange Protocol (HICDEP), and our implementation utilized Caribbean, Central and South America network (CCASAnet) data. Results This tool currently presents patient-level data in three classes of plots: (1) Longitudinal plots showing changes in measurements viewed alongside event probability curves allowing for simultaneous inspection of outcomes by relevant patient classes. (2) Bubble plots showing changes in indicators over time allowing for observation of group level dynamics. (3) Heat maps of levels of indicators changing over time allowing for observation of spatial-temporal dynamics. Examples of each class of plot are given using CCASAnet data investigating trends in CD4 count and AIDS at antiretroviral therapy (ART) initiation, CD4 trajectories after ART initiation, and mortality. Conclusions We invite researchers interested in this data visualization effort to use these tools and to suggest new classes of data visualization. We aim to contribute additional shareable tools in the spirit of open scientific collaboration and hope that these tools further the participation in open data standards like HICDEP by the HIV research community. PMID:26963255

  18. Scientific Mobility and International Research Networks: Trends and Policy Tools for Promoting Research Excellence and Capacity Building

    ERIC Educational Resources Information Center

    Jacob, Merle; Meek, V. Lynn

    2013-01-01

    One of the ways in which globalization is manifesting itself in higher education and research is through the increasing importance and emphasis on scientific mobility. This article seeks to provide an overview and analysis of current trends and policy tools for promoting mobility. The article argues that the mobility of scientific labour is an…

  19. Numerical and analytical research of the impact of decoherence on quantum circuits

    NASA Astrophysics Data System (ADS)

    Bogdanov, Yu. I.; Chernyavskiy, A. Yu.; Bantysh, B. I.; Lukichev, V. F.; Orlikovsky, A. A.; Semenihin, I. A.; Fastovets, D. V.; Holevo, A. S.

    2014-12-01

    Three different levels of noisy quantum circuit modeling are considered: state vectors, density matrices, and Choi-Jamiolkowski related states. The implementations for personal computers and supercomputers are described, and the corresponding results are shown. For the density-matrix level, we present a fixed-rank approximation technique and show some analytical estimates of the fidelity level.

  20. Incorporating Students' Self-Designed, Research-Based Analytical Chemistry Projects into the Instrumentation Curriculum

    ERIC Educational Resources Information Center

    Gao, Ruomei

    2015-01-01

    In a typical chemistry instrumentation laboratory, students learn analytical techniques through a well-developed procedure. Such an approach, however, does not engage students in a creative endeavor. To foster the intrinsic motivation of students' desire to learn, improve their confidence in self-directed learning activities and enhance their…

  1. About Skinner and Time: Behavior-Analytic Contributions to Research on Animal Timing

    ERIC Educational Resources Information Center

    Lejeune, Helga; Richelle, Marc; Wearden, J. H.

    2006-01-01

    The article discusses two important influences of B. F. Skinner, and later workers in the behavior-analytic tradition, on the study of animal timing. The first influence is methodological, and is traced from the invention of schedules imposing temporal constraints or periodicities on animals in "The Behavior of Organisms," through the rate…

  2. Smooth Pursuit in Schizophrenia: A Meta-Analytic Review of Research since 1993

    ERIC Educational Resources Information Center

    O'Driscoll, Gillian A.; Callahan, Brandy L.

    2008-01-01

    Abnormal smooth pursuit eye-tracking is one of the most replicated deficits in the psychophysiological literature in schizophrenia [Levy, D. L., Holzman, P. S., Matthysse, S., & Mendell, N. R. (1993). "Eye tracking dysfunction and schizophrenia: A critical perspective." "Schizophrenia Bulletin, 19", 461-505]. We used meta-analytic procedures to…

  3. ANALYTIC ELEMENT GROUND WATER MODELING AS A RESEARCH PROGRAM (1980-2006)

    EPA Science Inventory

    Scientists and engineers who use the analytic element method (AEM) for solving problems of regional ground water flow may be considered a community, and this community can be studied from the perspective of history and philosophy of science. Applying the methods of the Hungarian...

  4. Taxometric and Factor Analytic Models of Anxiety Sensitivity: Integrating Approaches to Latent Structural Research

    ERIC Educational Resources Information Center

    Bernstein, Amit; Zvolensky, Michael J.; Norton, Peter J.; Schmidt, Norman B.; Taylor, Steven; Forsyth, John P.; Lewis, Sarah F.; Feldner, Matthew T.; Leen-Feldner, Ellen W.; Stewart, Sherry H.; Cox, Brian

    2007-01-01

    This study represents an effort to better understand the latent structure of anxiety sensitivity (AS), as indexed by the 16-item Anxiety Sensitivity Index (ASI; S. Reiss, R. A. Peterson, M. Gursky, & R. J. McNally, 1986), by using taxometric and factor-analytic approaches in an integrative manner. Taxometric analyses indicated that AS has a…

  5. ENVIRONMENTAL RESEARCH BRIEF : ANALYTIC ELEMENT MODELING OF GROUND-WATER FLOW AND HIGH PERFORMANCE COMPUTING

    EPA Science Inventory

    Several advances in the analytic element method have been made to enhance its performance and facilitate three-dimensional ground-water flow modeling in a regional aquifer setting. First, a new public domain modular code (ModAEM) has been developed for modeling ground-water flow ...

  6. Laboratory Research in Catalysis: Coordinating Undergraduate Analytical, Organic, and Physical Chemistry

    ERIC Educational Resources Information Center

    Rondini, Jo-Ann; And Others

    1975-01-01

    Describes a laboratory experiment designed to merge the concepts and techniques of the analytical-organic-physical subdivisions and introduce the student to a decision-making situation. Presents a discussion of the use of the experiment in attaining these goals and provides typical data obtained by students. (GS)

  7. Basics, common errors and essentials of statistical tools and techniques in anesthesiology research

    PubMed Central

    Bajwa, Sukhminder Jit Singh

    2015-01-01

    The statistical portion is a vital component of any research study. The research methodology and the application of statistical tools and techniques have evolved over the years and have significantly helped research activities throughout the globe. Results and inferences cannot be drawn accurately without proper validation with the appropriate statistical tools and tests. Evidence-based anesthesia research and practice has to incorporate statistical tools in the methodology right from the planning stage of the study itself. Though the medical fraternity is well acquainted with the significance of statistics in research, there is a lack of in-depth knowledge of the various statistical concepts and principles among the majority of researchers. The clinical impact and consequences can be serious, as incorrect analysis, conclusions, and false results may construct an artificial platform on which future research activities are replicated. The present tutorial is an attempt to make anesthesiologists aware of the various aspects of statistical methods used in evidence-based research and to highlight the common areas where the most statistical errors are committed, so that better statistical practices can be adopted. PMID:26702217

  8. The Research-Teaching Nexus: Using a Construction Teaching Event as a Research Tool

    ERIC Educational Resources Information Center

    Casanovas-Rubio, Maria del Mar; Ahearn, Alison; Ramos, Gonzalo; Popo-Ola, Sunday

    2016-01-01

    In principle, the research-teaching nexus should be seen as a two-way link, showing not only ways in which research supports teaching but also ways in which teaching supports research. In reality, the discussion has been limited almost entirely to the first of these practices. This paper presents a case study in which some student field-trip…

  9. [Research on Quantitative Analytical Model for Determination of Phosmet by Using Surface Enhanced Raman Spectroscopy].

    PubMed

    Hao, Yong; Chen, Bin

    2015-09-01

    Raman spectroscopy combined with surface-enhancement technology was adopted for the analysis of the pesticide phosmet. Continuous wavelet transform (CWT) and the successive projections algorithm (SPA) were used for Raman spectral preprocessing and characteristic Raman shift selection, respectively. Multi-linear regression (MLR) was used for spectral modeling. It is shown that enhancement chips can strengthen the Raman signal for low pesticide concentrations. CWT can improve spectral resolution and smoothness and remove translation error. Selecting characteristic Raman shifts with SPA improves analytical precision and simplifies the MLR modeling variables. The CWT-SPA-MLR model improves the prediction correlation coefficient (r) from 0.823 to 0.903 and reduces the root mean square error of prediction (RMSEP) from 1.640 to 1.122. The CWT-SPA-MLR method can be used to construct analytical models for Raman spectra and has good interpretability and repeatability. PMID:26669168
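
    Multi-linear regression on a handful of selected Raman shifts reduces to solving the normal equations. A small pure-Python sketch; the two "selected shift" intensity columns and the concentrations below are made up, and the CWT preprocessing and SPA selection steps are not reproduced here:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def mlr_fit(X, y):
    """Least-squares coefficients for y ~ 1 + X via the normal equations."""
    Z = [[1.0] + row for row in X]  # prepend an intercept column
    p = len(Z[0])
    XtX = [[sum(zi[a] * zi[b] for zi in Z) for b in range(p)] for a in range(p)]
    Xty = [sum(zi[a] * yi for zi, yi in zip(Z, y)) for a in range(p)]
    return solve(XtX, Xty)

# Made-up intensities at two "selected" Raman shifts vs. pesticide concentration
X = [[0.10, 0.30], [0.20, 0.50], [0.35, 0.80], [0.50, 1.10], [0.62, 1.45]]
y = [1.0, 2.0, 3.5, 5.0, 6.3]
beta = mlr_fit(X, y)
pred = [beta[0] + beta[1] * x1 + beta[2] * x2 for x1, x2 in X]
```

    Variable selection (as SPA does) keeps this design matrix small, which is what makes plain MLR usable on spectra.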

  10. Research on bathymetry estimation by Worldview-2 based with the semi-analytical model

    NASA Astrophysics Data System (ADS)

    Sheng, L.; Bai, J.; Zhou, G.-W.; Zhao, Y.; Li, Y.-C.

    2015-04-01

    The South Sea Islands of China lie far from the mainland; reefs make up more than 95% of the South Sea, and most of them are scattered over a disputed, sensitive area. Methods for obtaining reef bathymetry accurately are therefore urgently needed. Commonly used methods, including sonar, airborne laser, and remote sensing estimation, are limited by the long distances, large areas, and sensitive locations involved. Remote sensing data provide an effective way to estimate bathymetry over large areas without physical contact, through the relationship between spectral information and water depth. Aimed at the water quality of the South Sea of China, this paper develops a bathymetry estimation method that requires no measured water depths. First, a semi-analytical optimization model of the theoretical interpretation models was studied, using a genetic algorithm to optimize the model. Meanwhile, an OpenMP parallel computing algorithm was introduced to greatly increase the speed of the semi-analytical optimization model. One island of the South Sea of China was selected as the study area, and measured water depths were used to evaluate the accuracy of the bathymetry estimated from Worldview-2 multispectral images. The results show that the semi-analytical optimization model based on the genetic algorithm performs well in the study area, and the accuracy of the estimated bathymetry in the 0-20 m shallow-water area is acceptable. The semi-analytical optimization model based on a genetic algorithm solves the problem of bathymetry estimation without water depth measurement. Overall, this paper provides a new bathymetry estimation method for sensitive reefs far from the mainland.
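
    The genetic-algorithm optimization step can be illustrated with a toy version: candidate depths are evolved to minimize the misfit between modeled and "observed" reflectance. The exponential reflectance model and every parameter below are invented placeholders, not the paper's semi-analytical model:

```python
import math
import random

def reflectance(depth, a=0.4, k=0.12, b=0.05):
    """Toy model: bottom-reflected signal decays exponentially with depth."""
    return a * math.exp(-k * depth) + b

def misfit(depth, observed):
    """Squared difference between modeled and observed reflectance."""
    return (reflectance(depth) - observed) ** 2

def ga_estimate_depth(observed, pop_size=30, generations=60, seed=1):
    """Evolve candidate depths with tournament selection, Gaussian mutation,
    and elitism (the best candidate always survives to the next generation)."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 20.0) for _ in range(pop_size)]
    best = min(pop, key=lambda d: misfit(d, observed))
    for _ in range(generations):
        nxt = [best]  # elitism
        while len(nxt) < pop_size:
            a, b = rng.sample(pop, 2)
            parent = a if misfit(a, observed) < misfit(b, observed) else b
            child = min(20.0, max(0.0, parent + rng.gauss(0.0, 0.5)))
            nxt.append(child)
        pop = nxt
        best = min(pop, key=lambda d: misfit(d, observed))
    return best

true_depth = 8.0
obs = reflectance(true_depth)
est = ga_estimate_depth(obs)
```

    Because of elitism, the best misfit never increases across generations, so the estimate can only improve on the best random initial guess.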

  11. Research-tool patents: issues for health in the developing world.

    PubMed Central

    Barton, John H.

    2002-01-01

    The patent system is now reaching into the tools of medical research, including gene sequences themselves. Many of the new patents can potentially preempt large areas of medical research and lay down legal barriers to the development of a broad category of products. Researchers must therefore consider redesigning their research to avoid use of patented techniques, or expending the effort to obtain licences from those who hold the patents. Even if total licence fees can be kept low, there are enormous negotiation costs, and one "hold-out" may be enough to lead to project cancellation. This is making it more difficult to conduct research within the developed world, and poses important questions for the future of medical research for the benefit of the developing world. Probably the most important implication for health in the developing world is the possible general slowing down and complication of medical research. To the extent that these patents do slow down research, they weaken the contribution of the global research community to the creation and application of medical technology for the benefit of developing nations. The patents may also complicate the granting of concessional prices to developing nations - for pharmaceutical firms that seek to offer a concessional price may have to negotiate arrangements with research-tool firms, which may lose royalties as a result. Three kinds of response are plausible. One is to develop a broad or global licence to permit the patented technologies to be used for important applications in the developing world. The second is to change technical patent law doctrines. Such changes could be implemented in developed and developing nations and could be quite helpful while remaining consistent with TRIPS. The third is to negotiate specific licence arrangements, under which specific research tools are used on an agreed basis for specific applications. These negotiations are difficult and expensive, requiring both scientific and

  12. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    SciTech Connect

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g., attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
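
    Under the definition used in the abstract (risk = probability of a successful attack times the value of the resulting loss), a risk-reduction estimate for a single mitigation is straightforward. A sketch with hypothetical probabilities and loss values:

```python
def risk(p_success, loss):
    """Risk as defined in the abstract: attack-success probability times loss."""
    return p_success * loss

def risk_reduction(p_before, p_after, loss):
    """Absolute reduction in expected loss achieved by a mitigation."""
    return risk(p_before, loss) - risk(p_after, loss)

# Hypothetical scenario: a mitigation cuts attack-success probability
# from 0.20 to 0.05 against a $2M potential loss.
reduction = risk_reduction(0.20, 0.05, 2_000_000)
```

    The reduction in expected loss (here $300,000) is what a cost-benefit analysis would weigh against the mitigation's cost.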

  13. Knowledge Translation Tools are Emerging to Move Neck Pain Research into Practice

    PubMed Central

    MacDermid, Joy C.; Miller, Jordan; Gross, Anita R.

    2013-01-01

    Development or synthesis of the best clinical research is in itself insufficient to change practice. Knowledge translation (KT) is an emerging field focused on moving knowledge into practice, which is a non-linear, dynamic process that involves knowledge synthesis, transfer, adoption, implementation, and sustained use. Successful implementation requires using KT strategies based on theory, evidence, and best practice, including tools and processes that engage knowledge developers and knowledge users. Tools can provide instrumental help in implementing evidence. A variety of theoretical frameworks underlie KT and provide guidance on how tools should be developed or implemented. A taxonomy that outlines different purposes for engaging in KT and target audiences can also be useful in developing or implementing tools. Theoretical frameworks that underlie KT typically take different perspectives on KT with differential focus on the characteristics of the knowledge, knowledge users, context/environment, or the cognitive and social processes that are involved in change. Knowledge users include consumers, clinicians, and policymakers. A variety of KT tools have supporting evidence, including: clinical practice guidelines, patient decision aids, and evidence summaries or toolkits. Exemplars are provided of two KT tools to implement best practice in management of neck pain—a clinician implementation guide (toolkit) and a patient decision aid. KT frameworks, taxonomies, clinical expertise, and evidence must be integrated to develop clinical tools that implement best evidence in the management of neck pain. PMID:24155807

  14. Dynamic 3D visual analytic tools: a method for maintaining situational awareness during high tempo warfare or mass casualty operations

    NASA Astrophysics Data System (ADS)

    Lizotte, Todd E.

    2010-04-01

    Maintaining Situational Awareness (SA) is crucial to the success of high tempo operations, such as war fighting and mass casualty events (bioterrorism, natural disasters). Modern computer and software applications attempt to provide command-and-control managers with situational awareness via the collection, integration, interrogation, and display of vast amounts of analytic data in real-time from a multitude of data sources and formats [1]. At what point do the data volume and displays begin to erode the hierarchical distributive intelligence, command, and control structure of the operation taking place? In many cases, people tasked with making decisions have insufficient experience in SA of high tempo operations and become easily overwhelmed as vast amounts of data are displayed in real-time as an operation unfolds. In these situations, where data is plentiful and the relevance of the data changes rapidly, there is a chance for individuals to fixate on the data sources with which they are most familiar. If individuals fall into this pitfall, they will exclude other data that might be just as important to the success of the operation. To counter these issues, it is important that computer and software applications provide a means for prompting users to take notice of adverse conditions or trends that are critical to the operation. This paper will discuss a new method of displaying data, called a Crisis View™, that monitors critical variables that are dynamically changing and allows preset thresholds to be created to prompt the user when decisions need to be made and when adverse or positive trends are detected. The new method will be explained in basic terms, with examples of its attributes and how it can be implemented.

  15. Production Workers' Literacy and Numeracy Practices: Using Cultural-Historical Activity Theory (CHAT) as an Analytical Tool

    ERIC Educational Resources Information Center

    Yasukawa, Keiko; Brown, Tony; Black, Stephen

    2013-01-01

    Public policy discourses claim that there is a "crisis" in the literacy and numeracy levels of the Australian workforce. In this paper, we propose a methodology for examining this "crisis" from a critical perspective. We draw on findings from an ongoing research project by the authors which investigates production workers'…

  16. The Solid Earth Research and Teaching Environment, a new software framework to share research tools in the classroom and across disciplines

    NASA Astrophysics Data System (ADS)

    Milner, K.; Becker, T. W.; Boschi, L.; Sain, J.; Schorlemmer, D.; Waterhouse, H.

    2009-12-01

    The Solid Earth Teaching and Research Environment (SEATREE) is a modular and user-friendly software framework to facilitate the use of solid Earth research tools in the classroom and for interdisciplinary research collaboration. SEATREE is open source and community developed, distributed freely under the GNU General Public License. It is a fully contained package that lets users operate in a graphical mode, while giving more advanced users the opportunity to view and modify the source code. Top-level graphical user interfaces, which initiate the calculations and visualize results, are written in the Python programming language using an object-oriented, modern design. Results are plotted with either Matlab-like Python libraries or SEATREE's own Generic Mapping Tools wrapper. The underlying computational codes used to produce the results can be written in any programming language and accessed through Python wrappers. There are currently four fully developed science modules for SEATREE: (1) HC is a global geodynamics tool built on a semi-analytical mantle-circulation program by B. Steinberger, Becker, and C. O'Neill. HC can compute velocities and tractions for global, spherical Stokes flow and radial viscosity variations. HC is fast enough to be used for classroom instruction, for example to let students interactively explore the role of radial viscosity variations for global geopotential (geoid) anomalies. (2) ConMan wraps Scott King's 2D finite element mantle convection code, allowing users to quickly observe how modifications to input parameters affect heat flow over time. As seismology modules, SEATREE includes (3) Larry, a global surface wave phase-velocity inversion tool, and (4) Syn2D, a Cartesian tomography teaching tool for ray-theory wave propagation in synthetic, arbitrary velocity structure in the presence of noise. Both underlying programs were contributed by Boschi. Using Syn2D, students can explore, for example, how well a given

  17. Research-IQ: Development and Evaluation of an Ontology-anchored Integrative Query Tool

    PubMed Central

    Borlawsky, Tara B.; Lele, Omkar; Payne, Philip R. O.

    2011-01-01

    Investigators in the translational research and systems medicine domains require highly usable, efficient and integrative tools and methods that allow for the navigation of and reasoning over emerging large-scale data sets. Such resources must cover a spectrum of granularity from bio-molecules to population phenotypes. Given such information needs, we report upon the initial design and evaluation of an ontology-anchored integrative query tool, Research-IQ, which employs a combination of conceptual knowledge engineering and information retrieval techniques to enable the intuitive and rapid construction of queries, in terms of semi-structured textual propositions, that can subsequently be applied to integrative data sets. Our initial results, based upon both quantitative and qualitative evaluations of the efficacy and usability of Research-IQ, demonstrate its potential to increase clinical and translational research throughput. PMID:21821150

  18. Near infra red spectroscopy as a multivariate process analytical tool for predicting pharmaceutical co-crystal concentration.

    PubMed

    Wood, Clive; Alwati, Abdolati; Halsey, Sheelagh; Gough, Tim; Brown, Elaine; Kelly, Adrian; Paradkar, Anant

    2016-09-10

    The use of near infra red spectroscopy to predict the concentration of two pharmaceutical co-crystals; 1:1 ibuprofen-nicotinamide (IBU-NIC) and 1:1 carbamazepine-nicotinamide (CBZ-NIC) has been evaluated. A partial least squares (PLS) regression model was developed for both co-crystal pairs using sets of standard samples to create calibration and validation data sets with which to build and validate the models. Parameters such as the root mean square error of calibration (RMSEC), root mean square error of prediction (RMSEP) and correlation coefficient were used to assess the accuracy and linearity of the models. Accurate PLS regression models were created for both co-crystal pairs which can be used to predict the co-crystal concentration in a powder mixture of the co-crystal and the active pharmaceutical ingredient (API). The IBU-NIC model had smaller errors than the CBZ-NIC model, possibly due to the complex CBZ-NIC spectra which could reflect the different arrangement of hydrogen bonding associated with the co-crystal compared to the IBU-NIC co-crystal. These results suggest that NIR spectroscopy can be used as a PAT tool during a variety of pharmaceutical co-crystal manufacturing methods and the presented data will facilitate future offline and in-line NIR studies involving pharmaceutical co-crystals. PMID:27429366
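
    The RMSEC and RMSEP figures used to assess such PLS models are plain root-mean-square errors over the calibration and validation sets. A sketch with invented reference/predicted concentrations (not the study's values):

```python
import math

def rmse(reference, predicted):
    """Root mean square error between reference and model-predicted values."""
    n = len(reference)
    return math.sqrt(sum((r - p) ** 2 for r, p in zip(reference, predicted)) / n)

# Invented co-crystal concentrations (%) for a calibration set and an
# independent validation (prediction) set
cal_ref  = [10.0, 25.0, 50.0, 75.0, 90.0]
cal_pred = [10.8, 24.1, 50.9, 74.2, 90.6]
val_ref  = [20.0, 40.0, 60.0, 80.0]
val_pred = [21.5, 38.7, 61.2, 78.9]

rmsec = rmse(cal_ref, cal_pred)  # root mean square error of calibration
rmsep = rmse(val_ref, val_pred)  # root mean square error of prediction
```

    RMSEP computed on samples the model never saw is normally somewhat larger than RMSEC, as in this toy example.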

  19. Family Myths, Beliefs, and Customs as a Research/Educational Tool to Explore Identity Formation

    ERIC Educational Resources Information Center

    Herman, William E.

    2008-01-01

    This paper outlines a qualitative research tool designed to explore personal identity formation as described by Erik Erikson and offers self-reflective and anonymous evaluative comments made by college students after completing this task. Subjects compiled a list of 200 myths, customs, fables, rituals, and beliefs from their family of origin and…

  20. Improving the Usefulness of Concept Maps as a Research Tool for Science Education

    ERIC Educational Resources Information Center

    Van Zele, Els; Lenaerts, Josephina; Wieme, Willem

    2004-01-01

    The search for authentic science research tools to evaluate student understanding in a hybrid learning environment with a large multimedia component has resulted in the use of concept maps as a representation of student's knowledge organization. One hundred and seventy third-semester introductory university-level engineering students represented…

  1. Handbook of Research on Technology Tools for Real-World Skill Development (2 Volumes)

    ERIC Educational Resources Information Center

    Rosen, Yigel, Ed.; Ferrara, Steve, Ed.; Mosharraf, Maryam, Ed.

    2016-01-01

    Education is expanding to include a stronger focus on the practical application of classroom lessons in an effort to prepare the next generation of scholars for a changing world economy centered on collaborative and problem-solving skills for the digital age. "The Handbook of Research on Technology Tools for Real-World Skill Development"…

  2. A Portfolio Analysis Tool for Measuring NASA's Aeronautics Research Progress toward Planned Strategic Outcomes

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.

  3. Qualitative and Quantitative Management Tools Used by Financial Officers in Public Research Universities

    ERIC Educational Resources Information Center

    Trexler, Grant Lewis

    2012-01-01

    This dissertation set out to identify effective qualitative and quantitative management tools used by financial officers (CFOs) in carrying out their management functions of planning, decision making, organizing, staffing, communicating, motivating, leading and controlling at a public research university. In addition, impediments to the use of…

  4. Community-based research as a tool for empowerment: the Haida Gwaii Diabetes Project example.

    PubMed

    Herbert, C P

    1996-01-01

    The evolution of the Haida Gwaii Diabetes Project exemplifies how community-based family practice research can be a tool for empowerment for both the community of research participants and the community-based members of the research team. The aims of the project are to develop a better understanding of Haida beliefs about diabetes; to develop culturally sensitive approaches to prevention and management; and to attempt to apply this understanding to the development of a model for preventive health for native people in the province of British Columbia. A participatory research paradigm, coupled with explicit working principles by which the research team agreed to operate, addressed the concerns that the Aboriginal community had about the risks of research. A true working partnership has developed among all members of the research team, and with the Haida community. PMID:8753639

  5. DMPwerkzeug - A tool to support the planning, implementation, and organization of research data management.

    NASA Astrophysics Data System (ADS)

    Klar, Jochen; Engelhardt, Claudia; Neuroth, Heike; Enke, Harry

    2016-04-01

    Following the call to make the results of publicly funded research openly accessible, more and more funding agencies demand the submission of a data management plan (DMP) as part of the application process. These documents specify how the data management of the project is organized and which datasets will be published when. Of particular importance for European researchers is the Open Research Data Pilot of Horizon 2020, which requires data management plans for a set of 9 selected research fields from social sciences to nanotechnology. In order to assist researchers in creating these documents, several institutions have developed dedicated software tools. The most well known are DMPonline by the Digital Curation Centre (DCC) and DMPtool by the California Digital Library (CDL), both extensive and well received web applications. The core functionality of these tools is the assisted editing of the DMP templates provided by the particular funding agency. While this is certainly helpful, especially in an environment with a plethora of different funding agencies like the UK or the USA, these tools are somewhat limited to this particular task and do not utilise the full potential of DMPs. Beyond the purpose of fulfilling funder requirements, DMPs can be useful for a number of additional tasks. In the initial conception phase of a project, they can be used as a planning tool to determine which data management activities and measures are necessary throughout the research process, to assess which resources are needed, and which institutions (computing centers, libraries, data centers) should be involved. During the project, they can act as a constant reference or guideline for the handling of research data. They also determine where the data will be stored after the project has ended and whether it can be accessed by the public, helping to take into account the resulting requirements of the data center, and the actions necessary to ensure re-usability by others, from early on.
Ideally, a DMP

  6. An emerging micro-scale immuno-analytical diagnostic tool to see the unseen. Holding promise for precision medicine and P4 medicine.

    PubMed

    Guzman, Norberto A; Guzman, Daniel E

    2016-05-15

    Over the years, analytical chemistry and immunology have contributed significantly to the field of clinical diagnosis by introducing quantitative techniques that can detect crucial and distinct chemical, biochemical and cellular biomarkers present in biosamples. Currently, quantitative two-dimensional hybrid immuno-analytical separation technologies are emerging as powerful tools for the sequential isolation, separation and detection of protein panels, including those with subtle structural changes such as variants, isoforms, peptide fragments, and post-translational modifications. One such technique to perform this challenging task is immunoaffinity capillary electrophoresis (IACE), which combines the use of antibodies and/or other affinity ligands as highly selective capture agents with the superior resolving power of capillary electrophoresis. Since affinity ligands can be polyreactive, i.e., binding and capturing more than one molecule, they may generate false positive results when tested under mono-dimensional procedures; one such application is the enzyme-linked immunosorbent assay (ELISA). IACE, on the other hand, is a two-dimensional technique that captures (isolation and enrichment), releases, separates and detects (quantification, identification and characterization) a single analyte or a panel of analytes from a sample, when coupled to one or more detectors simultaneously, without producing false positive or false negative data. This disruptive technique, capable of on-line preconcentration, results in enhanced sensitivity even in the analysis of complex matrices, and may change the traditional system of testing biomarkers to obtain more accurate diagnosis of diseases, ideally before symptoms of a specific disease manifest.
In this manuscript, we will present examples of the determination of biomarkers by IACE and the design of a miniaturized multi-dimensional IACE apparatus capable of improved sensitivity, specificity and throughput, with the potential of being used

  7. Challenges for Visual Analytics

    SciTech Connect

    Thomas, James J.; Kielman, Joseph

    2009-09-23

    Visual analytics has seen unprecedented growth in its first five years of mainstream existence. Great progress has been made in a short time, yet great challenges must be met in the next decade to provide new technologies that will be widely accepted by societies throughout the world. This paper sets the stage for some of those challenges in an effort to provide the stimulus for the research, both basic and applied, to address and exceed the envisioned potential for visual analytics technologies. We start with a brief summary of the initial challenges, followed by a discussion of the initial driving domains and applications, as well as additional applications and domains that have been a part of recent rapid expansion of visual analytics usage. We look at the common characteristics of several tools illustrating emerging visual analytics technologies, and conclude with the top ten challenges for the field of study. We encourage feedback and collaborative participation by members of the research community, the wide array of user communities, and private industry.

  8. Meta-Analytic Synthesis of Studies Conducted at Marzano Research Laboratory on Instructional Strategies

    ERIC Educational Resources Information Center

    Haystead, Mark W.; Marzano, Robert J.

    2009-01-01

    This is a summary of 300 plus studies from Marzano Research Laboratory (MRL) on instructional strategies. This report synthesizes a series of action research projects conducted between the fall of 2004 and the spring of 2009. The data used for analysis can be found in MRL's Action Research Meta-Analysis Database. Appended are: (1) Instructions for…

  9. An Analytic Study of the Professional Development Research in Early Childhood Education

    ERIC Educational Resources Information Center

    Schachter, Rachel E.

    2015-01-01

    Research Findings: The goal of this study was to examine empirical research on the design, delivery, and measurement of the effects of professional development (PD) for early childhood educators in order to provide insight into what the field has accomplished as well as suggest directions for future PD programs and research. Through the use of…

  10. Policy-Making Theory as an Analytical Framework in Policy Analysis: Implications for Research Design and Professional Advocacy.

    PubMed

    Sheldon, Michael R

    2016-01-01

    Policy studies are a recent addition to the American Physical Therapy Association's Research Agenda and are critical to our understanding of various federal, state, local, and organizational policies on the provision of physical therapist services across the continuum of care. Policy analyses that help to advance the profession's various policy agendas will require relevant theoretical frameworks to be credible. The purpose of this perspective article is to: (1) demonstrate the use of a policy-making theory as an analytical framework in a policy analysis and (2) discuss how sound policy analysis can assist physical therapists in becoming more effective change agents, policy advocates, and partners with other relevant stakeholder groups. An exploratory study of state agency policy responses to address work-related musculoskeletal disorders is provided as a contemporary example to illustrate key points and to demonstrate the importance of selecting a relevant analytical framework based on the context of the policy issue under investigation. PMID:26450973

  11. An experimental and analytical method for approximate determination of the tilt rotor research aircraft rotor/wing download

    NASA Technical Reports Server (NTRS)

    Jordon, D. E.; Patterson, W.; Sandlin, D. R.

    1985-01-01

    The XV-15 Tilt Rotor Research Aircraft download phenomenon was analyzed. This phenomenon is a direct result of the two rotor wakes impinging on the wing upper surface when the aircraft is in the hover configuration. For this study the analysis proceeded along two lines. First was a method whereby results from actual hover tests of the XV-15 aircraft were combined with drag coefficient results from wind tunnel tests of a wing that was representative of the aircraft wing. Second, an analytical method was used that modeled the airflow caused by the two rotors. Formulas were developed in such a way that a computer program could be used to calculate the axial velocities. These velocities were then used in conjunction with the aforementioned wind tunnel drag coefficient results to produce download values. An attempt was made to validate the analytical results by modeling a model rotor system for which direct download values were determined.

  12. Stone tool analysis and human origins research: some advice from Uncle Screwtape.

    PubMed

    Shea, John J

    2011-01-01

    The production of purposefully fractured stone tools with functional, sharp cutting edges is a uniquely derived hominin adaptation. In the long history of life on earth, only hominins have adopted this remarkably expedient and broadly effective technological strategy. In the paleontological record, flaked stone tools are irrefutable proof that hominins were present at a particular place and time. Flaked stone tools are found in contexts ranging from the Arctic to equatorial rainforests and on every continent except Antarctica. Paleolithic stone tools show complex patterns of variability, suggesting that they have been subject to the variable selective pressures that have shaped so many other aspects of hominin behavior and morphology. There is every reason to expect that insights gained from studying stone tools should provide vital and important information about the course of human evolution. And yet, one senses that archeological analyses of Paleolithic stone tools are not making as much of a contribution as they could to the major issues in human origins research. PMID:22034103

  13. Virtual Globes and Glacier Research: Integrating research, collaboration, logistics, data archival, and outreach into a single tool

    NASA Astrophysics Data System (ADS)

    Nolan, M.

    2006-12-01

    Virtual Globes (VGs) represent a paradigm shift in the way earth sciences are conducted. With these tools, nearly all aspects of earth science can be integrated, from field science, to remote sensing, to remote collaborations, to logistical planning, to data archival/retrieval, to PDF paper retrieval, to education and outreach. Here we present an example of how VGs can be fully exploited for field sciences, using research at McCall Glacier, in Arctic Alaska.

  14. Building genetic tools in Drosophila research: an interview with Gerald Rubin.

    PubMed

    2016-04-01

    Gerald (Gerry) Rubin, pioneer in Drosophila genetics, is Founding Director of the HHMI-funded Janelia Research Campus. In this interview, Gerry recounts key events and collaborations that have shaped his unique approach to scientific exploration, decision-making, management and mentorship - an approach that forms the cornerstone of the model adopted at Janelia to tackle problems in interdisciplinary biomedical research. Gerry describes his remarkable journey from newcomer to internationally renowned leader in the fly field, highlighting his contributions to the tools and resources that have helped establish Drosophila as an important model in translational research. Describing himself as a 'tool builder', his current focus is on developing approaches for in-depth study of the fly nervous system, in order to understand key principles in neurobiology. Gerry was interviewed by Ross Cagan, Senior Editor of Disease Models & Mechanisms. PMID:27053132

  15. Development of an Accessible Self-Assessment Tool for Research Ethics Committees in Developing Countries

    PubMed Central

    Sleem, Hany; Abdelhai, Rehab Abdelhai Ahmed; Al-Abdallat, Imad; Al-Naif, Mohammed; Gabr, Hala Mansour; Kehil, Et-taher; Sadiq, Bakr Bin; Yousri, Reham; Elsayed, Dyaeldin; Sulaiman, Suad; Silverman, Henry

    2011-01-01

    In response to increased research being performed in developing countries, many research ethics committees (RECs) have been established, but the quality of their ethics review systems remains unknown. Evaluating the performance of an REC remains a challenging task. Absent an accreditation process, a self-assessment mechanism would provide RECs a way to review their policies and processes against recognized international standards. We describe a self-assessment tool that was developed and reviewed by REC members and researchers from the Middle East. This tool reflects pragmatic aspects of human subjects protection, is based on international standards, is straightforward in its completion, and its items are relevant to the administrative processes that exist in many RECs in the developing world. PMID:20831423

  16. Building genetic tools in Drosophila research: an interview with Gerald Rubin

    PubMed Central

    2016-01-01

    Gerald (Gerry) Rubin, pioneer in Drosophila genetics, is Founding Director of the HHMI-funded Janelia Research Campus. In this interview, Gerry recounts key events and collaborations that have shaped his unique approach to scientific exploration, decision-making, management and mentorship – an approach that forms the cornerstone of the model adopted at Janelia to tackle problems in interdisciplinary biomedical research. Gerry describes his remarkable journey from newcomer to internationally renowned leader in the fly field, highlighting his contributions to the tools and resources that have helped establish Drosophila as an important model in translational research. Describing himself as a ‘tool builder’, his current focus is on developing approaches for in-depth study of the fly nervous system, in order to understand key principles in neurobiology. Gerry was interviewed by Ross Cagan, Senior Editor of Disease Models & Mechanisms. PMID:27053132

  17. The Association of Religion Data Archives (ARDA): Online Research Data, Tools, and References

    PubMed Central

    Finke, Roger; Adamczyk, Amy

    2014-01-01

    The Association of Religion Data Archives (ARDA) currently archives over 400 local, national, and international data files, and offers a wide range of research tools to build surveys, preview data on-line, develop customized maps and reports of U.S. church membership, and examine religion differences across nations and regions of the world. The ARDA also supports reference and teaching tools that draw on the rich data archive. This research note offers a brief introduction to the quantitative data available for exploration or download, and a few of the website features most useful for research and teaching. Supported by the Lilly Endowment, the John Templeton Foundation, the Pennsylvania State University, and the Baylor Institute for Studies of Religion, all data downloads and online services are free of charge. PMID:25484914

  18. Information Technology Research Services: Powerful Tools to Keep Up with a Rapidly Moving Field

    NASA Technical Reports Server (NTRS)

    Hunter, Paul

    2010-01-01

    Many firms offer Information Technology Research reports, analyst calls, conferences, seminars, tools, leadership development, etc. These entities include Gartner, Forrester Research, IDC, The Burton Group, Society for Information Management, InfoTech Research, The Corporate Executive Board, and so on. This talk will cover how a number of such services are being used at the Goddard Space Flight Center to improve our IT management practices, workforce skills, approach to innovation, and service delivery. These tools and services are used across the workforce, from the executive leadership to the IT worker. The presentation will cover the types of services each vendor provides and their primary engagement model. The use of these services at other NASA Centers and Headquarters will be included. In addition, I will explain how two of these services are available now to the entire NASA IT workforce through enterprise-wide subscriptions.

  19. The Biobanking Analysis Resource Catalogue (BARCdb): a new research tool for the analysis of biobank samples

    PubMed Central

    Galli, Joakim; Oelrich, Johan; Taussig, Michael J.; Andreasson, Ulrika; Ortega-Paino, Eva; Landegren, Ulf

    2015-01-01

    We report the development of a new database of technology services and products for analysis of biobank samples in biomedical research. BARCdb, the Biobanking Analysis Resource Catalogue (http://www.barcdb.org), is a freely available web resource, listing expertise and molecular resource capabilities of research centres and biotechnology companies. The database is designed for researchers who require information on how to make best use of valuable biospecimens from biobanks and other sample collections, focusing on the choice of analytical techniques and the demands they make on the type of samples, pre-analytical sample preparation and amounts needed. BARCdb has been developed as part of the Swedish biobanking infrastructure (BBMRI.se), but now welcomes submissions from service providers throughout Europe. BARCdb can help match resource providers with potential users, stimulating transnational collaborations and ensuring compatibility of results from different labs. It can promote a more optimal use of European resources in general, both with respect to standard and more experimental technologies, as well as for valuable biobank samples. This article describes how information on service and reagent providers of relevant technologies is made available on BARCdb, and how this resource may contribute to strengthening biomedical research in academia and in the biotechnology and pharmaceutical industries. PMID:25336620

  20. The Biobanking Analysis Resource Catalogue (BARCdb): a new research tool for the analysis of biobank samples.

    PubMed

    Galli, Joakim; Oelrich, Johan; Taussig, Michael J; Andreasson, Ulrika; Ortega-Paino, Eva; Landegren, Ulf

    2015-01-01

    We report the development of a new database of technology services and products for analysis of biobank samples in biomedical research. BARCdb, the Biobanking Analysis Resource Catalogue (http://www.barcdb.org), is a freely available web resource, listing expertise and molecular resource capabilities of research centres and biotechnology companies. The database is designed for researchers who require information on how to make best use of valuable biospecimens from biobanks and other sample collections, focusing on the choice of analytical techniques and the demands they make on the type of samples, pre-analytical sample preparation and amounts needed. BARCdb has been developed as part of the Swedish biobanking infrastructure (BBMRI.se), but now welcomes submissions from service providers throughout Europe. BARCdb can help match resource providers with potential users, stimulating transnational collaborations and ensuring compatibility of results from different labs. It can promote a more optimal use of European resources in general, both with respect to standard and more experimental technologies, as well as for valuable biobank samples. This article describes how information on service and reagent providers of relevant technologies is made available on BARCdb, and how this resource may contribute to strengthening biomedical research in academia and in the biotechnology and pharmaceutical industries. PMID:25336620

  1. The Scottish Government's Rural and Environmental Science and Analytical Services Strategic Research Programme

    NASA Astrophysics Data System (ADS)

    Dawson, Lorna; Bestwick, Charles

    2013-04-01

    The Strategic Research Programme focuses on the delivery of outputs and outcomes within the major policy agenda areas of climate change, land use and food security, and to impact on the 'Wealthier', 'Healthier' and 'Greener' strategic objectives of the Scottish Government. The research is delivered through two programmes, 'Environmental Change' and 'Food, Land and People', the core strength of which is the collaboration between the Scottish Government's Main Research Providers: The James Hutton Institute, the Moredun Research Institute, the Rowett Institute of Nutrition and Health, University of Aberdeen, Scotland's Rural College, Biomathematics and Statistics Scotland, and The Royal Botanic Garden Edinburgh. The research actively seeks to inform and be informed by stakeholders from policy, farming, land use, water and energy supply, food production and manufacturing, non-governmental organisations, voluntary organisations, community groups and the general public. This presentation will provide an overview of the programme's interdisciplinary research, through examples from across the programme's themes. The examples will illustrate impact within the Strategic Programme's priorities of supporting policy and practice, contributing to economic growth and innovation, enhancing collaborative and multidisciplinary research, growing scientific resilience and delivering scientific excellence. http://www.scotland.gov.uk/Topics/Research/About/EBAR/StrategicResearch/future-research-strategy/Themes/ http://www.knowledgescotland.org/news.php?article_id=295

  2. Open Support Platform for Environmental Research (OSPER) - tools for the discovery and exploitation of environmental data

    NASA Astrophysics Data System (ADS)

    Dawes, N. M.; Lehning, M.; Bavay, M.; Sarni, S.; Iosifescu, I.; Gwadera, R.; Scipion, D. E.; Blanchet, J.; Davison, A.; Berne, A.; Hurni, L.; Parlange, M. B.; Aberer, K.

    2012-12-01

    The Open Support Platform for Environmental Research (OSPER) has been launched to take forward key data management components developed under the Swiss Experiment platform project to achieve improved usability and a wider scope. With this project, we aim to connect users to data and their context, an area identified during SwissEx as having the greatest potential impact on the research community. OSPER has a clear focus on providing the technology for data storage, management and exploitation with a special focus on data interoperability and documentation. In this presentation, we will demonstrate the key aims of OSPER for the period 2012 - 2015. Inheriting the basic SwissEx functionality, OSPER provides an excellent method of making data accessible via their metadata. One of the biggest differences between the OSPER infrastructure and other data platforms is the level of interaction that one can have with the data and the level of integration with the analysis tools used in science. We wish to capitalise on this advantage by increasing this integration and working with environmental research projects to develop the tools that make a difference to their daily research. The new data infrastructure will serve the following purposes: ● Open documentation, archiving and discovery of datasets. ● Facilitation of data sharing and collaboration (especially inter-disciplinary) with data owner controlled access rights, particularly concentrating on providing as much contextual information as possible. ● Improvements in ease of data access and combination of data sources. ● Tools for data visualisation and statistical and numerical data analysis with a focus on spatial data and trends. Key areas identified for development during OSPER are: ● New infrastructure and content for current WebGIS-based data visualisation system to create a publicly available platform. ● Provision of data in standard formats using standard methods as well as the consumption of such data

  3. SPIN query tools for de-identified research on a humongous database.

    PubMed

    McDonald, Clement J; Dexter, Paul; Schadow, Gunther; Chueh, Henry C; Abernathy, Greg; Hook, John; Blevins, Lonnie; Overhage, J Marc; Berman, Jules J

    2005-01-01

    The Shared Pathology Informatics Network (SPIN), a research initiative of the National Cancer Institute, will allow for the retrieval of more than 4 million pathology reports and specimens. In this paper, we describe the special query tool as developed for the Indianapolis/Regenstrief SPIN node, integrated into the ever-expanding Indiana Network for Patient Care (INPC). This query tool allows for the retrieval of de-identified data sets using complex logic, auto-coded final diagnoses, and intrinsically supports multiple types of statistical analyses. The new SPIN/INPC database represents a new generation of the Regenstrief Medical Record system - a centralized, but federated system of repositories. PMID:16779093

  4. SPIN Query Tools for De-identified Research on a Humongous Database

    PubMed Central

    McDonald, Clement J.; Dexter, Paul; Schadow, Gunther; Chueh, Henry C.; Abernathy, Greg; Hook, John; Blevins, Lonnie; Overhage, J. Marc; Berman, Jules J.

    2005-01-01

    The Shared Pathology Informatics Network (SPIN), a research initiative of the National Cancer Institute, will allow for the retrieval of more than 4 million pathology reports and specimens. In this paper, we describe the special query tool as developed for the Indianapolis/Regenstrief SPIN node, integrated into the ever-expanding Indiana Network for Patient Care (INPC). This query tool allows for the retrieval of de-identified data sets using complex logic, auto-coded final diagnoses, and intrinsically supports multiple types of statistical analyses. The new SPIN/INPC database represents a new generation of the Regenstrief Medical Record system – a centralized, but federated system of repositories. PMID:16779093

  5. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    PubMed

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage(©) guides users through the collection of their family health history by relative, generates a pedigree, completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine. PMID:20424421

  6. Dietary MicroRNA Database (DMD): An Archive Database and Analytic Tool for Food-Borne microRNAs

    PubMed Central

    Chiang, Kevin; Shu, Jiang; Zempleni, Janos; Cui, Juan

    2015-01-01

    With the advent of high throughput technology, a huge amount of microRNA information has been added to the growing body of knowledge for non-coding RNAs. Here we present the Dietary MicroRNA Databases (DMD), the first repository for archiving and analyzing the published and novel microRNAs discovered in dietary resources. Currently there are fifteen types of dietary species, such as apple, grape, cow milk, and cow fat, included in the database originating from 9 plant and 5 animal species. Annotation for each entry, a mature microRNA indexed as DM0000*, covers information of the mature sequences, genome locations, hairpin structures of parental pre-microRNAs, cross-species sequence comparison, disease relevance, and the experimentally validated gene targets. Furthermore, a few functional analyses including target prediction, pathway enrichment and gene network construction have been integrated into the system, which enable users to generate functional insights through viewing the functional pathways and building protein-protein interaction networks associated with each microRNA. Another unique feature of DMD is that it provides a feature generator where a total of 411 descriptive attributes can be calculated for any given microRNAs based on their sequences and structures. DMD would be particularly useful for research groups studying microRNA regulation from a nutrition point of view. The database can be accessed at http://sbbi.unl.edu/dmd/. PMID:26030752

  7. Dietary MicroRNA Database (DMD): An Archive Database and Analytic Tool for Food-Borne microRNAs.

    PubMed

    Chiang, Kevin; Shu, Jiang; Zempleni, Janos; Cui, Juan

    2015-01-01

    With the advent of high throughput technology, a huge amount of microRNA information has been added to the growing body of knowledge for non-coding RNAs. Here we present the Dietary MicroRNA Databases (DMD), the first repository for archiving and analyzing the published and novel microRNAs discovered in dietary resources. Currently there are fifteen types of dietary species, such as apple, grape, cow milk, and cow fat, included in the database originating from 9 plant and 5 animal species. Annotation for each entry, a mature microRNA indexed as DM0000*, covers information of the mature sequences, genome locations, hairpin structures of parental pre-microRNAs, cross-species sequence comparison, disease relevance, and the experimentally validated gene targets. Furthermore, a few functional analyses including target prediction, pathway enrichment and gene network construction have been integrated into the system, which enable users to generate functional insights through viewing the functional pathways and building protein-protein interaction networks associated with each microRNA. Another unique feature of DMD is that it provides a feature generator where a total of 411 descriptive attributes can be calculated for any given microRNAs based on their sequences and structures. DMD would be particularly useful for research groups studying microRNA regulation from a nutrition point of view. The database can be accessed at http://sbbi.unl.edu/dmd/. PMID:26030752

  8. Integrated Decision-Making Tool to Develop Spent Fuel Strategies for Research Reactors

    SciTech Connect

    Beatty, Randy L; Harrison, Thomas J

    2016-01-01

    IAEA Member States operating or having previously operated a Research Reactor are responsible for the safe and sustainable management and disposal of associated radioactive waste, including research reactor spent nuclear fuel (RRSNF). This includes the safe disposal of RRSNF or the corresponding equivalent waste returned after spent fuel reprocessing. One key challenge to developing general recommendations lies in the diversity of spent fuel types, locations and national/regional circumstances, rather than in mass or volume alone. This is especially true given that RRSNF inventories are relatively small, and research reactors are rarely operated at a power level or duration typical of commercial power plants. Presently, many countries lack an effective long-term policy for managing RRSNF. This paper presents results of the International Atomic Energy Agency (IAEA) Coordinated Research Project (CRP) #T33001 on Options and Technologies for Managing the Back End of the Research Reactor Nuclear Fuel Cycle, which includes an integrated decision-making tool called BRIDE (Back-end Research reactor Integrated Decision Evaluation). This is a multi-attribute decision-making tool that combines the total estimated cost of each life-cycle scenario with non-economic factors such as public acceptance and technical maturity, and ranks optional back-end scenarios specific to Member States' situations in order to develop a specific Member State strategic plan with a preferred or recommended option for managing spent fuel from Research Reactors.

  9. Using Digital Video as a Research Tool: Ethical Issues for Researchers

    ERIC Educational Resources Information Center

    Schuck, Sandy; Kearney, Matthew

    2006-01-01

    Digital video and accompanying editing software are increasingly becoming more accessible for researchers in terms of ease of use and cost. The rich, visually appealing and seductive nature of video-based data can convey a strong sense of direct experience with the phenomena studied (Pea, 1999). However, the ease of selection and editing of…

  10. A New Tool for Identifying Research Standards and Evaluating Research Performance

    ERIC Educational Resources Information Center

    Bacon, Donald R.; Paul, Pallab; Stewart, Kim A.; Mukhopadhyay, Kausiki

    2012-01-01

    Much has been written about the evaluation of faculty research productivity in promotion and tenure decisions, including many articles that seek to determine the rank of various marketing journals. Yet how faculty evaluators combine journal quality, quantity, and author contribution to form judgments of a scholar's performance is unclear. A…

  11. About Skinner and time: behavior-analytic contributions to research on animal timing.

    PubMed

    Lejeune, Helga; Richelle, Marc; Wearden, J H

    2006-01-01

    The article discusses two important influences of B. F. Skinner, and later workers in the behavior-analytic tradition, on the study of animal timing. The first influence is methodological, and is traced from the invention of schedules imposing temporal constraints or periodicities on animals in The Behavior of Organisms, through the rate differentiation procedures of Schedules of Reinforcement, to modern temporal psychophysics in animals. The second influence has been the development of accounts of animal timing that have tried to avoid reference to internal processes of a cognitive sort, in particular internal clock mechanisms. Skinner's early discussion of temporal control is first reviewed, and then three recent theories, Killeen & Fetterman's (1988) Behavioral Theory of Timing, Machado's (1997) Learning to Time, and Dragoi, Staddon, Palmer, & Buhusi's (2003) Adaptive Timer Model, are discussed and evaluated. PMID:16602380

  12. Conceptual framework for outcomes research studies of hepatitis C: an analytical review

    PubMed Central

    Sbarigia, Urbano; Denee, Tom R; Turner, Norris G; Wan, George J; Morrison, Alan; Kaufman, Anna S; Rice, Gary; Dusheiko, Geoffrey M

    2016-01-01

    Hepatitis C virus infection is one of the main causes of chronic liver disease worldwide. Until recently, the standard antiviral regimen for hepatitis C was a combination of an interferon derivative and ribavirin, but a plethora of new antiviral drugs is becoming available. While these new drugs have shown great efficacy in clinical trials, observational studies are needed to determine their effectiveness in clinical practice. Previous observational studies have shown that multiple factors, besides the drug regimen, affect patient outcomes in clinical practice. Here, we provide an analytical review of published outcomes studies of the management of hepatitis C virus infection. A conceptual framework defines the relationships between four categories of variables: health care system structure, patient characteristics, process-of-care, and patient outcomes. This framework can provide a starting point for outcomes studies addressing the use and effectiveness of new antiviral drug treatments. PMID:27313473

  14. A Research Analytics Framework-Supported Recommendation Approach for Supervisor Selection

    ERIC Educational Resources Information Center

    Zhang, Mingyu; Ma, Jian; Liu, Zhiying; Sun, Jianshan; Silva, Thushari

    2016-01-01

    Identifying a suitable supervisor for a new research student is vitally important for his or her academic career. Current information overload and information disorientation have posed significant challenges for new students. Existing research for supervisor identification focuses on quality assessment of candidates, but ignores indirect relevance…

  15. DataUp 2.0: Improving On a Tool For Helping Researchers Archive, Manage, and Share Their Tabular Data

    NASA Astrophysics Data System (ADS)

    Strasser, C.; Borda, S.; Cruse, P.; Kunze, J.

    2013-12-01

    There are many barriers to data management and sharing among earth and environmental scientists; among the most significant are a lack of knowledge about best practices for data management, metadata standards, or appropriate data repositories for archiving and sharing data. Last year we developed an open source web application, DataUp, to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. With funding from the NSF via a supplemental grant to the DataONE project, we are working to improve upon DataUp. Our main goal for DataUp 2.0 is to ensure organizations and repositories are able to adopt and adapt DataUp to meet their unique needs, including connecting to analytical tools, adding new metadata schema, and expanding the list of connected data repositories. DataUp is a collaborative project between the California Digital Library, DataONE, the San Diego Supercomputing Center, and Microsoft Research Connections.
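
    DataUp's first step, determining whether a file is CSV compatible, can be illustrated with a short sketch. This is not DataUp's actual implementation (the real tool inspects Excel workbooks); it assumes only that "compatible" means the text parses as CSV with a consistent column count:

```python
import csv
import io

def csv_compatible(text, expect_cols=None):
    """Rough check in the spirit of DataUp's step (1): does the text parse
    as CSV with the same number of columns in every non-empty row?"""
    try:
        rows = list(csv.reader(io.StringIO(text)))
    except csv.Error:
        return False
    widths = {len(r) for r in rows if r}  # distinct row lengths seen
    return len(widths) == 1 and (expect_cols is None or widths == {expect_cols})

print(csv_compatible("site,depth_m,temp_c\nA1,10,4.2\nA2,15,3.9"))  # → True
print(csv_compatible("site,depth_m\nA1,10,extra_field"))            # → False
```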

  16. Identifying Key Priorities for Future Palliative Care Research Using an Innovative Analytic Approach

    PubMed Central

    Riffin, Catherine; Pillemer, Karl; Chen, Emily K.; Warmington, Marcus; Adelman, Ronald D.; Reid, M. C.

    2015-01-01

    Using an innovative approach, we identified research priorities in palliative care to guide future research initiatives. We searched 7 databases (2005–2012) for review articles published on the topics of palliative and hospice/end-of-life care. The identified research recommendations (n = 648) fell into 2 distinct categories: (1) ways to improve methodological approaches and (2) specific topic areas in need of future study. The most commonly cited priority within the theme of methodological approaches was the need for enhanced rigor. Specific topics in need of future study included the perspectives and needs of patients, relatives, and providers; underrepresented populations; decision-making; cost-effectiveness; provider education; spirituality; service use; and interdisciplinary approaches to delivering palliative care. This review underscores the need for additional research on specific topics and for methodologically rigorous research to inform health policy and practice. PMID:25393169

  17. Using GeoMapApp as an Analytical Tool for the Journey From Data Visualization to Synthesis

    NASA Astrophysics Data System (ADS)

    Ryan, W. B.; Coplan, J. O.; Melkonian, A. K.; Carbotte, S. M.

    2008-12-01

    The potential to explore and understand our world has forever changed since the appearance of the NASA World Wind and Google Earth virtual globes. Now, in the duration of a single breath, we can zoom from the planetary scale of an orbiting spacecraft down to a roadside outcrop and expose layers of information with different and rich contents. But how do we digest all this information into new knowledge that explains the processes that have shaped the land and oceans into their present configurations and behaviors? In our opinion we need to transition beyond visualization to interactive inquiry of multiple datasets across a span of expertise, from the classroom to the research laboratory. Although the virtual globe offers a means of discovering information as revolutionary as the textual search engine, presently most data on the Web are not adequately described with metadata to make the subsequent steps of analysis productive. We have begun to address this limitation by linking GeoMapApp to databases in the earth and ocean sciences whose content has been vetted for thoroughness, accuracy and global coverage. With structure in the content, the virtual globe can then manipulate these databases in 'what if?' exercises, compare the various attributes of a dataset with each other via graphs and symbols, and correlate results across different scientific domains. We will show examples of such data integration using the results of four decades of ocean floor drilling, the focal mechanisms from thousands of earthquakes, and the chemistry of the volcanic bedrock along the crest of the mid-ocean ridge. A synthesis of ocean drilling shows the dependency of the sediment and faunal content on bedrock age, subsidence history and plate motions relative to the past equator and deserts. A synthesis of earthquake rupture shows focal mechanism dependency on segmentation of the plate boundaries. Patterns in the chemistry of erupted lava are intricately related to the fine

  18. Multimedia Informed Consent Tool for a Low Literacy African Research Population: Development and Pilot-Testing

    PubMed Central

    Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D’Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel

    2014-01-01

    Background International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information is not known to guarantee comprehension of study information. Objectives This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. Methods We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants’ comprehension of the study information was measured by using a validated digitised audio questionnaire. Results The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation, and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants’ ‘recall’ and ‘understanding’ between the first and second visits were statistically significant (F(1,41) = 25.38, p < 0.00001 and F(1,41) = 31.61, p < 0.00001, respectively). Conclusions Our locally developed multimedia tool was acceptable and easy to administer among low literacy participants in The Gambia. It also proved to be effective in delivering and sustaining comprehension of study information across a diverse group of participants. Additional research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings. PMID:25133065

  19. Community-based participatory research as a tool to advance environmental health sciences.

    PubMed Central

    O'Fallon, Liam R; Dearry, Allen

    2002-01-01

    The past two decades have witnessed a rapid proliferation of community-based participatory research (CBPR) projects. CBPR methodology presents an alternative to traditional population-based biomedical research practices by encouraging active and equal partnerships between community members and academic investigators. The National Institute of Environmental Health Sciences (NIEHS), the premier biomedical research facility for environmental health, is a leader in promoting the use of CBPR in instances where community-university partnerships serve to advance our understanding of environmentally related disease. In this article, the authors highlight six key principles of CBPR and describe how these principles are met within specific NIEHS-supported research investigations. These projects demonstrate that community-based participatory research can be an effective tool to enhance our knowledge of the causes and mechanisms of disorders having an environmental etiology, reduce adverse health outcomes through innovative intervention strategies and policy change, and address the environmental health concerns of community residents. PMID:11929724

  20. A new research tool for hybrid Bayesian networks using script language

    NASA Astrophysics Data System (ADS)

    Sun, Wei; Park, Cheol Young; Carvalho, Rommel

    2011-06-01

    While continuous variables are increasingly unavoidable in Bayesian networks that model real-life applications in complex systems, few software tools support them. Popular commercial Bayesian network tools such as Hugin and Netica are either expensive or must discretize continuous variables. Free programs in the literature, such as BNT and GeNIe/SMILE, have their own respective advantages and disadvantages. In this paper, we introduce a newly developed Java tool for model construction and inference in hybrid Bayesian networks. Via the representational power of a script language, the tool can build a hybrid model automatically from a well-defined string that follows a specific grammar. Furthermore, it implements several inference algorithms capable of accommodating hybrid Bayesian networks, including the Junction Tree (JT) algorithm for conditional linear Gaussian (CLG) models and Direct Message Passing (DMP) for general hybrid Bayesian networks with CLG structure. We believe this tool will be useful for researchers in the field.
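
    The abstract's idea of building a model from a well-defined string that follows a specific grammar can be sketched in a few lines of Python (the paper's tool is in Java, and the grammar below is invented for illustration, not the tool's actual syntax). Each line declares one conditional linear Gaussian node; sampling walks the nodes in the order listed, which is assumed to be ancestral:

```python
import random

def parse_clg(script):
    """Parse a tiny hypothetical CLG grammar, one node per line:
       name : comma-separated parents : intercept : comma-separated coefficients : stdev
    Nodes must be listed in topological order (parents before children)."""
    model = []
    for line in script.strip().splitlines():
        if not line.strip():
            continue
        name, parents, intercept, coeffs, sd = [f.strip() for f in line.split(":")]
        parents = [p for p in (q.strip() for q in parents.split(",")) if p]
        coeffs = [float(c) for c in coeffs.split(",") if c.strip()]
        model.append((name, parents, float(intercept), coeffs, float(sd)))
    return model

def sample(model, rng=random):
    """Ancestral sampling: each node is Gaussian with a mean linear in its parents."""
    values = {}
    for name, parents, intercept, coeffs, sd in model:
        mean = intercept + sum(c * values[p] for c, p in zip(coeffs, parents))
        values[name] = rng.gauss(mean, sd)
    return values

script = """
X :   : 0 :   : 1
Y : X : 1 : 2 : 0.5
"""
model = parse_clg(script)
print(sample(model))  # one joint draw from the two-node network
```

    Setting the standard deviations to zero makes the model deterministic, which is a convenient way to check the linear structure by hand.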

  1. A Runtime Environment for Supporting Research in Resilient HPC System Software & Tools

    SciTech Connect

    Vallee, Geoffroy R; Naughton, III, Thomas J; Boehm, Swen; Engelmann, Christian

    2013-01-01

    The high-performance computing (HPC) community continues to increase the size and complexity of hardware platforms that support advanced scientific workloads. The runtime environment (RTE) is a crucial layer in the software stack for these large-scale systems. The RTE manages the interface between the operating system and the application running in parallel on the machine. The deployment of applications and tools on large-scale HPC computing systems requires the RTE to manage process creation in a scalable manner, support sparse connectivity, and provide fault tolerance. We have developed a new RTE that provides a basis for building distributed execution environments and developing tools for HPC to aid research in system software and resilience. This paper describes the software architecture of the Scalable runTime Component Infrastructure (STCI), which is intended to provide a complete infrastructure for scalable start-up and management of many processes in large-scale HPC systems. We highlight features of the current implementation, which is provided as a system library that allows developers to easily use and integrate STCI in their tools and/or applications. The motivation for this work has been to support ongoing research activities in fault-tolerance for large-scale systems. We discuss the advantages of the modular framework employed and describe two use cases that demonstrate its capabilities: (i) an alternate runtime for a Message Passing Interface (MPI) stack, and (ii) a distributed control and communication substrate for a fault-injection tool.

  2. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  3. Automated Tools for Clinical Research Data Quality Control using NCI Common Data Elements

    PubMed Central

    Hudson, Cody L.; Topaloglu, Umit; Bian, Jiang; Hogan, William; Kieber-Emmons, Thomas

    2014-01-01

    Clinical research data generated by a federation of collection mechanisms and systems are often highly dissimilar and of varying quality. Poor data quality can result in the inefficient use of research data or can even require repetition of the performed studies, a costly process. This work presents two tools for improving the quality of clinical research data, relying on the National Cancer Institute’s Common Data Elements (CDEs) as a standard representation of possible questions and data elements: (A) a tool that automatically suggests CDE annotations for already-collected data, based on semantic and syntactic analysis using the Unified Medical Language System (UMLS) Terminology Services’ Metathesaurus, and (B) a simple-to-use “CDE Browser” for annotating and constraining new clinical research questions. These tools are built and tested on the open-source LimeSurvey software and on research data identified to contain various data quality issues captured by the Comprehensive Research Informatics Suite (CRIS) at the University of Arkansas for Medical Sciences. PMID:25717402
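
    A toy version of the suggestion step in tool A might look like the following sketch. The real system queries the UMLS Metathesaurus; here the candidate CDEs are a hard-coded, hypothetical dictionary and the semantic/syntactic analysis is reduced to simple token overlap:

```python
import re

def suggest_cdes(question, cdes, top_n=3):
    """Rank candidate CDEs by token overlap with a free-text question label.
    A crude stand-in for the semantic/syntactic matching against the UMLS
    Metathesaurus used by the actual tool."""
    q_tokens = set(re.findall(r"[a-z0-9]+", question.lower()))
    scored = []
    for cde_id, label in cdes.items():
        overlap = len(q_tokens & set(re.findall(r"[a-z0-9]+", label.lower())))
        if overlap:
            scored.append((overlap, cde_id))
    scored.sort(reverse=True)  # best overlap first
    return [cde_id for _, cde_id in scored[:top_n]]

# Hypothetical identifiers and labels, for illustration only.
cdes = {
    "cde-001": "patient date of birth",
    "cde-002": "tumor grade",
    "cde-003": "date of surgery",
}
print(suggest_cdes("What is the patient's date of birth?", cdes))  # → ['cde-001', 'cde-003']
```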

  4. The need for novel informatics tools for integrating and planning research in molecular and cellular cognition.

    PubMed

    Silva, Alcino J; Müller, Klaus-Robert

    2015-09-01

    The sheer volume and complexity of publications in the biological sciences are straining traditional approaches to research planning. Nowhere is this problem more serious than in molecular and cellular cognition, since in this neuroscience field, researchers routinely use approaches and information from a variety of areas in neuroscience and other biology fields. Additionally, the multilevel integration process characteristic of this field involves the establishment of experimental connections between molecular, electrophysiological, behavioral, and even cognitive data. This multidisciplinary integration process requires strategies and approaches that originate in several different fields, which greatly increases the complexity and demands of this process. Although causal assertions, where phenomenon A is thought to contribute or relate to B, are at the center of this integration process and key to research in biology, there are currently no tools to help scientists keep track of the increasingly complex network of causal connections they use when making research decisions. Here, we propose the development of semiautomated graphical and interactive tools to help neuroscientists and other biologists, including those working in molecular and cellular cognition, to track, map, and weight causal evidence in research papers. There is a great need for a concerted effort by biologists, computer scientists, and funding institutions to develop maps of causal information that would aid in integration of research findings and in experiment planning. PMID:26286658

  5. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    SciTech Connect

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.; Gastelum, Zoe N.; Kreyling, Sean J.; West, Curtis L.

    2014-05-13

    of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will not ever be able to replace traditional arms control verification measures, they do supply unique signatures that can augment existing analysis.

  6. CRISPR: a versatile tool for both forward and reverse genetics research.

    PubMed

    Gurumurthy, Channabasavaiah B; Grati, M'hamed; Ohtsuka, Masato; Schilit, Samantha L P; Quadros, Rolen M; Liu, Xue Zhong

    2016-09-01

    Human genetics research employs the two opposing approaches of forward and reverse genetics. While forward genetics identifies and links a mutation to an observed disease etiology, reverse genetics induces mutations in model organisms to study their role in disease. In most cases, causality for mutations identified by forward genetics is confirmed by reverse genetics through the development of genetically engineered animal models and an assessment of whether the model can recapitulate the disease. While many technological advances have helped improve these approaches, some gaps still remain. CRISPR/Cas (clustered regularly interspaced short palindromic repeats/CRISPR-associated), which has emerged as a revolutionary genetic engineering tool, holds great promise for closing such gaps. By combining the benefits of forward and reverse genetics, it has dramatically expedited human genetics research. We provide a perspective on the power of CRISPR-based forward and reverse genetics tools in human genetics and discuss its applications using some disease examples. PMID:27384229

  7. Exploring Assessment Tools for Research and Evaluation in Astronomy Education and Outreach

    NASA Astrophysics Data System (ADS)

    Buxner, S. R.; Wenger, M. C.; Dokter, E. F. C.

    2011-09-01

    The ability to effectively measure knowledge, attitudes, and skills in formal and informal educational settings is an important aspect of astronomy education research and evaluation. Assessments may take the form of interviews, observations, surveys, exams, or other probes to help unpack people's understandings or beliefs. In this workshop, we discussed characteristics of a variety of tools that exist to assess understandings of different concepts in astronomy as well as attitudes towards science and science teaching; these include concept inventories, surveys, interview protocols, observation protocols, card sorting, reflection videos, and other methods currently being used in astronomy education research and EPO program evaluations. In addition, we discussed common questions in the selection of assessment tools including issues of reliability and validity, time to administer, format of implementation, analysis, and human subject concerns.

  8. A semi-automatic web based tool for the selection of research projects reviewers.

    PubMed

    Pupella, Valeria; Monteverde, Maria Eugenia; Lombardo, Claudio; Belardelli, Filippo; Giacomini, Mauro

    2014-01-01

    The correct evaluation of research proposals continues to be problematic, and in many cases grants and fellowships are subjected to this type of assessment. We developed a web-based, semi-automatic tool to help in the selection of reviewers. The core of the proposed system is the matching of the MeSH descriptors of the publications submitted by the reviewers (for their accreditation) with the descriptors linked to the selected research keywords. In addition, a citation-related index is calculated and used to discard unsuitable reviewers. The tool was used to support a web site for the evaluation of candidates applying for a fellowship in the oncology field. PMID:25160328
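
    The matching logic described in the abstract (descriptor overlap plus a citation-based cutoff) can be sketched in a few lines. The h-index threshold and the Jaccard measure are illustrative assumptions; the abstract does not specify the exact citation index or similarity measure used:

```python
def reviewer_score(project_mesh, reviewer_mesh):
    """Jaccard similarity between the MeSH descriptors linked to the project
    keywords and those of a reviewer's submitted publications."""
    a, b = set(project_mesh), set(reviewer_mesh)
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_reviewers(project_mesh, reviewers, min_h_index=5):
    """Discard reviewers below a citation-related threshold (a hypothetical
    h-index cutoff here), then rank the rest by descriptor match."""
    eligible = [(reviewer_score(project_mesh, r["mesh"]), r["name"])
                for r in reviewers if r["h_index"] >= min_h_index]
    return [name for _, name in sorted(eligible, reverse=True)]

reviewers = [
    {"name": "A", "mesh": {"Neoplasms", "Immunotherapy"}, "h_index": 12},
    {"name": "B", "mesh": {"Neoplasms"}, "h_index": 30},
    {"name": "C", "mesh": {"Neoplasms", "Immunotherapy"}, "h_index": 2},
]
print(rank_reviewers({"Neoplasms", "Immunotherapy"}, reviewers))  # → ['A', 'B']
# C is discarded by the citation cutoff; A outranks B on descriptor overlap.
```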

  9. Participant observation as a research tool in a practice based profession.

    PubMed

    Kennedy, C

    1999-10-01

    This paper by Catriona Kennedy offers a personal account of the use of participant observation as a tool for exploring and uncovering the knowledge base of experienced district nurses in relation to first assessment visits. Currently, many district nurses (DNs) are educated to degree level. However, despite a long history of educational provision for DNs, the research base available to support their practice is limited. PMID:26954027

  10. Using robotics construction kits as metacognitive tools: a research in an Italian primary school.

    PubMed

    La Paglia, Filippo; Caci, Barbara; La Barbera, Daniele; Cardaci, Maurizio

    2010-01-01

    The present paper is aimed at analyzing the process of building and programming robots as a metacognitive tool. Quantitative data and qualitative observations from a study performed on a sample of children attending an Italian primary school are described in this work. Results showed that robotics activities can serve as a new metacognitive environment that allows children to monitor and control their learning actions in an autonomous and self-directed way. PMID:20543280

  11. Open Virtual Worlds as Pedagogical Research Tools: Learning from the Schome Park Programme

    NASA Astrophysics Data System (ADS)

    Twining, Peter; Peachey, Anna

    This paper introduces the term Open Virtual Worlds and argues that they are ‘unclaimed educational spaces’, which provide a valuable tool for researching pedagogy. Having explored these claims, the way in which the Teen Second Life® virtual world was used for pedagogical experimentation in the initial phases of the Schome Park Programme is described. Four sets of pedagogical dimensions that emerged are presented and illustrated with examples from the Schome Park Programme.

  12. DataUp: A tool to help researchers describe and share tabular data

    PubMed Central

    Strasser, Carly; Kunze, John; Abrams, Stephen; Cruse, Patricia

    2014-01-01

    Scientific datasets have immeasurable value, but they lose their value over time without proper documentation, long-term storage, and easy discovery and access. Across disciplines as diverse as astronomy, demography, archeology, and ecology, large numbers of small heterogeneous datasets (i.e., the long tail of data) are especially at risk unless they are properly documented, saved, and shared. One unifying factor for many of these at-risk datasets is that they reside in spreadsheets. In response to this need, the California Digital Library (CDL) partnered with Microsoft Research Connections and the Gordon and Betty Moore Foundation to create the DataUp data management tool for Microsoft Excel. Many researchers creating these small, heterogeneous datasets use Excel at some point in their data collection and analysis workflow, so we were interested in developing a data management tool that fits easily into those work flows and minimizes the learning curve for researchers. The DataUp project began in August 2011. We first formally assessed the needs of researchers by conducting surveys and interviews of our target research groups: earth, environmental, and ecological scientists. We found that, on average, researchers had very poor data management practices, were not aware of data centers or metadata standards, and did not understand the benefits of data management or sharing. Based on our survey results, we composed a list of desirable components and requirements and solicited feedback from the community to prioritize potential features of the DataUp tool. These requirements were then relayed to the software developers, and DataUp was successfully launched in October 2012. PMID:25653834

  14. Easily configured real-time CPOE Pick Off Tool supporting focused clinical research and quality improvement.

    PubMed

    Rosenbaum, Benjamin P; Silkin, Nikolay; Miller, Randolph A

    2014-01-01

    Real-time alerting systems typically warn providers about abnormal laboratory results or medication interactions. For more complex tasks, institutions create site-wide 'data warehouses' to support quality audits and longitudinal research. Sophisticated systems like i2b2 or Stanford's STRIDE utilize data warehouses to identify cohorts for research and quality monitoring. However, substantial resources are required to install and maintain such systems. For more modest goals, an organization desiring merely to identify patients with 'isolation' orders, or to determine patients' eligibility for clinical trials, may adopt a simpler, limited approach based on processing the output of one clinical system, and not a data warehouse. We describe a limited, order-entry-based, real-time 'pick off' tool, utilizing public domain software (PHP, MySQL). Through a web interface the tool assists users in constructing complex order-related queries and auto-generates corresponding database queries that can be executed at recurring intervals. We describe successful application of the tool for research and quality monitoring. PMID:24287172
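
    A minimal sketch of the auto-generation step follows. The column whitelist, table name, and operator set are invented for illustration (the actual tool is written in PHP against MySQL); the point is that user-facing filter triples from a web form become a parameterized WHERE clause rather than interpolated strings:

```python
def build_order_query(filters):
    """Assemble a parameterized SQL query over a hypothetical 'orders' table
    from (field, operator, value) triples, as a web form might produce.
    Whitelisting columns and using placeholders avoids SQL injection."""
    allowed = {"order_type", "status", "ordered_at"}      # illustrative columns
    ops = {"eq": "=", "ne": "<>", "ge": ">=", "le": "<="}
    clauses, params = [], []
    for field, op, value in filters:
        if field not in allowed or op not in ops:
            raise ValueError(f"unsupported filter: {field} {op}")
        clauses.append(f"{field} {ops[op]} %s")
        params.append(value)
    where = " AND ".join(clauses) or "1=1"
    return f"SELECT patient_id, order_id FROM orders WHERE {where}", params

sql, params = build_order_query([("order_type", "eq", "ISOLATION"),
                                 ("status", "ne", "CANCELLED")])
print(sql)     # SELECT patient_id, order_id FROM orders WHERE order_type = %s AND status <> %s
print(params)  # ['ISOLATION', 'CANCELLED']
```

    A query like this can then be executed at recurring intervals, as the abstract describes, to pick off matching patients in near real time.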

  16. CCMC: Serving research and space weather communities with unique space weather services, innovative tools and resources

    NASA Astrophysics Data System (ADS)

    Zheng, Yihua; Kuznetsova, Maria M.; Pulkkinen, Antti; Maddox, Marlo

    2015-04-01

    With the addition of the Space Weather Research Center (a sub-team within CCMC) in 2010 to address NASA’s own space weather needs, CCMC has become a unique entity that not only facilitates research by providing access to state-of-the-art space science and space weather models, but also plays a critical role in providing unique space weather services to NASA robotic missions, developing innovative tools, and transitioning research to operations via user feedback. With scientists, forecasters and software developers working together within one team, and through close, direct connections with space weather customers and trusted relationships with model developers, CCMC is flexible, nimble and effective in meeting customer needs. In this presentation, we highlight a few unique aspects of CCMC/SWRC’s space weather services, such as addressing space weather throughout the solar system, pushing the frontier of space weather forecasting via the ensemble approach, providing direct personnel and tool support for spacecraft anomaly resolution, prompting the development of multi-purpose tools and knowledge bases, and educating and engaging the next generation of space weather scientists.

  17. The virtual supermarket: An innovative research tool to study consumer food purchasing behaviour

    PubMed Central

    2011-01-01

    Background Economic interventions in the food environment are expected to effectively promote healthier food choices. However, before introducing them on a large scale, it is important to gain insight into the effectiveness of economic interventions and people's genuine reactions to price changes. Nonetheless, because of complex implementation issues, studies on price interventions are virtually non-existent. This is especially true for experiments undertaken in a retail setting. We have developed a research tool to study the effects of retail price interventions in a virtual-reality setting: the Virtual Supermarket. This paper aims to inform researchers about the features and utilization of this new software application. Results The Virtual Supermarket is a Dutch-developed three-dimensional software application in which study participants can shop in a manner comparable to a real supermarket. The tool can be used to study several food pricing and labelling strategies. The application base can be used to build future extensions and could be translated into, for example, an English-language version. The Virtual Supermarket contains a front-end which is seen by the participants, and a back-end that enables researchers to easily manipulate research conditions. The application keeps track of time spent shopping, number of products purchased, shopping budget, total expenditures and answers on configurable questionnaires. All data is digitally stored and automatically sent to a web server. A pilot study among Dutch consumers (n = 66) revealed that the application accurately collected and stored all data. Results from participant feedback revealed that 83% of the respondents considered the Virtual Supermarket easy to understand and 79% found that their virtual grocery purchases resembled their regular groceries. Conclusions The Virtual Supermarket is an innovative research tool with a great potential to assist in gaining insight into food purchasing behaviour. The

  18. Low Cost Electroencephalographic Acquisition Amplifier to serve as Teaching and Research Tool

    PubMed Central

    Jain, Ankit; Kim, Insoo; Gluckman, Bruce J.

    2012-01-01

    We describe the development and testing of a low cost, easily constructed electroencephalographic acquisition amplifier for noninvasive Brain Computer Interface (BCI) education and research. The acquisition amplifier is constructed from newly available off-the-shelf integrated circuit components, and readily sends a 24-bit data stream via USB bus to a computer platform. We demonstrate here the hardware’s use in the analysis of a visually evoked P300 paradigm for a choose one-of-eight task. This clearly shows the applicability of this system as a low cost teaching and research tool. PMID:22254699
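    The core of a visually evoked P300 analysis like the one described is stimulus-locked epoch averaging. A minimal sketch in Python, using synthetic data (the sampling rate, epoch window, and signal model are illustrative assumptions, not details from the paper):

```python
import numpy as np

def average_epochs(eeg, events, fs, pre=0.1, post=0.6):
    """Average stimulus-locked epochs to expose an event-related
    potential such as the P300. `eeg` is a 1-D signal, `events` are
    stimulus sample indices, `fs` is the sampling rate in Hz."""
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = []
    for ev in events:
        if ev - n_pre < 0 or ev + n_post > len(eeg):
            continue  # skip epochs that run off the recording
        seg = eeg[ev - n_pre:ev + n_post]
        epochs.append(seg - seg[:n_pre].mean())  # baseline-correct
    return np.mean(epochs, axis=0)

# Synthetic demo: a positive deflection ~300 ms after each stimulus,
# buried in noise, is recovered by averaging across trials.
fs = 250
rng = np.random.default_rng(0)
eeg = rng.normal(0, 5.0, fs * 60)
events = np.arange(fs, fs * 58, fs)          # one stimulus per second
t = np.arange(int(0.7 * fs)) / fs
for ev in events:
    eeg[ev:ev + len(t)] += 10 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
erp = average_epochs(eeg, events, fs)
peak_ms = 1000 * (np.argmax(erp) / fs - 0.1)  # epoch starts 100 ms pre-stimulus
```

    Averaging suppresses the noise by roughly the square root of the number of trials, which is why a low-cost amplifier with modest noise performance can still resolve an evoked potential.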

  19. Balancing research and funding using value of information and portfolio tools for nanomaterial risk classification

    NASA Astrophysics Data System (ADS)

    Bates, Matthew E.; Keisler, Jeffrey M.; Zussblatt, Niels P.; Plourde, Kenton J.; Wender, Ben A.; Linkov, Igor

    2016-02-01

    Risk research for nanomaterials is currently prioritized by means of expert workshops and other deliberative processes. However, analytical techniques that quantify and compare alternative research investments are increasingly recommended. Here, we apply value of information and portfolio decision analysis—methods commonly applied in financial and operations management—to prioritize risk research for multiwalled carbon nanotubes and nanoparticulate silver and titanium dioxide. We modify the widely accepted CB Nanotool hazard evaluation framework, which combines nano- and bulk-material properties into a hazard score, to operate probabilistically with uncertain inputs. Literature is reviewed to develop uncertain estimates for each input parameter, and a Monte Carlo simulation is applied to assess how different research strategies can improve hazard classification. The relative cost of each research experiment is elicited from experts, which enables identification of efficient research portfolios—combinations of experiments that lead to the greatest improvement in hazard classification at the lowest cost. Nanoparticle shape, diameter, solubility and surface reactivity were most frequently identified within efficient portfolios in our results.
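    The probabilistic step described above can be illustrated with a toy Monte Carlo: sample each uncertain factor score, sum the samples into a hazard score, and tally hazard-band frequencies. The factor names, score distributions, and band cut-offs below are hypothetical stand-ins, not the CB Nanotool's actual weights:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical uncertain inputs for one nanomaterial, each scored 0-2
# (stand-ins for factors such as shape, diameter, solubility and
# surface reactivity). Uncertainty is a distribution over scores.
priors = {
    "shape":      [0.2, 0.5, 0.3],    # P(score = 0, 1, 2)
    "diameter":   [0.1, 0.4, 0.5],
    "solubility": [0.3, 0.4, 0.3],
    "reactivity": [0.25, 0.25, 0.5],
}

def classify(total):
    # Hypothetical band cut-offs on the summed hazard score.
    return "high" if total >= 6 else "medium" if total >= 4 else "low"

def band_probabilities(priors, n=20_000):
    """Monte Carlo: sample each factor, sum to a hazard score,
    tally the resulting hazard-band frequencies."""
    counts = {"low": 0, "medium": 0, "high": 0}
    for _ in range(n):
        total = sum(rng.choice(len(p), p=p) for p in priors.values())
        counts[classify(total)] += 1
    return {k: v / n for k, v in counts.items()}

baseline = band_probabilities(priors)
# A research experiment that fully resolves one factor collapses its
# distribution; rerunning the simulation shows how much the
# classification sharpens (the value-of-information idea).
resolved = dict(priors, reactivity=[0.0, 0.0, 1.0])
after = band_probabilities(resolved)
```

    Comparing `baseline` and `after` for each candidate experiment, weighted by elicited cost, is the ingredient needed to rank research portfolios.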

  20. Balancing research and funding using value of information and portfolio tools for nanomaterial risk classification.

    PubMed

    Bates, Matthew E; Keisler, Jeffrey M; Zussblatt, Niels P; Plourde, Kenton J; Wender, Ben A; Linkov, Igor

    2016-02-01

    Risk research for nanomaterials is currently prioritized by means of expert workshops and other deliberative processes. However, analytical techniques that quantify and compare alternative research investments are increasingly recommended. Here, we apply value of information and portfolio decision analysis (methods commonly applied in financial and operations management) to prioritize risk research for multiwalled carbon nanotubes and nanoparticulate silver and titanium dioxide. We modify the widely accepted CB Nanotool hazard evaluation framework, which combines nano- and bulk-material properties into a hazard score, to operate probabilistically with uncertain inputs. Literature is reviewed to develop uncertain estimates for each input parameter, and a Monte Carlo simulation is applied to assess how different research strategies can improve hazard classification. The relative cost of each research experiment is elicited from experts, which enables identification of efficient research portfolios, combinations of experiments that lead to the greatest improvement in hazard classification at the lowest cost. Nanoparticle shape, diameter, solubility and surface reactivity were most frequently identified within efficient portfolios in our results. PMID:26551015

  1. Research on analytical model and design formulas of permanent magnetic bearings based on Halbach array with arbitrary segmented magnetized angle

    NASA Astrophysics Data System (ADS)

    Wang, Nianxian; Wang, Dongxiong; Chen, Kuisheng; Wu, Huachun

    2016-07-01

    The bearing capacity of permanent magnetic bearings (PMBs) can be improved efficiently by using Halbach array magnetization. However, an analytical model of Halbach array PMBs with arbitrary segmented magnetized angle has not yet been developed, and the absence of such a model and of design formulas has limited the application of these bearings. In this research, Halbach array PMBs with arbitrary segmented magnetized angle are studied. The magnetization model of the bearings is established, and the magnetic field distribution of the permanent magnet array is derived using the scalar magnetic potential. On this basis, the bearing force and stiffness models of the PMBs are established using the virtual displacement method. The influence of the number of magnetic ring pairs per cycle and of the structural parameters of the PMBs on the maximal bearing capacity and support stiffness is studied, and reference factors for the PMB design process are given. Finally, the theoretical model and conclusions are verified by finite element analysis.
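    The virtual displacement idea behind the stiffness model can be illustrated numerically: stiffness is the negative rate of change of the bearing force under a small virtual displacement. The force law and constants below are invented for illustration; the paper derives the actual force from the scalar magnetic potential of the Halbach array:

```python
import numpy as np

# Illustrative only: the restoring force of a magnetic bearing is
# assumed here to follow a simple decaying model
# F(x) = -F0 * x * exp(-|x| / x0); the real model comes from the
# scalar-potential field of the Halbach array.
F0, x0 = 2.0e5, 1.0e-3   # hypothetical constants (N/m, m)

def force(x):
    return -F0 * x * np.exp(-np.abs(x) / x0)

def stiffness(x, h=1e-7):
    # Virtual-displacement idea: stiffness is the negative rate of
    # change of the bearing force with a small displacement h.
    return -(force(x + h) - force(x - h)) / (2 * h)

# Support stiffness at the working point and maximal bearing capacity
# (largest restoring force over the displacement range).
k0 = stiffness(0.0)
x = np.linspace(0, 5 * x0, 1000)
capacity = np.max(np.abs(force(x)))
```

    For this model the stiffness at the centre approaches F0 and the capacity peaks at x = x0; sweeping the model's parameters is the numerical analogue of the parameter studies reported in the abstract.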

  2. [Research and application progress of near infrared spectroscopy analytical technology in China in the past five years].

    PubMed

    Chu, Xiao-Li; Lu, Wan-Zhen

    2014-10-01

    In the past decade, near infrared spectroscopy (NIR) has expanded rapidly and been applied widely in many fields in China. The recent progress of research on and application of NIR analytical technology in China, especially in the past five years, is reviewed. It covers hardware and software R&D, chemometric algorithms and experimental methods, and quantitative and qualitative applications in typical fields such as food, agriculture, pharmaceuticals, petrochemicals, forestry, and medical diagnosis. A total of 209 references are cited, mainly published in national journals, professional magazines, and book chapters. The developing trends of near infrared spectroscopy and the strategies to further promote its innovation and development in China in the near future are put forward and discussed. PMID:25739193

  3. Tools for Citizen-Science Recruitment and Student Engagement in Your Research and in Your Classroom

    PubMed Central

    Council, Sarah E.; Horvath, Julie E.

    2016-01-01

    The field of citizen science is exploding and offers not only a great way to engage the general public in science literacy through primary research, but also an avenue for teaching professionals to engage their students in meaningful community research experiences. Though this field is expanding, there are many hurdles for researchers and participants, as well as challenges for teaching professionals who want to engage their students. Here we highlight one of our projects that engaged many citizens in Raleigh, NC, and across the world, and we use this as a case study to highlight ways to engage citizens in all kinds of research. Through the use of numerous tools to engage the public, we gathered citizen scientists to study skin microbes and their associated odors, and we offer valuable ideas for teachers to tap into resources for their own students and potential citizen-science projects. PMID:27047587

  4. Tools for Citizen-Science Recruitment and Student Engagement in Your Research and in Your Classroom.

    PubMed

    Council, Sarah E; Horvath, Julie E

    2016-03-01

    The field of citizen science is exploding and offers not only a great way to engage the general public in science literacy through primary research, but also an avenue for teaching professionals to engage their students in meaningful community research experiences. Though this field is expanding, there are many hurdles for researchers and participants, as well as challenges for teaching professionals who want to engage their students. Here we highlight one of our projects that engaged many citizens in Raleigh, NC, and across the world, and we use this as a case study to highlight ways to engage citizens in all kinds of research. Through the use of numerous tools to engage the public, we gathered citizen scientists to study skin microbes and their associated odors, and we offer valuable ideas for teachers to tap into resources for their own students and potential citizen-science projects. PMID:27047587

  5. NASA System-Level Design, Analysis and Simulation Tools Research on NextGen

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge

    2011-01-01

    A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) thrust of NASA's Airspace Systems Program (ASP) is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP to enable revolutionary improvements and modernization of the National Airspace System. The review covers accomplishments in baseline research and advances in design studies and system-level assessment, including cluster analysis as an annualization standard for air traffic in the U.S. National Airspace and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.

  6. Surveying the Effect of Media Effects: A Meta-Analytic Summary of Media Effects Research in "Human Communication Research."

    ERIC Educational Resources Information Center

    Emmers-Sommer, Tara M.; Allen, Mike

    1999-01-01

    Analyzes the media-effects research published in this journal during the last 25 years via meta-analysis. Finds that, as children age, they better understand media messages; mass media are a significant source of learning; and media can influence attitudes. Discusses political, social, and educational implications, as well as implications for…

  7. Metaphors and Drawings as Research Tools of Head Teachers' Perceptions on Their Management and Leadership Roles and Responsibilities

    ERIC Educational Resources Information Center

    Argyropoulou, Eleftheria; Hatira, Kalliopi

    2014-01-01

    This article introduces an alternative qualitative research tool: metaphor and drawing, as projections of personality features, to explore underlying concepts and values, thoughts and beliefs, fears and hesitations, aspirations and ambitions of the research subjects. These two projective tools are used to explore Greek state kindergarten head…

  8. Deafness as a Natural Experiment: A Meta-Analytic Review of IQ Research.

    ERIC Educational Resources Information Center

    Braden, Jeffery P.

    The literature describing deaf persons' intelligence was subjected to a quantitative and qualitative review in this analysis of 85 studies containing independent samples of 43,177 deaf subjects. First, bibliometric analyses were conducted to define the scope, dissemination, and trajectory of the research investigating deafness and intelligence…

  9. Matching Procedures in Autism Research: Evidence from Meta-Analytic Studies

    ERIC Educational Resources Information Center

    Shaked, Michal; Yirmiya, Nurit

    2004-01-01

    In this paper, we summarize some of our findings from a series of three meta-analyses and discuss their implications for autism research. In the first meta-analysis, we examined studies addressing the theory of mind hypothesis in autism. This analysis revealed that theory of mind disabilities are not unique to autism, although what may be unique…

  10. Academic Benefits of Peer Tutoring: A Meta-Analytic Review of Single-Case Research

    ERIC Educational Resources Information Center

    Bowman-Perrott, Lisa; Davis, Heather; Vannest, Kimberly; Williams, Lauren; Greenwood, Charles; Parker, Richard

    2013-01-01

    Peer tutoring is an instructional strategy that involves students helping each other learn content through repetition of key concepts. This meta-analysis examined effects of peer tutoring across 26 single-case research experiments for 938 students in Grades 1-12. The TauU effect size for 195 phase contrasts was 0.75 with a confidence interval of…
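    The Tau-U family of effect sizes used in this meta-analysis is built on cross-phase pairwise comparisons. A minimal sketch of the nonoverlap component (omitting Tau-U's baseline-trend correction), with hypothetical single-case data:

```python
def tau_nonoverlap(baseline, treatment):
    """Nonoverlap component of Tau-U for one A-B phase contrast:
    (pairs improving - pairs deteriorating) / all cross-phase pairs.
    Baseline-trend correction, part of full Tau-U, is omitted here."""
    pos = sum(t > b for b in baseline for t in treatment)
    neg = sum(t < b for b in baseline for t in treatment)
    return (pos - neg) / (len(baseline) * len(treatment))

# Hypothetical single-case data: correct academic responses per session.
A = [3, 4, 3, 5]          # baseline phase
B = [6, 7, 5, 8, 7]       # peer-tutoring phase
tau = tau_nonoverlap(A, B)
```

    A value near 1.0 means nearly every treatment observation exceeds every baseline observation; the meta-analysis aggregates such values across phase contrasts.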

  11. Some Comments on Analytic Traditions in EFA As against CFA: An Analysis of Selected Research Reports.

    ERIC Educational Resources Information Center

    Kieffer, Kevin M.

    Factor analysis has historically been used for myriad purposes in the social and behavioral sciences, but an especially important application of this technique has been to evaluate construct validity. Since in the present milieu both exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) are readily available to the researcher,…

  12. A Meta-Analytic Review of Research on Gender Differences in Sexuality, 1993-2007

    ERIC Educational Resources Information Center

    Petersen, Jennifer L.; Hyde, Janet Shibley

    2010-01-01

    In 1993 Oliver and Hyde conducted a meta-analysis on gender differences in sexuality. The current study updated that analysis with current research and methods. Evolutionary psychology, cognitive social learning theory, social structural theory, and the gender similarities hypothesis provided predictions about gender differences in sexuality. We…

  13. Three Analytical Approaches for Predicting Enrollment at a Growing Metropolitan Research University. AIR 2002 Forum Paper.

    ERIC Educational Resources Information Center

    Armacost, Robert L.; Wilson, Alicia L.

    In a large metropolitan research university, multiple enrollment models are required to fulfill the needs of the many constituents and planning horizons. In addition, the method of predicting enrollment in a growth environment differs from that of universities in a stable environment. Three models for predicting enrollment are discussed, with a…

  14. Paying the Price for "Sugar and Spice": Shifting the Analytical Lens in Equity Research.

    ERIC Educational Resources Information Center

    Boaler, Jo

    2002-01-01

    Considers some of the scholarship on gender and mathematics, critically examining the findings that were produced and the influence they had. Proposes a fundamental tension in research on equity between lack of concern on the one hand and essentialism on the other. (Author/MM)

  15. Analytical tools in accelerator physics

    SciTech Connect

    Litvinenko, V.N.

    2010-09-01

    This paper is a subset of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk; only a few copies (in Russian) were distributed to my colleagues at the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description, starting from an arbitrary reference orbit with explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with parameterization in action and angle variables. To a large degree they follow the logic developed in Theory of Cyclic Particle Accelerators by A.A. Kolomensky and A.N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature of these notes is the use of matrix functions and the Sylvester formula for calculating the matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory material following Classical Theory of Fields by L.D. Landau and E.M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the Appendices.

  16. Approaches, tools and methods used for setting priorities in health research in the 21st century

    PubMed Central

    Yoshida, Sachiyo

    2016-01-01

    Background Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. Methods To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001–2014. Results A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (<1%). About 3% of studies reported no clear process and provided very little information on how priorities were set. A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. Conclusion The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure, such as the CHNRI method, James Lind Alliance method and Combined Approach Matrix, it is

  17. Helping to drive the robustness of preclinical research – the assay capability tool

    PubMed Central

    Gore, Katrina; Stanley, Phil

    2015-01-01

    Numerous articles in Nature, Science, Pharmacology Research and Perspectives, and other biomedical research journals over the past decade have highlighted that research is plagued by findings that are not reliable and cannot be reproduced. Poor experiments can occur, in part, as a consequence of inadequate statistical thinking in the experimental design, conduct and analysis. As it is not feasible for statisticians to be involved in every preclinical experiment many of the same journals have published guidelines on good statistical practice. Here, we outline a tool that addresses the root causes of irreproducibility in preclinical research in the pharmaceutical industry. The Assay Capability Tool uses 13 questions to guide scientists and statisticians during the development of in vitro and in vivo assays. It promotes the absolutely essential experimental design and analysis strategies and documents the strengths, weaknesses, and precision of an assay. However, what differentiates it from other proposed solutions is the emphasis on how the resulting data will be used. An assay can be assigned a low, medium, or high rating to indicate the level of confidence that can be afforded when making important decisions using data from that assay. This provides transparency on the appropriate interpretation of the assay's results in the light of its current capability. We suggest that following a well-defined process during assay development and use such as that laid out within the Assay Capability Tool means that whatever the results, positive or negative, a researcher can have confidence to make decisions upon and publish their findings. PMID:26236488

  18. Helping to drive the robustness of preclinical research - the assay capability tool.

    PubMed

    Gore, Katrina; Stanley, Phil

    2015-08-01

    Numerous articles in Nature, Science, Pharmacology Research and Perspectives, and other biomedical research journals over the past decade have highlighted that research is plagued by findings that are not reliable and cannot be reproduced. Poor experiments can occur, in part, as a consequence of inadequate statistical thinking in the experimental design, conduct and analysis. As it is not feasible for statisticians to be involved in every preclinical experiment many of the same journals have published guidelines on good statistical practice. Here, we outline a tool that addresses the root causes of irreproducibility in preclinical research in the pharmaceutical industry. The Assay Capability Tool uses 13 questions to guide scientists and statisticians during the development of in vitro and in vivo assays. It promotes the absolutely essential experimental design and analysis strategies and documents the strengths, weaknesses, and precision of an assay. However, what differentiates it from other proposed solutions is the emphasis on how the resulting data will be used. An assay can be assigned a low, medium, or high rating to indicate the level of confidence that can be afforded when making important decisions using data from that assay. This provides transparency on the appropriate interpretation of the assay's results in the light of its current capability. We suggest that following a well-defined process during assay development and use such as that laid out within the Assay Capability Tool means that whatever the results, positive or negative, a researcher can have confidence to make decisions upon and publish their findings. PMID:26236488

  19. Tool for evaluating research implementation challenges: A sense-making protocol for addressing implementation challenges in complex research settings

    PubMed Central

    2013-01-01

    Background Many challenges arise in complex organizational interventions that threaten research integrity. This article describes a Tool for Evaluating Research Implementation Challenges (TECH), developed using a complexity science framework to assist research teams in assessing and managing these challenges. Methods During the implementation of a multi-site, randomized controlled trial (RCT) of organizational interventions to reduce resident falls in eight nursing homes, we inductively developed, and later codified the TECH. The TECH was developed through processes that emerged from interactions among research team members and nursing home staff participants, including a purposive use of complexity science principles. Results The TECH provided a structure to assess challenges systematically, consider their potential impact on intervention feasibility and fidelity, and determine actions to take. We codified the process into an algorithm that can be adopted or adapted for other research projects. We present selected examples of the use of the TECH that are relevant to many complex interventions. Conclusions Complexity theory provides a useful lens through which research procedures can be developed to address implementation challenges that emerge from complex organizations and research designs. Sense-making is a group process in which diverse members interpret challenges when available information is ambiguous; the groups’ interpretations provide cues for taking action. Sense-making facilitates the creation of safe environments for generating innovative solutions that balance research integrity and practical issues. The challenges encountered during implementation of complex interventions are often unpredictable; however, adoption of a systematic process will allow investigators to address them in a consistent yet flexible manner, protecting fidelity. Research integrity is also protected by allowing for appropriate adaptations to intervention protocols that

  20. Extending the XNAT archive tool for image and analysis management in ophthalmology research

    NASA Astrophysics Data System (ADS)

    Wahle, Andreas; Lee, Kyungmoo; Harding, Adam T.; Garvin, Mona K.; Niemeijer, Meindert; Sonka, Milan; Abràmoff, Michael D.

    2013-03-01

    In ophthalmology, various modalities and tests are utilized to obtain vital information on the eye's structure and function. For example, optical coherence tomography (OCT) is utilized to diagnose, screen, and aid treatment of eye diseases like macular degeneration or glaucoma. Such data are complemented by photographic retinal fundus images and functional tests on the visual field. DICOM isn't widely used yet, though, and frequently images are encoded in proprietary formats. The eXtensible Neuroimaging Archive Tool (XNAT) is an open-source NIH-funded framework for research PACS and is in use at the University of Iowa for neurological research applications. Its use for ophthalmology was hence desirable but posed new challenges due to data types thus far not considered and the lack of standardized formats. We developed custom tools for data types not natively recognized by XNAT itself using XNAT's low-level REST API. Vendor-provided tools can be included as necessary to convert proprietary data sets into valid DICOM. Clients can access the data in a standardized format while still retaining the original format if needed by specific analysis tools. With respective project-specific permissions, results like segmentations or quantitative evaluations can be stored as additional resources to previously uploaded datasets. Applications can use our abstract-level Python or C/C++ API to communicate with the XNAT instance. This paper describes concepts and details of the designed upload script templates, which can be customized to the needs of specific projects, and the novel client-side communication API which allows integration into new or existing research applications.
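    A hedged sketch of the client-side pattern described: build the nested resource URL and upload a derived file through the REST API. The server address and all identifiers below are hypothetical, and the exact path layout should be checked against the XNAT REST documentation for your server:

```python
# Hypothetical server; the path layout follows XNAT's REST convention
# of nesting projects/subjects/experiments, but verify it against your
# XNAT instance's API documentation before relying on it.
XNAT_BASE = "https://xnat.example.org"

def resource_file_url(project, subject, experiment, resource, filename):
    """Build the REST URL under which a derived file (e.g. an OCT
    segmentation result) would be stored as an additional resource
    attached to a previously uploaded dataset."""
    return (f"{XNAT_BASE}/data/projects/{project}"
            f"/subjects/{subject}/experiments/{experiment}"
            f"/resources/{resource}/files/{filename}")

def upload(session, url, path):
    """Low-level upload via the REST API (expects a `requests`-style
    session object that carries the authentication cookie)."""
    with open(path, "rb") as fh:
        return session.put(url, data=fh, params={"inbody": "true"})

url = resource_file_url("RETINA01", "SUBJ007", "OCT_2013_03_01",
                        "SEGMENTATION", "layers.nii.gz")
```

    Storing results this way keeps segmentations and quantitative evaluations co-located with the original images, which is the project-permission model the paper describes.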

  1. The development of a two-component force dynamometer and tool control system for dynamic machine tool research

    NASA Technical Reports Server (NTRS)

    Sutherland, I. A.

    1973-01-01

    The development is presented of a tooling system that produces a controlled sinusoidal oscillation, simulating a dynamic chip-removal condition. It also measures the machining forces in two mutually perpendicular directions without any cross sensitivity.

  2. VoiceThread as a Peer Review and Dissemination Tool for Undergraduate Research

    NASA Astrophysics Data System (ADS)

    Guertin, L. A.

    2012-12-01

    VoiceThread has been utilized in an undergraduate research methods course for peer review and final research project dissemination. VoiceThread (http://www.voicethread.com) can be considered a social media tool, as it is a web-based technology with the capacity to enable interactive dialogue. VoiceThread is an application that allows a user to place a media collection online containing images, audio, videos, documents, and/or presentations in an interface that facilitates asynchronous communication. Participants in a VoiceThread can be passive viewers of the online content or engaged commenters via text, audio, video, with slide annotations via a doodle tool. The VoiceThread, which runs across browsers and operating systems, can be public or private for viewing and commenting and can be embedded into any website. Although few university students are aware of the VoiceThread platform (only 10% of the students surveyed by Ng (2012)), the 2009 K-12 edition of The Horizon Report (Johnson et al., 2009) lists VoiceThread as a tool to watch because of the opportunities it provides as a collaborative learning environment. In Fall 2011, eleven students enrolled in an undergraduate research methods course at Penn State Brandywine each conducted their own small-scale research project. Upon conclusion of the projects, students were required to create a poster summarizing their work for peer review. To facilitate the peer review process outside of class, each student-created PowerPoint file was placed in a VoiceThread with private access to only the class members and instructor. Each student was assigned to peer review five different student posters (i.e., VoiceThread images) with the audio and doodle tools to comment on formatting, clarity of content, etc. After the peer reviews were complete, the students were allowed to edit their PowerPoint poster files for a new VoiceThread. In the new VoiceThread, students were required to video record themselves describing their research

  3. 15N techniques and analytical procedures. Indo/U. S. science and technology initiative. Research report

    SciTech Connect

    Porter, L.K.; Mosier, A.R.

    1992-05-01

    (15)N technology is used to explore many agricultural research topics, including the movement of nitrates to groundwater, use of fertilizer nitrogen by plants, ways to increase nitrogen fixation, and effects of management practices on denitrification. The publication reviews (15)N procedures and methods for handling and collecting samples, introducing isotopes into plants and soils, and for performing Kjeldahl analyses, isotope dilutions, Rittenberg oxidation conversions for isotope-ratio analyses, and automated Dumas isotope-ratio analyses.
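    The isotope-dilution calculation mentioned here follows a standard formula: the fraction of plant nitrogen derived from labelled fertilizer is the plant's 15N atom-percent excess over natural abundance divided by the fertilizer's excess. The measurements below are hypothetical:

```python
def percent_n_from_fertilizer(plant_atom_pct, fert_atom_pct,
                              background=0.3663):
    """Classic isotope-dilution estimate: percent of plant N derived
    from labelled fertilizer (Ndff) is the plant's 15N atom-percent
    excess over natural abundance (~0.3663 atom %) divided by the
    fertilizer's excess, times 100."""
    plant_excess = plant_atom_pct - background
    fert_excess = fert_atom_pct - background
    return 100.0 * plant_excess / fert_excess

# Hypothetical measurements: plant tissue at 1.20 atom % 15N,
# fertilizer labelled at 5.00 atom % 15N.
ndff = percent_n_from_fertilizer(1.20, 5.00)
```

    Multiplying Ndff by the plant's total N uptake gives fertilizer-derived N, the quantity most (15)N field studies ultimately report.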

  4. Science in the Eyes of Preschool Children: Findings from an Innovative Research Tool

    NASA Astrophysics Data System (ADS)

    Dubosarsky, Mia D.

    How do young children view science? Do these views reflect cultural stereotypes? When do these views develop? These fundamental questions in the field of science education have rarely been studied with the population of preschool children. One main reason is the lack of an appropriate research instrument that addresses preschool children's developmental competencies. An extensive body of research has pointed to the significance of early childhood experiences in developing positive attitudes and interests toward learning in general and the learning of science in particular. Theoretical and empirical research suggests that stereotypical views of science may be replaced by authentic views following inquiry science experience. However, no preschool science intervention program could be designed without a reliable instrument that provides baseline information about preschool children's current views of science. The current study presents preschool children's views of science as gathered from a pioneering research tool. This tool, in the form of a computer "game," does not require reading, writing, or expressive language skills and is operated by the children. The program engages children in several simple tasks involving picture recognition and yes/no answers in order to reveal their views about science. The study was conducted with 120 preschool children in two phases and found that by the age of 4 years, participants possess an emergent concept of science. Gender and school differences were detected. Findings from this interdisciplinary study will contribute to the fields of early childhood, science education, learning technologies, program evaluation, and early childhood curriculum development.

  5. Evaluation of manometric temperature measurement (MTM), a process analytical technology tool in freeze drying, part III: heat and mass transfer measurement.

    PubMed

    Tang, Xiaolin Charlie; Nail, Steven L; Pikal, Michael J

    2006-01-01

    This article evaluates the procedures for determining the vial heat transfer coefficient and the extent of primary drying through manometric temperature measurement (MTM). The vial heat transfer coefficients (Kv) were calculated from the MTM-determined temperature and resistance and compared with Kv values determined by a gravimetric method. The differences between the MTM vial heat transfer coefficients and the gravimetric values are large at low shelf temperature but smaller when higher shelf temperatures were used. The differences also became smaller at higher chamber pressure and when higher-resistance materials were being freeze-dried. In all cases, using thermal shields greatly improved the accuracy of the MTM Kv measurement. With the use of thermal shields, the thickness of the frozen layer calculated from MTM is in good agreement with values obtained gravimetrically. The heat transfer coefficient "error" is largely a direct result of the error in the dry layer resistance (i.e., the MTM-determined resistance is too low). This problem can be minimized if thermal shields are used for freeze-drying. With suitable use of thermal shields, accurate Kv values are obtained by MTM, thus allowing accurate calculations of heat and mass flow rates. The extent of primary drying can be monitored by real-time calculation of the amount of remaining ice using MTM data, thus providing a process analytical tool that greatly improves freeze-drying process design and control. PMID:17285746
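    The gravimetric Kv determination against which MTM is compared can be sketched as heat flow, inferred from the sublimed ice mass, divided by vial area and the shelf-to-product temperature difference. All numbers below are hypothetical, and 660 cal/g is only an approximate heat of sublimation of ice:

```python
def vial_heat_transfer_coefficient(mass_sublimed_g, time_s, area_cm2,
                                   t_shelf_c, t_product_c,
                                   dHs_cal_per_g=660.0):
    """Gravimetric-style estimate of Kv (cal s^-1 cm^-2 K^-1):
    heat flow inferred from the sublimed ice mass, divided by the
    vial's outer area and the shelf-to-product temperature
    difference. 660 cal/g approximates the heat of sublimation
    of ice."""
    heat_flow = mass_sublimed_g * dHs_cal_per_g / time_s   # cal/s
    return heat_flow / (area_cm2 * (t_shelf_c - t_product_c))

# Hypothetical run: 1.5 g of ice sublimed over 4 h from a vial with
# 3.8 cm^2 outer area, shelf at -10 degC, product at -32 degC.
Kv = vial_heat_transfer_coefficient(1.5, 4 * 3600, 3.8, -10.0, -32.0)
```

    MTM estimates the same quantity non-gravimetrically from the pressure-rise data, which is why errors in the MTM-determined dry-layer resistance propagate directly into Kv.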

  6. Effect of Percent Relative Humidity, Moisture Content, and Compression Force on Light-Induced Fluorescence (LIF) Response as a Process Analytical Tool.

    PubMed

    Shah, Ishan G; Stagner, William C

    2016-08-01

    The effect of percent relative humidity (16-84% RH), moisture content (4.2-6.5% w/w MC), and compression force (4.9-44.1 kN CF) on the light-induced fluorescence (LIF) response of 10% w/w active pharmaceutical ingredient (API) compacts is reported. The fluorescent response was evaluated using two separate central composite designs of experiments. The effect of % RH and CF on the LIF signal was highly significant, with an adjusted R² = 0.9436 and p < 0.0001. Percent relative humidity (p = 0.0022), CF (p < 0.0001), and % RH² (p = 0.0237) were statistically significant factors affecting the LIF response. The effects of MC and CF on LIF response were also statistically significant, with a p value < 0.0001 and an adjusted R² value of 0.9874. The LIF response was highly impacted by MC (p < 0.0001), CF (p < 0.0001), and MC² (p = 0.0022). At 10% w/w API, increased % RH, MC, and CF led to a nonlinear decrease in LIF response. The derived quadratic model equations explained more than 94% of the data. Awareness of these effects on LIF response is critical when implementing LIF as a process analytical tool. PMID:27435199
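    The model form reported here (main effects plus a squared term, fitted from a designed experiment) can be illustrated with an ordinary least-squares fit. The data below are synthetic values generated inside the stated factor ranges; the coefficients, noise level, and point count are assumptions for illustration, not the paper's measurements.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 30
    MC = rng.uniform(4.2, 6.5, n)    # moisture content, % w/w
    CF = rng.uniform(4.9, 44.1, n)   # compression force, kN

    # Synthetic LIF response mimicking the reported quadratic form:
    # LIF = b0 + b1*MC + b2*CF + b3*MC^2 + noise
    lif = 100.0 - 3.0 * MC - 0.5 * CF - 0.8 * MC**2 + rng.normal(0.0, 0.5, n)

    # Design matrix with intercept, main effects, and the squared term.
    X = np.column_stack([np.ones(n), MC, CF, MC**2])
    beta, *_ = np.linalg.lstsq(X, lif, rcond=None)

    pred = X @ beta
    r2 = 1.0 - np.sum((lif - pred) ** 2) / np.sum((lif - lif.mean()) ** 2)
    ```

    In this framing, "explained more than 94% of the data" corresponds to the fitted quadratic achieving R² > 0.94 on the design points.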

  7. Real simulation tools in introductory courses: packaging and repurposing our research code.

    NASA Astrophysics Data System (ADS)

    Heagy, L. J.; Cockett, R.; Kang, S.; Oldenburg, D.

    2015-12-01

    Numerical simulations are an important tool for scientific research and applications in industry. They provide a means to experiment with physics in a tangible, visual way, often providing insights into the problem. Over the last two years, we have been developing course and laboratory materials for an undergraduate geophysics course primarily taken by non-geophysics majors, including engineers and geologists. Our aim is to provide the students with resources to build intuition about geophysical techniques, promote curiosity driven exploration, and help them develop the skills necessary to communicate across disciplines. Using open-source resources and our existing research code, we have built modules around simulations, with supporting content to give student interactive tools for exploration into the impacts of input parameters and visualization of the resulting fields, fluxes and data for a variety of problems in applied geophysics, including magnetics, seismic, electromagnetics, and direct current resistivity. The content provides context for the problems, along with exercises that are aimed at getting students to experiment and ask 'what if...?' questions. In this presentation, we will discuss our approach for designing the structure of the simulation-based modules, the resources we have used, challenges we have encountered, general feedback from students and instructors, as well as our goals and roadmap for future improvement. We hope that our experiences and approach will be beneficial to other instructors who aim to put simulation tools in the hands of students.

  8. sRNAtoolbox: an integrated collection of small RNA research tools.

    PubMed

    Rueda, Antonio; Barturen, Guillermo; Lebrón, Ricardo; Gómez-Martín, Cristina; Alganza, Ángel; Oliver, José L; Hackenberg, Michael

    2015-07-01

    Small RNA research is a rapidly growing field. Apart from microRNAs, which are important regulators of gene expression, other types of functional small RNA molecules have been reported in animals and plants. MicroRNAs are important in host-microbe interactions, and parasite microRNAs might modulate the innate immunity of the host. Furthermore, small RNAs can be detected in bodily fluids, making them attractive non-invasive biomarker candidates. Given the broad general interest in small RNAs, and in particular microRNAs, a large number of bioinformatics-aided analysis types are needed by the scientific community. To facilitate integrated sRNA research, we developed sRNAtoolbox, a set of independent but interconnected tools for expression profiling from high-throughput sequencing data, consensus differential expression, target gene prediction, visual exploration in a genome context as a function of read length, gene list analysis, and BLAST search of unmapped reads. All tools can be used independently or for the exploration and downstream analysis of sRNAbench results. Workflows like the prediction of consensus target genes of parasite microRNAs in the host, followed by the detection of enriched pathways, can be easily established. The web interface interconnecting all these tools is available at http://bioinfo5.ugr.es/srnatoolbox. PMID:26019179

  9. sRNAtoolbox: an integrated collection of small RNA research tools

    PubMed Central

    Rueda, Antonio; Barturen, Guillermo; Lebrón, Ricardo; Gómez-Martín, Cristina; Alganza, Ángel; Oliver, José L.; Hackenberg, Michael

    2015-01-01

    Small RNA research is a rapidly growing field. Apart from microRNAs, which are important regulators of gene expression, other types of functional small RNA molecules have been reported in animals and plants. MicroRNAs are important in host-microbe interactions, and parasite microRNAs might modulate the innate immunity of the host. Furthermore, small RNAs can be detected in bodily fluids, making them attractive non-invasive biomarker candidates. Given the broad general interest in small RNAs, and in particular microRNAs, a large number of bioinformatics-aided analysis types are needed by the scientific community. To facilitate integrated sRNA research, we developed sRNAtoolbox, a set of independent but interconnected tools for expression profiling from high-throughput sequencing data, consensus differential expression, target gene prediction, visual exploration in a genome context as a function of read length, gene list analysis, and BLAST search of unmapped reads. All tools can be used independently or for the exploration and downstream analysis of sRNAbench results. Workflows like the prediction of consensus target genes of parasite microRNAs in the host, followed by the detection of enriched pathways, can be easily established. The web interface interconnecting all these tools is available at http://bioinfo5.ugr.es/srnatoolbox. PMID:26019179

  10. PathCase-SB: integrating data sources and providing tools for systems biology research

    PubMed Central

    2012-01-01

    Background Integration of metabolic pathway resources and metabolic network models, and deploying new tools on the integrated platform, can help researchers perform more effective and more efficient systems biology research on understanding the regulation of metabolic networks. Therefore, (a) integrating regulatory metabolic networks and existing models under a single database environment, and (b) building tools to help with modeling and analysis, are desirable and intellectually challenging computational tasks. Results PathCase Systems Biology (PathCase-SB) has been built and released. This paper describes the PathCase-SB user interfaces developed to date. The current PathCase-SB system provides a database-enabled framework and web-based computational tools for facilitating the development of kinetic models of biological systems. PathCase-SB aims to integrate the systems biology model data and metabolic network data of selected biological data sources on the web (currently, BioModels Database and KEGG, respectively), and to provide more powerful and/or new capabilities via the new web-based integrative framework. Conclusions Each of the current four PathCase-SB interfaces, namely the Browser, Visualization, Querying, and Simulation interfaces, has expanded and new capabilities compared with the original data sources. PathCase-SB is already available on the web and is being used by researchers across the globe. PMID:22697505

  11. Participant Satisfaction With a Preference-Setting Tool for the Return of Individual Research Results in Pediatric Genomic Research.

    PubMed

    Holm, Ingrid A; Iles, Brittany R; Ziniel, Sonja I; Bacon, Phoebe L; Savage, Sarah K; Christensen, Kurt D; Weitzman, Elissa R; Green, Robert C; Huntington, Noelle L

    2015-10-01

    The perceived benefit of returning individual research results (IRRs) in accordance with participants' preferences in genomic biobank research is unclear. We developed an online preference-setting tool for the return of IRRs based on the preventability and severity of a condition, which included an opt-out option for IRRs for mental illness, developmental disorders, childhood-onset degenerative conditions, and adult-onset conditions. Parents of patients <18 years of age at Boston Children's Hospital were randomized to a hypothetical scenario in which their child was enrolled in one of four biobanks with different policies for returning IRRs: (a) "None," (b) "All," (c) "Binary"--the choice to receive all or none, and (d) "Granular"--use of the preference-setting tool to choose categories of IRRs. Parents were given a hypothetical IRRs report for their child. The survey was sent to 11,391 parents and completed by 2,718. The Granular group was the most satisfied with the process, the biobank, and the hypothetical IRRs received. The None group was least satisfied and least likely to agree that the biobank was beneficial (p < .001). Responses to the statement that the biobank was harmful did not differ between groups. Our data suggest that the ability to designate preferences leads to greater satisfaction and may increase biobank participation. PMID:26376753

  12. Systems thinking tools as applied to community-based participatory research: a case study.

    PubMed

    BeLue, Rhonda; Carmack, Chakema; Myers, Kyle R; Weinreb-Welch, Laurie; Lengerich, Eugene J

    2012-12-01

    Community-based participatory research (CBPR) is being used increasingly to address health disparities and complex health issues. The authors propose that CBPR can benefit from a systems science framework to represent the complex and dynamic characteristics of a community and identify intervention points and potential "tipping points." Systems science refers to a field of study that posits a holistic framework that is focused on component parts of a system in the context of relationships with each other and with other systems. Systems thinking tools can assist in intervention planning by allowing all CBPR stakeholders to visualize how community factors are interrelated and by potentially identifying the most salient intervention points. To demonstrate the potential utility of systems science tools in CBPR, the authors show the use of causal loop diagrams by a community coalition engaged in CBPR activities regarding youth drinking reduction and prevention. PMID:22467637

  13. SIMS ion microscopy as a novel, practical tool for subcellular chemical imaging in cancer research

    NASA Astrophysics Data System (ADS)

    Chandra, S.

    2003-01-01

    The development of cryogenic sample preparation, subcellular image quantification schemes, and correlative confocal laser scanning microscopy and ion microscopy has made dynamic SIMS a versatile tool in biology and medicine. For example, ion microscopy can provide much-needed, novel information on calcium influx and intracellular calcium stores at organelle resolution in normal and transformed cells, in order to better understand the altered calcium signaling in malignant cells. 3-D SIMS imaging of cells has revealed dynamic gradients of calcium in cells undergoing mitosis and cytokinesis. The subcellular localization of anticancer drugs is another area of research where ion microscopy can provide novel observations in many types of cancer. Ion microscopy is already an essential tool in boron neutron capture therapy (BNCT) of brain cancer, as it can be used to quantitatively image the subcellular location of boron in cells and tissues. This information is critically needed for testing the efficacy of boronated agents and for calculations of radiation dosimetry.

  14. From "Inspiration-driven" Research to "Industrial-strength" Research: Applying User-developed Climate Analytics at Large scale

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, A.; Mason, E. E.; Langenhorst, A. R.; Balaji, V.; Nikonov, S.

    2014-12-01

    Numerous climate models and the many parameters output across a vast range of climate scenarios motivate a climate scientist to analyze the available data to address a plethora of scientific questions, e.g., the occurrence of El Niño events, or simply to validate and compute specialized metrics for a specific climate field. Providing a platform for our scientists to work with data from different models in-house, and extending a similar approach to the application of climate analysis on data from different modeling centers, is a key goal that will be addressed in this presentation. Model intercomparison projects, the Earth System Grid Federation, and knowledge exchange within the climate science community have all enabled the successful establishment of data standards and controlled vocabularies. This opens key possibilities for techniques used to explore dataset(s) in the Big Data archive and to perform climate analyses following a simple, standardized, templated approach. A typical pattern of use would be for the scientist to work with a few datasets interactively to refine and extract a signal of a particular climate phenomenon. At this point data access patterns are random, as the analysis is exploratory. We call this the "inspiration-driven" phase of research. Subsequently, the scientist would need to apply her analysis to a much wider set of data: different models and scenarios from CMIP5, for example. This can be thought of as the "industrial" phase of research. We provide a pathway for user-developed analyses to transition from inspiration to industry. We will illustrate techniques being adopted at GFDL to develop analyses through interactive computational exploration on selected data, to provide analysis capabilities in batch workflows (using the Flexible Runtime Environment), and to provide web-based capabilities with data exploration mechanisms tapped from GFDL's Curator infrastructure. Comparing climate data both at the inter- and intra-laboratory level

  15. Development of a HIPAA-compliant environment for translational research data and analytics

    PubMed Central

    Bradford, Wayne; Hurdle, John F; LaSalle, Bernie; Facelli, Julio C

    2014-01-01

    High-performance computing centers (HPC) traditionally have far less restrictive privacy management policies than those encountered in healthcare. We show how an HPC can be re-engineered to accommodate clinical data while retaining its utility in computationally intensive tasks such as data mining, machine learning, and statistics. We also discuss deploying protected virtual machines. A critical planning step was to engage the university's information security operations and the information security and privacy office. Access to the environment requires a double authentication mechanism. The first level of authentication requires access to the university's virtual private network and the second requires that the users be listed in the HPC network information service directory. The physical hardware resides in a data center with controlled room access. All employees of the HPC and its users take the university's local Health Insurance Portability and Accountability Act training series. In the first 3 years, researcher count has increased from 6 to 58. PMID:23911553

  16. Development of a HIPAA-compliant environment for translational research data and analytics.

    PubMed

    Bradford, Wayne; Hurdle, John F; LaSalle, Bernie; Facelli, Julio C

    2014-01-01

    High-performance computing centers (HPC) traditionally have far less restrictive privacy management policies than those encountered in healthcare. We show how an HPC can be re-engineered to accommodate clinical data while retaining its utility in computationally intensive tasks such as data mining, machine learning, and statistics. We also discuss deploying protected virtual machines. A critical planning step was to engage the university's information security operations and the information security and privacy office. Access to the environment requires a double authentication mechanism. The first level of authentication requires access to the university's virtual private network and the second requires that the users be listed in the HPC network information service directory. The physical hardware resides in a data center with controlled room access. All employees of the HPC and its users take the university's local Health Insurance Portability and Accountability Act training series. In the first 3 years, researcher count has increased from 6 to 58. PMID:23911553

  17. Research on test techniques of fault forewarning and diagnosis for high-end CNC machine tool

    NASA Astrophysics Data System (ADS)

    Ren, Bin; Xu, Xiaoli

    2010-12-01

    With the progress of modern science and technology, the manufacturing industry is becoming more complex and intelligent. Stable, safe operation and economic efficiency are a challenge for machining equipment such as high-end CNC machine tools because of their complex structure and integrated functions, which make potential faults likely. Ensuring that the equipment runs stably and reliably is therefore the key problem in improving machining precision and efficiency. In order to prolong the average fault-free time, stable running, and machining precision of a CNC system, it is very important to carry out tests and research on the acquisition of CNC sample data and the establishment of a sample database. Taking a high-end CNC machine tool as an example, research on test techniques for data acquisition from typical functional parts is carried out and test conditions are set up; test methods for sample acquisition for running-state monitoring and fault forewarning and diagnosis are determined; a test platform for the typical functional parts is established; and the sample database is designed, with the sample base and knowledge model constructed. This test work provides key test techniques for disclosing the dynamic behavior of faults and precision degradation, and for analyzing the factors that influence faults.

  18. Growth and Maturation in the Zebrafish, Danio Rerio: A Staging Tool for Teaching and Research

    PubMed Central

    Singleman, Corinna

    2014-01-01

    Abstract Zebrafish have been increasingly used as a teaching tool to enhance the learning of many biological concepts from genetics, development, and behavior to the understanding of the local watershed. Traditionally, in both research and teaching, zebrafish work has focused on embryonic stages; however, later stages, from larval through adulthood, are increasingly being examined. Defining developmental stages based on age is a problematic way to assess maturity, because many environmental factors, such as temperature, population density, and water quality, impact growth and maturation. Fish length and characterization of key external morphological traits are considered better markers for maturation state. While a number of staging series exist for zebrafish, here we present a simplified normalization table of post-embryonic maturation well suited to both educational and research use. Specifically, we utilize fish size and four easily identified external morphological traits (pigment pattern, tail fin, anal fin, and dorsal fin morphology) to describe three larval stages, a juvenile stage, and an adult stage. These simplified maturation standards will be a useful tool for both educational and research protocols. PMID:24979389
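    A staging table of the kind described (size bands refined by external morphological traits) is straightforward to encode for classroom or lab protocols. The length cutoffs, stage names, and confirming traits below are hypothetical placeholders for illustration, not the published standards.

    ```python
    # Hypothetical post-embryonic staging lookup: standard-length bands,
    # each paired with an external trait used to confirm the assignment.
    STAGES = [
        # (min length mm, max length mm, stage, confirming trait)
        (0.0,  5.0,  "larval-1", "tail fin fold present"),
        (5.0,  8.0,  "larval-2", "anal fin forming"),
        (8.0,  11.0, "larval-3", "dorsal fin rays visible"),
        (11.0, 15.0, "juvenile", "adult pigment pattern emerging"),
        (15.0, float("inf"), "adult", "complete adult pigment pattern"),
    ]

    def stage_fish(length_mm):
        """Return (stage, trait to confirm) for a measured standard length."""
        if length_mm < 0:
            raise ValueError("length must be non-negative")
        for lo, hi, stage, trait in STAGES:
            if lo <= length_mm < hi:
                return stage, trait
    ```

    Keying the lookup on length rather than age mirrors the abstract's point that size and morphology, not age, are the reliable markers of maturation state.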

  19. The DEDUCE Guided Query tool: providing simplified access to clinical data for research and quality improvement.

    PubMed

    Horvath, Monica M; Winfield, Stephanie; Evans, Steve; Slopek, Steve; Shang, Howard; Ferranti, Jeffrey

    2011-04-01

    In many healthcare organizations, comparative effectiveness research and quality improvement (QI) investigations are hampered by a lack of access to data created as a byproduct of patient care. Data collection often hinges upon either manual chart review or ad hoc requests to technical experts who support legacy clinical systems. In order to facilitate this needed capacity for data exploration at our institution (Duke University Health System), we have designed and deployed a robust Web application for cohort identification and data extraction--the Duke Enterprise Data Unified Content Explorer (DEDUCE). DEDUCE is envisioned as a simple, web-based environment that allows investigators access to administrative, financial, and clinical information generated during patient care. By using business intelligence tools to create a view into Duke Medicine's enterprise data warehouse, DEDUCE provides Guided Query functionality with a wizard-like interface that lets users filter through millions of clinical records, explore aggregate reports, and export extracts. Researchers and QI specialists can obtain detailed patient- and observation-level extracts without needing to understand structured query language or the underlying database model. Developers designing such tools must devote sufficient training and develop application safeguards to ensure that patient-centered clinical researchers understand when observation-level extracts should be used. This may mitigate the risk of data being misunderstood and consequently used in an improper fashion. PMID:21130181

  20. Oxytocin and Vasopressin Agonists and Antagonists as Research Tools and Potential Therapeutics

    PubMed Central

    Manning, M; Misicka, A; Olma, A; Bankowski, K; Stoev, S; Chini, B; Durroux, T; Mouillac, B; Corbani, M; Guillon, G

    2012-01-01

    We recently reviewed the status of peptide and nonpeptide agonists and antagonists for the V1a, V1b and V2 receptors for arginine vasopressin (AVP) and the oxytocin receptor for oxytocin (OT). In the present review, we update the status of peptides and nonpeptides as: (i) research tools and (ii) therapeutic agents. We also present our recent findings on the design of fluorescent ligands for V1b receptor localisation and for OT receptor dimerisation. We note the exciting discoveries regarding two novel naturally occurring analogues of OT. Recent reports of a selective VP V1a agonist and a selective OT agonist point to the continued therapeutic potential of peptides in this field. To date, only two nonpeptides, the V2/V1a antagonist conivaptan and the V2 antagonist tolvaptan, have received Food and Drug Administration approval for clinical use. The development of nonpeptide AVP V1a, V1b and V2 antagonists and OT agonists and antagonists has recently been abandoned by Merck, Sanofi and Pfizer. A promising OT antagonist, Retosiban, developed at GlaxoSmithKline, is currently in a Phase II clinical trial for the prevention of premature labour. A number of the nonpeptide ligands that were not successful in clinical trials are proving to be valuable as research tools. Peptide agonists and antagonists continue to be very widely used as research tools in this field. In this regard, we present receptor data on some of the most widely used peptide and nonpeptide ligands, as a guide for their use, especially with regard to receptor selectivity and species differences. PMID:22375852