Science.gov

Sample records for analytical tool research

  1. Environmental equity research: review with focus on outdoor air pollution research methods and analytic tools.

    PubMed

    Miao, Qun; Chen, Dongmei; Buzzelli, Michael; Aronson, Kristan J

    2015-01-01

    The objective of this study was to review environmental equity research on outdoor air pollution and, specifically, the methods and tools used in research published in English, with the aim of recommending the best methods and analytic tools. English language publications from 2000 to 2012 were identified in Google Scholar, Ovid MEDLINE, and PubMed. Research methodologies and results were reviewed and potential deficiencies and knowledge gaps identified. The publications show that exposure to outdoor air pollution differs by social factors, but findings are inconsistent in Canada. In terms of study designs, most were small and ecological and therefore prone to the ecological fallacy. Newer tools such as geographic information systems, modeling, and biomarkers offer improved precision in exposure measurement. Higher-quality research using large, individual-based samples and more precise analytic tools is needed to provide better evidence for policy-making to reduce environmental inequities.

  2. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    ERIC Educational Resources Information Center

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  3. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis; develop, test, and execute models; and create visualizations. The 'recordr' library in R and the 'matlab-dataone' library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. Finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software
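
    As context for how such provenance capture can work in practice, the sketch below is a minimal, hypothetical Python illustration of the core idea: wrap an analysis step so that its input files, output files, and content hashes are recorded as a PROV-style trace. It is not the 'recordr' or 'matlab-dataone' API; every function and field name here is invented for illustration.

      import hashlib
      import json
      import time
      from pathlib import Path

      def sha256(path):
          """Content hash, so a trace pins the exact file versions used."""
          return hashlib.sha256(Path(path).read_bytes()).hexdigest()

      def run_with_provenance(step_name, func, inputs, outputs):
          """Run one analysis step and write a minimal PROV-like trace.

          `inputs`/`outputs` are lists of file paths and `func` is the
          analysis callable; all names are hypothetical, not a real API.
          """
          started = time.time()
          func()  # execute the actual analysis step
          trace = {
              "activity": step_name,
              "startedAtTime": started,
              "endedAtTime": time.time(),
              "used": [{"file": p, "sha256": sha256(p)} for p in inputs],
              "generated": [{"file": p, "sha256": sha256(p)} for p in outputs],
          }
          Path(f"{step_name}.prov.json").write_text(json.dumps(trace, indent=2))
          return trace

    A real system additionally records the script itself, the run-time environment, and persistent identifiers so the trace can be published alongside the derived products.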

  4. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research

    PubMed Central

    Alaidi, Osama; Rames, Matthew J.

    2016-01-01

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. PMID:26087941

  5. NASTRAN as an analytical research tool for composite mechanics and composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  6. PRE-QAPP AGREEMENT (PQA) AND ANALYTICAL METHOD CHECKLISTS (AMCS): TOOLS FOR PLANNING RESEARCH PROJECTS

    EPA Science Inventory

    The Land Remediation and Pollution Control Division (LRPCD) QA Manager strives to assist LRPCD researchers in developing functional planning documents for their research projects. As part of the planning process, several pieces of information are needed, including information re...

  7. Proteomics: analytical tools and techniques.

    PubMed

    MacCoss, M J; Yates, J R

    2001-09-01

    Scientists have long been interested in measuring the effects of different stimuli on protein expression and metabolism. Analytical methods are being developed for the automated separation, identification, and quantitation of all of the proteins within the cell. Soon, investigators will be able to observe the effects of an experiment on every protein (as opposed to a selected few). This review presents a discussion of recent technological advances in proteomics in addition to exploring current methodological limitations.

  8. Analytic tools for information warfare

    SciTech Connect

    Vandewart, R.L.; Craft, R.L.

    1996-05-01

    Information warfare and system surety (tradeoffs between system functionality, security, safety, reliability, cost, usability) have many mechanisms in common. Sandia's experience has shown that an information system must be assessed from a system perspective in order to adequately identify and mitigate the risks present in the system. While some tools are available to help in this work, the process is largely manual. An integrated, extensible set of assessment tools would help the surety analyst. This paper describes one approach to surety assessment used at Sandia, identifies the difficulties in this process, and proposes a set of features desirable in an automated environment to support this process.

  9. Aptamers: molecular tools for analytical applications.

    PubMed

    Mairal, Teresa; Ozalp, Veli Cengiz; Lozano Sánchez, Pablo; Mir, Mònica; Katakis, Ioanis; O'Sullivan, Ciara K

    2008-02-01

    Aptamers are artificial nucleic acid ligands, specifically generated against certain targets, such as amino acids, drugs, proteins or other molecules. In nature they exist as a nucleic acid based genetic regulatory element called a riboswitch. For generation of artificial ligands, they are isolated from combinatorial libraries of synthetic nucleic acid by exponential enrichment, via an in vitro iterative process of adsorption, recovery and reamplification known as systematic evolution of ligands by exponential enrichment (SELEX). Thanks to their unique characteristics and chemical structure, aptamers offer themselves as ideal candidates for use in analytical devices and techniques. Recent progress in the aptamer selection and incorporation of aptamers into molecular beacon structures will ensure the application of aptamers for functional and quantitative proteomics and high-throughput screening for drug discovery, as well as in various analytical applications. The properties of aptamers as well as recent developments in improved, time-efficient methods for their selection and stabilization are outlined. The use of these powerful molecular tools for analysis and the advantages they offer over existing affinity biocomponents are discussed. Finally the evolving use of aptamers in specific analytical applications such as chromatography, ELISA-type assays, biosensors and affinity PCR as well as current avenues of research and future perspectives conclude this review. PMID:17581746

  10. Analytical Tools in School Finance Reform.

    ERIC Educational Resources Information Center

    Johns, R. L.

    This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…

  11. Guidance for the Design and Adoption of Analytic Tools.

    SciTech Connect

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned, and existing research that explains what is currently known about what analysts want, and how to better understand which tools they do and do not need.

  12. Using the Conceptual Change Model of Learning as An Analytic Tool in Researching Teacher Preparation for Student Diversity

    ERIC Educational Resources Information Center

    Larkin, Douglas

    2012-01-01

    Background/Context: In regard to preparing prospective teachers for diverse classrooms, the agenda for teacher education research has been primarily concerned with identifying desired outcomes and promising strategies. Scholarship in multicultural education has been crucial for identifying the knowledge, skills, and attitudes needed by teachers to…

  13. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy products: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with the options differentiated by the level of detail at which a process or plant is described: (1) plant level; (2) process-group level; and (3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases that were established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011, and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for use with the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to those in the U.S. have downloaded the BEST-Dairy tool from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and water

  14. Using Visual Analytics Tool for Improving Data Comprehension

    ERIC Educational Resources Information Center

    Géryk, Jan

    2015-01-01

    The efficacy of animated data visualizations in comparison with static data visualizations is still inconclusive. Some research suggests that the failure to find benefits of animation may relate to the way animations are constructed and perceived. In this paper, we present a visual analytics (VA) tool which makes use of enhanced animated…

  15. ATIVS: analytical tool for influenza virus surveillance.

    PubMed

    Liao, Yu-Chieh; Ko, Chin-Yu; Tsai, Ming-Hsin; Lee, Min-Shi; Hsiung, Chao A

    2009-07-01

    The WHO Global Influenza Surveillance Network has routinely performed genetic and antigenic analyses of human influenza viruses to monitor influenza activity. Although these analyses provide supporting data for the selection of vaccine strains, it seems desirable to have user-friendly tools to visualize the antigenic evolution of influenza viruses for the purpose of surveillance. To meet this need, we have developed a web server, ATIVS (Analytical Tool for Influenza Virus Surveillance), for analyzing serological data of all influenza viruses and hemagglutinin sequence data of human influenza A/H3N2 viruses so as to generate antigenic maps for influenza surveillance and vaccine strain selection. Functionalities are described and examples are provided to illustrate its usefulness and performance. The ATIVS web server is available at http://influenza.nhri.org.tw/ATIVS/.
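
    ATIVS itself is a web server, but the antigenic maps it produces rest on a standard construction from antigenic cartography: embed a table of antigenic distances, derived from hemagglutination-inhibition titers, into two dimensions by multidimensional scaling. A minimal Python sketch of that final embedding step, with an invented distance matrix and strain labels:

      import numpy as np
      from sklearn.manifold import MDS

      # Hypothetical antigenic distances between four strains, in log2
      # titer units (larger = antigenically farther apart); symmetric,
      # zero diagonal.
      D = np.array([[0.0, 1.0, 3.0, 4.0],
                    [1.0, 0.0, 2.5, 3.5],
                    [3.0, 2.5, 0.0, 1.5],
                    [4.0, 3.5, 1.5, 0.0]])

      mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
      coords = mds.fit_transform(D)  # 2-D antigenic map coordinates
      for name, (x, y) in zip(["A/03", "A/05", "A/07", "A/09"], coords):
          print(f"{name}: ({x:+.2f}, {y:+.2f})")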

  16. Analytical Web Tool for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

    The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year-long data set proves invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health, and environmental companies, has shown interest in CERES derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored for this wide diversity of users. Recognizing the potential that web-2.0 technologies can offer to both Quality Control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions that address the above. Within an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms that display the distribution of a data field to identify outliers, useful for QC purposes; ii) an "Anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can either identify potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with the on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, execution was done on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open source applications, the multitude of CERES products, and the seamless transition from previous development. For the future, we plan on expanding the analytical capabilities of the
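
    Of the three functions, the "Anomaly" map is the easiest to state precisely: subtract the climatological mean for a calendar month from that month's field. A minimal numpy sketch, assuming a hypothetical gridded monthly record (shapes and values invented):

      import numpy as np

      # Hypothetical 11-year monthly TOA flux record on a lat-lon grid:
      # shape (years, months, nlat, nlon).
      rng = np.random.default_rng(0)
      flux = rng.normal(240.0, 5.0, size=(11, 12, 18, 36))

      def anomaly_map(flux, year_idx, month_idx):
          """Regional anomaly: one month minus its climatological mean."""
          climatology = flux[:, month_idx].mean(axis=0)  # mean over years
          return flux[year_idx, month_idx] - climatology

      anom = anomaly_map(flux, year_idx=10, month_idx=6)  # July, final year
      print(anom.shape, float(anom.mean()))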

  17. Cryogenic Propellant Feed System Analytical Tool Development

    NASA Technical Reports Server (NTRS)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its number-crunching power and its ability to directly access real fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front end was implemented to provide convenient portability of PFSAT among a wide variety of potential users and a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
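
    For a sense of the kind of parametric estimate such a tool automates, the sketch below applies textbook first-order relations: conduction through a support scales as kA(dT)/L, and radiation through a multilayer-insulation blanket can be approximated with an effective emissivity. This is not PFSAT's method, and every value is invented for illustration.

      # Stefan-Boltzmann constant, W/m^2/K^4
      SIGMA = 5.670374419e-8

      def support_conduction(k, area, length, t_hot, t_cold):
          """1-D conduction through a structural support: Q = k*A*dT/L."""
          return k * area * (t_hot - t_cold) / length

      def mli_radiation(eps_eff, area, t_hot, t_cold):
          """Radiative leak through an MLI blanket, modeled with a single
          effective emissivity for the blanket as a whole."""
          return eps_eff * SIGMA * area * (t_hot**4 - t_cold**4)

      # Example: a 90 K line in a 300 K bay with composite supports.
      q = (support_conduction(k=0.6, area=1e-4, length=0.05,
                              t_hot=300.0, t_cold=90.0)
           + mli_radiation(eps_eff=0.01, area=0.5,
                           t_hot=300.0, t_cold=90.0))
      print(f"total heat leak ~ {q:.2f} W")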

  18. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  19. Measurement and Research Tools.

    ERIC Educational Resources Information Center

    1997

    This document contains four papers from a symposium on measurement and research tools for human resource development (HRD). "The 'Best Fit' Training: Measure Employee Learning Style Strengths" (Daniel L. Parry) discusses a study of the physiological aspect of sensory intake known as modality, more specifically, modality as measured by the…

  1. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  2. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  3. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-09-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its eleventh month of Phase 1

  4. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  5. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-08-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirty-sixth month of development activities.

  6. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase 1

  7. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  8. A Qualitative Evaluation of Evolution of a Learning Analytics Tool

    ERIC Educational Resources Information Center

    Ali, Liaqat; Hatala, Marek; Gasevic, Dragan; Jovanovic, Jelena

    2012-01-01

    LOCO-Analyst is a learning analytics tool we developed to provide educators with feedback on students' learning activities and performance. Evaluation of the first version of the tool led to the enhancement of the tool's data visualization, user interface, and supported feedback types. The second evaluation of the improved tool allowed us to see…

  9. ANALYTICAL TOOLS FOR GROUNDWATER POLLUTION ASSESSMENT

    EPA Science Inventory

    This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of groundwater buffer strips. The indices describe the leaching of solutes below the root zone (mass f...

  10. Analytical Tools for Space Suit Design

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  11. Trial analytics--a tool for clinical trial management.

    PubMed

    Bose, Anindya; Das, Suman

    2012-01-01

    Prolonged timelines and large expenses associated with clinical trials have prompted a new focus on improving the operational efficiency of clinical trials through the use of Clinical Trial Management Systems (CTMS) in order to improve managerial control in trial conduct. However, current CTMS offerings are not able to meet expectations due to various shortcomings, such as the inability to provide timely reporting and trend visualization within and beyond an organization. To overcome these shortcomings of CTMS, clinical researchers can apply a business intelligence (BI) framework to create Clinical Research Intelligence (CLRI) for optimization of data collection and analytics. This paper proposes the usage of an innovative and collaborative visualization tool (CTA) as a CTMS add-on to help overcome these deficiencies of traditional CTMS, with suitable examples.

  12. Electronic tongue: An analytical gustatory tool

    PubMed Central

    Latha, Rewanthwar Swathi; Lakshmi, P. K.

    2012-01-01

    Taste is an important organoleptic property governing acceptance of products for administration through the mouth. However, the majority of available drugs are bitter in taste. For patient acceptability and compliance, bitter-tasting drugs are masked by adding several flavoring agents. Thus, taste assessment is an important quality control parameter for evaluating taste-masked formulations. The primary method for taste measurement of drug substances and formulations is by human panelists. The use of sensory panelists is very difficult and problematic in industry due to the potential toxicity of drugs and the subjectivity of taste panelists; recruiting panelists, maintaining motivation, and panel maintenance are significantly difficult when working with unpleasant products. Furthermore, Food and Drug Administration (FDA)-unapproved molecules cannot be tested. Therefore, an analytical taste-sensing multichannel sensory system, called an electronic tongue (e-tongue or artificial tongue), which can assess taste, has been replacing sensory panelists. The e-tongue thus offers benefits such as reduced reliance on human panels. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields. PMID:22470887

  13. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  14. Informetrics: Exploring Databases as Analytical Tools.

    ERIC Educational Resources Information Center

    Wormell, Irene

    1998-01-01

    Advanced online search facilities and information retrieval techniques have increased the potential of bibliometric research. Discusses three case studies carried out by the Centre for Informetric Studies at the Royal School of Library Science (Denmark) on the internationality of international journals, informetric analyses on the World Wide Web,…

  15. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  16. Analytical Tools for Cloudscope Ice Measurement

    NASA Technical Reports Server (NTRS)

    Arnott, W. Patrick

    1998-01-01

    The cloudscope is a ground or aircraft instrument for viewing ice crystals impacted on a sapphire window. It is essentially a simple optical microscope with an attached compact CCD video camera whose output is recorded on a Hi-8 mm video cassette recorder equipped with digital time and date recording capability. In aircraft operation the window is at a stagnation point of the flow, so adiabatic compression heats the window to sublimate the ice crystals so that later impacting crystals can be imaged as well. A film heater is used for ground-based operation to provide sublimation, and it can also be used to provide extra heat for aircraft operation. The compact video camera can be focused manually by the operator, and a beam splitter and miniature bulb combination provides illumination for night operation. Several shutter speeds are available to accommodate daytime illumination conditions in direct sunlight. The video images can be used directly to qualitatively assess the crystal content of cirrus clouds and contrails. Quantitative size spectra are obtained with the tools described in this report. Selected portions of the video images are digitized using a PCI bus frame grabber to form a short movie segment or stack using NIH (National Institutes of Health) Image software with custom macros developed at DRI. The stack can be Fourier-transform filtered with custom, easy-to-design filters to reduce most objectionable video artifacts. Particle quantification of each slice of the stack is performed using digital image analysis. Data recorded for each particle include particle number and centroid, frame number in the stack, particle area, perimeter, equivalent ellipse maximum and minimum radii, ellipse angle, and pixel number. Each valid particle in the stack is stamped with a unique number. This output can be used to obtain a semiquantitative appreciation of the crystal content. The particle information becomes the raw input for a subsequent program (FORTRAN) that
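
    The per-particle quantities listed above correspond to standard connected-component measurements. As a hedged illustration of the same measurements with a modern library (scikit-image here, not the NIH Image macros the report describes), using a fabricated binary frame:

      import numpy as np
      from skimage import measure

      # Hypothetical binarized video frame: ice-crystal pixels are True.
      frame = np.zeros((120, 160), dtype=bool)
      frame[40:60, 50:90] = True  # one fake "crystal" for illustration

      labels = measure.label(frame)  # stamp each particle with a unique number
      for p in measure.regionprops(labels):
          cy, cx = p.centroid
          print(f"particle {p.label}: centroid=({cx:.1f},{cy:.1f}) "
                f"area={p.area} perimeter={p.perimeter:.1f} "
                f"ellipse=({p.major_axis_length:.1f},"
                f"{p.minor_axis_length:.1f}) "
                f"angle={np.degrees(p.orientation):.1f} deg")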

  17. Empire: An Analytical Category for Educational Research

    ERIC Educational Resources Information Center

    Coloma, Roland Sintos

    2013-01-01

    In this article Roland Sintos Coloma argues for the relevance of empire as an analytical category in educational research. He points out the silence in mainstream studies of education on the subject of empire, the various interpretive approaches to deploying empire as an analytic, and the importance of indigeneity in research on empire and…

  18. Tool for Ranking Research Options

    NASA Technical Reports Server (NTRS)

    Ortiz, James N.; Scott, Kelly; Smith, Harold

    2005-01-01

    Tool for Research Enhancement Decision Support (TREDS) is a computer program developed to assist managers in ranking options for research aboard the International Space Station (ISS). It could likely also be adapted to perform similar decision-support functions in industrial and academic settings. TREDS provides a ranking of the options, based on a quantifiable assessment of all the relevant programmatic decision factors of benefit, cost, and risk. The computation of the benefit for each option is based on a figure of merit (FOM) for ISS research capacity that incorporates both quantitative and qualitative inputs. Qualitative inputs are gathered and partly quantified by use of the time-tested analytic hierarchy process (AHP) and used to set weighting factors in the FOM corresponding to priorities determined by the cognizant decision maker(s). Then, by use of algorithms developed specifically for this application, TREDS adjusts the projected benefit for each option on the basis of levels of technical implementation, cost, and schedule risk. Based partly on Excel spreadsheets, TREDS provides screens for entering cost, benefit, and risk information. Drop-down boxes are provided for entry of qualitative information. TREDS produces graphical output in multiple formats that can be tailored by users.
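
    For readers unfamiliar with the analytic hierarchy process step, it derives numeric weights from pairwise importance judgments via the principal eigenvector of a reciprocal comparison matrix. A minimal sketch with an invented three-factor matrix (this is the textbook method, not TREDS code):

      import numpy as np

      # Hypothetical pairwise comparisons of three decision factors
      # (benefit, cost, risk) on Saaty's 1-9 scale; A[j, i] = 1 / A[i, j].
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)        # principal eigenvalue
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                       # normalized priority weights

      n = A.shape[0]
      ci = (eigvals[k].real - n) / (n - 1)  # consistency index
      print("weights:", np.round(w, 3), "CI:", round(ci, 4))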

  19. Sociologics: An Analytical Tool for Examining Socioscientific Discourse.

    ERIC Educational Resources Information Center

    Fountain, Renee-Marie

    1998-01-01

    Argues that the framework of sociologics extends commonly used analytical frameworks in socioscientific research in education. Foregrounds the social construction of knowledge and highlights the nature of knowledge production. (DDR)

  20. Analytical Ultrasonics in Materials Research and Testing

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1986-01-01

    Research results in analytical ultrasonics for characterizing structural materials from metals and ceramics to composites are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.

  1. Medical text analytics tools for search and classification.

    PubMed

    Huang, Jimmy; An, Aijun; Hu, Vivian; Tu, Karen

    2009-01-01

    A text-analytic tool has been developed that accepts clinical medical data as input in order to produce patient details. The integrated tool has the following four characteristics. 1) It has a graphical user interface. 2) It has a free-text search tool that is designed to retrieve records using keywords such as "MI" for myocardial infarction. The result set is a display of those sentences in the medical records that contain the keywords. 3) It has three tools to classify patients based on the likelihood of being diagnosed for myocardial infarction, hypertension, or their smoking status. 4) A summary is generated for each patient selected. Large medical data sets provided by the Institute for Clinical Evaluative Sciences were used during the project.
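
    To make the free-text search feature concrete, sentence-level keyword retrieval with abbreviation expansion ("MI" also matching "myocardial infarction") can be sketched in a few lines. The synonym map, sample notes, and function below are invented for illustration, not the tool's actual implementation:

      import re

      # Hypothetical abbreviation map; "MI" should match its expansion too.
      SYNONYMS = {"MI": ["MI", "myocardial infarction"]}

      def search_sentences(records, keyword):
          """Return sentences from free-text records that contain the
          keyword or any of its known expansions (a sketch only)."""
          terms = SYNONYMS.get(keyword, [keyword])
          pattern = re.compile(
              r"\b(?:" + "|".join(re.escape(t) for t in terms) + r")\b",
              re.IGNORECASE)
          hits = []
          for rec in records:
              for sentence in re.split(r"(?<=[.!?])\s+", rec):
                  if pattern.search(sentence):
                      hits.append(sentence)
          return hits

      notes = ["Pt admitted with chest pain. History of MI in 2005.",
               "No evidence of myocardial infarction on ECG."]
      print(search_sentences(notes, "MI"))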

  2. ANALYTICAL TOOL DEVELOPMENT FOR AFTERTREATMENT SUB-SYSTEMS INTEGRATION

    SciTech Connect

    Bolton, B; Fan, A; Goney, K; Pavlova-MacKinnon, Z; Sisken, K; Zhang, H

    2003-08-24

    The stringent emissions standards of 2007 and beyond require complex engine, aftertreatment and vehicle systems with a high degree of sub-system interaction and flexible control solutions. This necessitates a system-based approach to technology development, in addition to individual sub-system optimization. Analytical tools can provide an effective means to evaluate and develop such complex technology interactions, as well as to understand phenomena that are either too expensive or impossible to study with conventional experimental means. The analytical effort can also guide experimental development and thus lead to efficient utilization of available experimental resources. A suite of analytical models has been developed to represent PM and NOx aftertreatment sub-systems. These models range from computationally inexpensive zero-dimensional models for real-time control applications to CFD-based, multi-dimensional models with detailed temporal and spatial resolution. Such models, in conjunction with well-established engine modeling tools such as engine cycle simulation, engine controls modeling, CFD models of non-combusting and combusting flow, and vehicle models, provide a comprehensive analytical toolbox for complete engine, aftertreatment and vehicle sub-systems development and system integration applications. However, the fidelity of aftertreatment models and their application going forward is limited by the lack of fundamental kinetic data.

  3. Promoting Efficacy Research on Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Maitland, Daniel W. M.; Gaynor, Scott T.

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a form of therapy grounded in behavioral principles that utilizes therapist reactions to shape target behavior. Despite a growing literature base, there is a paucity of research to establish the efficacy of FAP. As a general approach to psychotherapy, and how the therapeutic relationship produces change,…

  4. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... analytical tools. 385.33 Section 385.33 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... and other analytical tools for conducting analyses for the planning, design, construction,...

  5. Geographical Information Systems: A Tool for Institutional Research.

    ERIC Educational Resources Information Center

    Prather, James E.; Carlson, Christina E.

    This paper addresses the application of Geographical Information Systems (GIS), a computerized tool for associating key information by geographical location, to the institutional research function at institutions of higher education. The first section investigates the potential of GIS as an analytical and planning tool for institutional…

  6. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    SciTech Connect

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and drew data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating between
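
    The first of the two algorithms, searching for space-time clusters, can be illustrated with a toy scan: compare each region's recent windowed count against a Poisson baseline and flag improbably high windows. The counts, window length, and threshold below are all invented, and a production scan statistic would be considerably more careful:

      import numpy as np
      from scipy.stats import poisson

      # Hypothetical daily syndromic counts: shape (regions, days).
      rng = np.random.default_rng(1)
      counts = rng.poisson(5, size=(4, 120))
      counts[2, 110:117] += 12  # inject an "outbreak" in region 2

      def flag_clusters(counts, window=7, alpha=1e-3):
          """Flag (region, day) windows whose summed counts are improbably
          high relative to an early-period Poisson baseline."""
          flags = []
          for r, series in enumerate(counts):
              baseline = series[:window * 8].mean()  # early period only
              expected = baseline * window
              for t in range(window, len(series)):
                  observed = series[t - window:t].sum()
                  pval = poisson.sf(observed - 1, expected)  # P(X >= obs)
                  if pval < alpha:
                      flags.append((r, t, observed, pval))
          return flags

      for r, t, obs, p in flag_clusters(counts)[:5]:
          print(f"region {r}, day {t}: {obs} cases in 7 d, p={p:.2e}")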

  7. Ultrafast 2D NMR: an emerging tool in analytical spectroscopy.

    PubMed

    Giraudeau, Patrick; Frydman, Lucio

    2014-01-01

    Two-dimensional nuclear magnetic resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan nature. A decade ago, a so-called ultrafast (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or heteronuclear correlation, in a single scan. During the intervening years, the performance of this subsecond 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool experiencing an expanded scope of applications. This review summarizes the principles and main developments that have contributed to the success of this approach and focuses on applications that have been recently demonstrated in various areas of analytical chemistry--from the real-time monitoring of chemical and biochemical processes, to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  8. A Tool for Medical Research

    NASA Technical Reports Server (NTRS)

    1992-01-01

    California Measurements, Inc.'s PC-2 Aerosol Particle Analyzer, developed by William Chiang, a former Jet Propulsion Laboratory (JPL) engineer, was used in a study to measure the size of particles in the medical environment. Chiang has a NASA license for the JPL crystal oscillator technology and originally built the instrument for atmospheric research. In the operating room, it enabled researchers from the University of California to obtain multiple sets of data repeatedly and accurately. The study concluded that significant amounts of aerosols are generated during surgery when power tools are employed, and most of these are in the respirable size. Almost all contain blood and are small enough to pass through surgical masks. Research on the presence of blood aerosols during oral surgery had similar results. Further studies are planned to determine the possibility of HIV transmission during surgery, and the PC-2H will be used to quantify blood aerosols.

  9. Network Analytical Tool for Monitoring Global Food Safety Highlights China

    PubMed Central

    Nepusz, Tamás; Petróczi, Andrea; Naughton, Declan P.

    2009-01-01

    Background The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles allowing poor opportunities for integration. Methodology/Principal Findings We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 – August 2008 were processed using network analysis to i) capture complexity, ii) analyze trends, and iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships are readily identifiable between countries which are ranked using i) Google's PageRank algorithm and ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. Conclusions/Significance This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios. PMID:19688088
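
    The ranking step described above can be reproduced in outline with standard graph libraries. A minimal sketch using networkx, with made-up countries and alert counts; the edge direction (detector toward transgressor) is an assumption for illustration:

      # Rank countries in a toy alert network with PageRank and HITS.
      import networkx as nx

      G = nx.DiGraph()
      # (detector, transgressor, number of alerts reported); invented data
      alerts = [("Germany", "China", 40), ("UK", "Iran", 25),
                ("Italy", "Turkey", 18), ("Germany", "Iran", 12)]
      G.add_weighted_edges_from(alerts)

      pagerank = nx.pagerank(G, weight="weight")  # overall prominence in the network
      hubs, authorities = nx.hits(G)              # detectors ~ hubs, transgressors ~ authorities

      print("top transgressors by authority:",
            sorted(authorities, key=authorities.get, reverse=True)[:3])
      print("most prominent overall (PageRank):", max(pagerank, key=pagerank.get))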

  10. Analytical tools for single-molecule fluorescence imaging in cellulo.

    PubMed

    Leake, M C

    2014-07-01

    Recent technological advances in cutting-edge ultrasensitive fluorescence microscopy have allowed single-molecule imaging experiments in living cells across all three domains of life to become commonplace. Single-molecule live-cell data is typically obtained in a low signal-to-noise ratio (SNR) regime sometimes only marginally in excess of 1, in which a combination of detector shot noise, sub-optimal probe photophysics, native cell autofluorescence and intrinsically underlying stochasticity of molecules result in highly noisy datasets for which underlying true molecular behaviour is non-trivial to discern. The ability to elucidate real molecular phenomena is essential in relating experimental single-molecule observations to both the biological system under study as well as offering insight into the fine details of the physical and chemical environments of the living cell. To confront this problem of faithful signal extraction and analysis in a noise-dominated regime, the 'needle in a haystack' challenge, such experiments benefit enormously from a suite of objective, automated, high-throughput analysis tools that can home in on the underlying 'molecular signature' and generate meaningful statistics across a large population of individual cells and molecules. Here, I discuss the development and application of several analytical methods applied to real case studies, including objective methods of segmenting cellular images from light microscopy data, tools to robustly localize and track single fluorescently-labelled molecules, algorithms to objectively interpret molecular mobility, analysis protocols to reliably estimate molecular stoichiometry and turnover, and methods to objectively render distributions of molecular parameters.
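
    Of the tasks listed, localizing single fluorophores is the most easily illustrated: a standard approach fits a 2D Gaussian to each diffraction-limited spot to obtain a sub-pixel position. A generic sketch with scipy on a simulated noisy spot (this is an illustration of the approach, not the paper's own software):

      # Sub-pixel localization of one fluorescent spot by 2D Gaussian fitting.
      import numpy as np
      from scipy.optimize import curve_fit

      def gauss2d(xy, amp, x0, y0, sigma, offset):
          x, y = xy
          g = amp * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * sigma**2)) + offset
          return g.ravel()

      rng = np.random.default_rng(1)
      y, x = np.mgrid[0:15, 0:15]                        # 15x15-pixel region of interest
      truth = gauss2d((x, y), 50, 7.3, 6.8, 1.4, 10).reshape(15, 15)
      img = rng.poisson(truth)                           # shot-noise-limited image

      p0 = [img.max() - img.min(), 7, 7, 2, img.min()]   # crude initial guess
      popt, _ = curve_fit(gauss2d, (x, y), img.ravel(), p0=p0)
      print(f"estimated centre: ({popt[1]:.2f}, {popt[2]:.2f}) px")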

  11. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

    Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  12. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several currently available tools, such as EASI and SAPE, can be used instantly; however, for our research purposes it was more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.
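
    A minimal sketch of the network idea, assuming a hypothetical facility layout: each protection element becomes an edge with a detection probability, and the most critical (weakest) path is the one maximizing the adversary's chance of reaching the target undetected, found as a shortest path on -log(1 - Pd) edge weights:

      # Most critical adversary path in a toy facility graph (invented layout).
      import math
      import networkx as nx

      G = nx.DiGraph()
      elements = [("outside", "fence", 0.3), ("fence", "door", 0.5),
                  ("outside", "gate", 0.6), ("gate", "door", 0.2),
                  ("door", "vault", 0.8)]          # (from, to, detection probability)
      for u, v, pd in elements:
          G.add_edge(u, v, pd=pd, w=-math.log(1.0 - pd))

      path = nx.shortest_path(G, "outside", "vault", weight="w")
      p_miss = math.prod(1 - G[u][v]["pd"] for u, v in zip(path, path[1:]))
      print(path, f"P(detected at least once) = {1 - p_miss:.3f}")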

  13. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    PubMed Central

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893
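
    Alongside XCluSim's visual comparison, the agreement between any two clustering results can also be summarized by a single number such as the adjusted Rand index. A small scikit-learn sketch on synthetic data (stand-ins for expression profiles; not part of XCluSim itself):

      # Numeric agreement between two clusterings of the same items.
      from sklearn.cluster import AgglomerativeClustering, KMeans
      from sklearn.datasets import make_blobs
      from sklearn.metrics import adjusted_rand_score

      X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
      labels_a = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
      labels_b = AgglomerativeClustering(n_clusters=4).fit_predict(X)

      print(f"ARI = {adjusted_rand_score(labels_a, labels_b):.3f}")  # 1.0 = identical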

  14. Scalable combinatorial tools for health disparities research.

    PubMed

    Langston, Michael A; Levine, Robert S; Kilbourne, Barbara J; Rogers, Gary L; Kershenbaum, Anne D; Baktash, Suzanne H; Coughlin, Steven S; Saxton, Arnold M; Agboto, Vincent K; Hood, Darryl B; Litchveld, Maureen Y; Oyana, Tonny J; Matthews-Juarez, Patricia; Juarez, Paul D

    2014-01-01

    Despite staggering investments made in unraveling the human genome, current estimates suggest that as much as 90% of the variance in cancer and chronic diseases can be attributed to factors outside an individual's genetic endowment, particularly to environmental exposures experienced across his or her life course. New analytical approaches are clearly required as investigators turn to complicated systems theory and ecological, place-based and life-history perspectives in order to understand more clearly the relationships between social determinants, environmental exposures and health disparities. While traditional data analysis techniques remain foundational to health disparities research, they are easily overwhelmed by the ever-increasing size and heterogeneity of available data needed to illuminate latent gene x environment interactions. This has prompted the adaptation and application of scalable combinatorial methods, many from genome science research, to the study of population health. Most of these powerful tools are algorithmically sophisticated, highly automated and mathematically abstract. Their utility motivates the main theme of this paper, which is to describe real applications of innovative transdisciplinary models and analyses in an effort to help move the research community closer toward identifying the causal mechanisms and associated environmental contexts underlying health disparities. The public health exposome is used as a contemporary focus for addressing the complex nature of this subject. PMID:25310540

  16. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and... 33 Navigation and Navigable Waters 3 2011-07-01 2011-07-01 false Revisions to models and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In...

  17. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and process analytical technology (PAT). PMID:25722723

  19. MHK Research, Tools, and Methods

    SciTech Connect

    Jepsen, Richard

    2011-11-02

    Presentation from the 2011 Water Peer Review in which the principal investigator discusses improved testing, analysis, and design tools needed to more accurately model operational conditions, optimize design parameters, and predict technology viability.

  20. Academic Analytics: A New Tool for a New Era

    ERIC Educational Resources Information Center

    Campbell, John P.; DeBlois, Peter B.; Oblinger, Diana G.

    2007-01-01

    In responding to internal and external pressures for accountability in higher education, especially in the areas of improved learning outcomes and student success, IT leaders may soon become critical partners with academic and student affairs. IT can help answer this call for accountability through "academic analytics," which is emerging as a new…

  1. Single cell analytic tools for drug discovery and development

    PubMed Central

    Heath, James R.; Ribas, Antoni; Mischel, Paul S.

    2016-01-01

    The genetic, functional, or compositional heterogeneity of healthy and diseased tissues presents major challenges in drug discovery and development [1-3]. In cancers, heterogeneity may be essential for tumor stability [4], but its precise role in tumor biology is poorly resolved. This challenges the design of accurate disease models for use in drug development, and can confound the interpretation of biomarker levels, and of patient responses to specific therapies. The complex nature of heterogeneous tissues has motivated the development of tools for single cell genomic, transcriptomic, and multiplex proteomic analysis. We review these tools, assess their advantages and limitations, and explore their potential applications in drug discovery and development. PMID:26669673

  2. ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS (ATTILA) ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape metrics, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, waters...

  3. Observatory Bibliographies as Research Tools

    NASA Astrophysics Data System (ADS)

    Rots, Arnold H.; Winkelman, S. L.

    2013-01-01

    Traditionally, observatory bibliographies were maintained to provide insight into how successful an observatory is, as measured by its prominence in the (refereed) literature. When we set up the bibliographic database for the Chandra X-ray Observatory (http://cxc.harvard.edu/cgi-gen/cda/bibliography) as part of the Chandra Data Archive (http://cxc.harvard.edu/cda/), very early in the mission, our objective was to make it primarily a useful tool for our user community. To achieve this, we are: (1) casting a very wide net in collecting Chandra-related publications; (2) including for each literature reference in the database a wealth of metadata that is useful for the users; and (3) providing specific links between the articles and the datasets in the archive that they use. As a result, our users are able to browse the literature and the data archive simultaneously. As an added bonus, the rich metadata content and data links have also allowed us to assemble more meaningful statistics about the scientific efficacy of the observatory. In all this we collaborate closely with the Astrophysics Data System (ADS). Among the plans for future enhancement are the inclusion of press releases and the Chandra image gallery, linking with ADS semantic searching tools, full-text metadata mining, and linking with other observatories' bibliographies. This work is supported by NASA contract NAS8-03060 (CXC) and depends critically on the services provided by the ADS.

  4. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  5. Blogging as a Research Tool

    NASA Astrophysics Data System (ADS)

    Sweetser, Douglas

    2011-11-01

    I work on variations of the Maxwell Lagrange density using quaternions and hypercomplex products of covariant 4-derivatives and 4-potentials. The hope is to unify gravity with the symmetries found in the standard model. It is difficult for someone outside academia to get constructive criticism. I have chosen to blog once a week at Science20.com since March 2011. Over thirty blogs have been generated, most getting more than a thousand views (the high mark is 5k for "Why Quantum Mechanics is Wierd"). The tools used for web and video blogging will be reviewed. A discussion of my efforts to represent electroweak symmetry with quaternions convinced me I was in error. Instead, my hope is to exploit the observation that U(1) is formally a subgroup of SU(2). A battle over gauge symmetry may be reviewed.

  6. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  7. Galileo's Discorsi as a Tool for the Analytical Art.

    PubMed

    Raphael, Renee Jennifer

    2015-01-01

    A heretofore overlooked response to Galileo's 1638 Discorsi is described by examining two extant copies of the text (one of which has received little attention in the historiography, the other apparently unknown) which are heavily annotated. It is first demonstrated that these copies contain annotations made by Seth Ward and Sir Christopher Wren. This article then examines one feature of Ward's and Wren's responses to the Discorsi, namely their decision to re-write several of Galileo's geometrical demonstrations into the language of symbolic algebra. It is argued that this type of active reading of period mathematical texts may have been part of the regular scholarly and pedagogical practices of early modern British mathematicians like Ward and Wren. A set of Appendices contains a transcription and translation of the analytical solutions found in these annotated copies.

  8. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    A brief description of many of the widely used tools is presented. Of this list, those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology, along with being truly surface sensitive (that is, less than 10 atomic layers), are presented. The latter group is critiqued in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under real conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  9. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    Many of the currently, widely used tools available for surface analysis are described. Those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology and are truly surface sensitive (that is, less than 10 atomic layers) are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  10. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, Detlef P.; Lucas, Paul L.; Häyhä, Tiina; Cornell, Sarah E.; Stafford-Smith, Mark

    2016-03-01

    There is a need for more integrated research on sustainable development and global environmental change. In this paper, we focus on the planetary boundaries framework to provide a systematic categorization of key research questions in relation to avoiding severe global environmental degradation. The four categories of key questions are those that relate to (1) the underlying processes and selection of key indicators for planetary boundaries, (2) understanding the impacts of environmental pressure and connections between different types of impacts, (3) better understanding of different response strategies to avoid further degradation, and (4) the available instruments to implement such strategies. Clearly, different categories of scientific disciplines and associated model types exist that can accommodate answering these questions. We identify the strengths and weaknesses of different research areas in relation to the question categories, focusing specifically on different types of models. We argue that more interdisciplinary research is needed to increase our understanding by better linking human drivers and social and biophysical impacts. This requires better collaboration between relevant disciplines (associated with the model types), either by exchanging information or by fully linking or integrating them. As fully integrated models can become too complex, the appropriate type of model (the racehorse) should be applied for answering the target research question (the race course).

  11. Process models: analytical tools for managing industrial energy systems

    SciTech Connect

    Howe, S O; Pilati, D A; Balzer, C; Sparrow, F T

    1980-01-01

    How the process models developed at BNL are used to analyze industrial energy systems is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for managing industrial energy systems.

  12. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, D. P.; Lucas, P. L.; Häyhä, T.; Cornell, S. E.; Stafford-Smith, M.

    2015-09-01

    There is a need for further integrated research on developing a set of sustainable development objectives, based on the proposed framework of planetary boundaries indicators. The relevant research questions are divided in this paper into four key categories: the underlying processes and selection of key indicators; understanding the impacts of different exposure levels and the influence of connections between different types of impacts; a better understanding of different response strategies; and the available options to implement changes. Clearly, different categories of scientific disciplines and associated models exist that can contribute to the necessary analysis, noting that the distinctions between them are fuzzy. In the paper, we indicate both how different models relate to the four categories of questions and how further insights can be obtained by connecting the different disciplines (without necessarily fully integrating them). Research on integration can support planetary boundary quantification in a credible way, linking human drivers and social and biophysical impacts.

  13. Analytical and Semi-Analytical Tools for the Design of Oscillatory Pumping Tests.

    PubMed

    Cardiff, Michael; Barrash, Warren

    2015-01-01

    Oscillatory pumping tests, in which flow is varied in a periodic fashion, provide a method for understanding aquifer heterogeneity that is complementary to strategies such as slug testing and constant-rate pumping tests. During oscillatory testing, pressure data collected at non-pumping wells can be processed to extract metrics, such as signal amplitude and phase lag, from a time series. These metrics are robust against common sensor problems (including drift and noise) and have been shown to provide information about aquifer heterogeneity. Field implementations of oscillatory pumping tests for characterization, however, are not common and thus there are few guidelines for their design and implementation. Here, we use available analytical solutions from the literature to develop design guidelines for oscillatory pumping tests, while considering practical field constraints. We present two key analytical results for design and analysis of oscillatory pumping tests. First, we provide methods for choosing testing frequencies and flow rates which maximize the signal amplitude that can be expected at a distance from an oscillating pumping well, given design constraints such as maximum/minimum oscillator frequency and maximum volume cycled. Preliminary data from field testing helps to validate the methodology. Second, we develop a semi-analytical method for computing the sensitivity of oscillatory signals to spatially distributed aquifer flow parameters. This method can be quickly applied to understand the "sensed" extent of an aquifer at a given testing frequency. Both results can be applied given only bulk aquifer parameter estimates, and can help to optimize design of oscillatory pumping test campaigns. PMID:25535805
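
    The first design result (the amplitude to expect at an observation well) follows from the standard steady-periodic solution for a sinusoidally pumped well in a confined aquifer, |s| = (Q0 / (2 pi T)) |K0(r sqrt(i omega S / T))|. A sketch assuming that textbook solution, with invented aquifer parameters; scipy's Kelvin functions ker and kei give the real and imaginary parts of K0 at argument x sqrt(i):

      # Drawdown amplitude vs. distance for an oscillatory pumping test.
      import numpy as np
      from scipy.special import kei, ker

      T = 1e-3                    # transmissivity [m^2/s] (hypothetical)
      S = 1e-4                    # storativity [-] (hypothetical)
      Q0 = 1e-4                   # peak oscillatory pumping rate [m^3/s]
      omega = 2 * np.pi / 60.0    # frequency for a 60 s period [rad/s]

      r = np.array([1.0, 5.0, 10.0])             # observation distances [m]
      x = r * np.sqrt(omega * S / T)
      amp = Q0 / (2 * np.pi * T) * np.hypot(ker(x), kei(x))
      for ri, ai in zip(r, amp):
          print(f"r = {ri:4.1f} m -> amplitude ~ {ai:.4f} m")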

  14. categoryCompare, an analytical tool based on feature annotations

    PubMed Central

    Flight, Robert M.; Harrison, Benjamin J.; Mohammad, Fahim; Bunge, Mary B.; Moon, Lawrence D. F.; Petruska, Jeffrey C.; Rouchka, Eric C.

    2014-01-01

    Assessment of high-throughput omics data initially focuses on relative or raw levels of a particular feature, such as an expression value for a transcript, protein, or metabolite. At a second level, analyses of annotations, including known or predicted functions and associations of each individual feature, attempt to distill biological context. Most currently available comparative and meta-analysis methods are dependent on the availability of identical features across data sets, and concentrate on determining features that are differentially expressed across experiments, some of which may be considered “biomarkers.” The heterogeneity of measurement platforms and inherent variability of biological systems confounds the search for robust biomarkers indicative of a particular condition. In many instances, however, multiple data sets show involvement of common biological processes or signaling pathways, even though individual features are not commonly measured or differentially expressed between them. We developed a methodology, categoryCompare, for cross-platform and cross-sample comparison of high-throughput data at the annotation level. We assessed the utility of the approach using hypothetical data, as well as determining similarities and differences in the set of processes in two instances: (1) denervated skin vs. denervated muscle, and (2) colon from Crohn's disease vs. colon from ulcerative colitis (UC). The hypothetical data showed that in many cases comparing annotations gave superior results to comparing only at the gene level. Improved analytical results depended as well on the number of genes included in the annotation term, the amount of noise in relation to the number of genes expressing in unenriched annotation categories, and the specific method in which samples are combined. In the skin vs. muscle denervation comparison, the tissues demonstrated markedly different responses. The Crohn's vs. UC comparison showed gross similarities in inflammatory
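
    The essential move, comparing experiments by their enriched annotation categories rather than by shared genes, can be sketched with a hypergeometric enrichment test. All gene counts and category names below are invented for illustration and are not from the categoryCompare package:

      # Annotation-level comparison of two experiments (toy version of the idea).
      from scipy.stats import hypergeom

      universe = 20000                    # genes measured on both platforms
      categories = {"inflammation": 400, "axon_regrowth": 150, "lipid_metabolism": 300}
      hits = {                            # differentially expressed genes per experiment
          "skin":   {"inflammation": 35, "axon_regrowth": 12, "lipid_metabolism": 5,  "n": 800},
          "muscle": {"inflammation": 32, "axon_regrowth": 3,  "lipid_metabolism": 24, "n": 900},
      }

      def enriched(exp, alpha=0.01):
          out = set()
          for cat, K in categories.items():
              k, n_de = hits[exp][cat], hits[exp]["n"]
              if hypergeom.sf(k - 1, universe, K, n_de) < alpha:   # P(X >= k)
                  out.add(cat)
          return out

      print("enriched in both:", enriched("skin") & enriched("muscle"))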

  15. A collaborative visual analytics suite for protein folding research.

    PubMed

    Harvey, William; Park, In-Hee; Rübel, Oliver; Pascucci, Valerio; Bremer, Peer-Timo; Li, Chenglong; Wang, Yusu

    2014-09-01

    Molecular dynamics (MD) simulation is a crucial tool for understanding principles behind important biochemical processes such as protein folding and molecular interaction. With the rapidly increasing power of modern computers, large-scale MD simulation experiments can be performed regularly, generating huge amounts of MD data. An important question is how to analyze and interpret such massive and complex data. One of the (many) challenges involved in analyzing MD simulation data computationally is the high-dimensionality of such data. Given a massive collection of molecular conformations, researchers typically need to rely on their expertise and prior domain knowledge in order to retrieve certain conformations of interest. It is not easy to make and test hypotheses as the data set as a whole is somewhat "invisible" due to its high dimensionality. In other words, it is hard to directly access and examine individual conformations from a sea of molecular structures, and to further explore the entire data set. There is also no easy and convenient way to obtain a global view of the data or its various modalities of biochemical information. To this end, we present an interactive, collaborative visual analytics tool for exploring massive, high-dimensional molecular dynamics simulation data sets. The most important utility of our tool is to provide a platform where researchers can easily and effectively navigate through the otherwise "invisible" simulation data sets, exploring and examining molecular conformations both as a whole and at individual levels. The visualization is based on the concept of a topological landscape, which is a 2D terrain metaphor preserving certain topological and geometric properties of the high dimensional protein energy landscape. In addition to facilitating easy exploration of conformations, this 2D terrain metaphor also provides a platform where researchers can visualize and analyze various properties (such as contact density) overlayed on the

  16. Parallel software tools at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Tennille, Geoffrey M.; Lakeotes, Christopher D.; Randall, Donald P.; Arthur, Jarvis J.; Hammond, Dana P.; Mall, Gerald H.

    1993-01-01

    This document gives a brief overview of parallel software tools available on the Intel iPSC/860 parallel computer at Langley Research Center. It is intended to provide a source of information that is somewhat more concise than vendor-supplied material on the purpose and use of various tools. Each of the chapters on tools is organized in a similar manner covering an overview of the functionality, access information, how to effectively use the tool, observations about the tool and how it compares to similar software, known problems or shortfalls with the software, and reference documentation. It is primarily intended for users of the iPSC/860 at Langley Research Center and is appropriate for both the experienced and novice user.

  17. Research as an educational tool

    SciTech Connect

    Neff, R.; Perlmutter, D.; Klaczynski, P.

    1994-12-31

    Our students have participated in original group research projects focused on the natural environment which culminate in a written manuscript published in-house, and an oral presentation to peers, faculty, and the university community. Our goal has been to develop their critical thinking skills so that they will be more successful in high school and college. We have served ninety-three students (47.1% white, 44.1% black, 5.4% hispanic, 2.2% American Indian, 1.2% asian) from an eight state region in the southeast over the past three years. Thirty-one students have graduated from high school with over 70% enrolled in college and another thirty-four are seniors this year. We are tracking students' progress in college and are developing our own critical thinking test to measure the impact of our program. Although preliminary, the results from the critical thinking test indicated that students are often prone to logical errors; however, higher levels of critical thinking were observed on items which raised issues that conflicted with students' pre-existing beliefs.

  18. Quality management system for application of the analytical quality assurance cycle in a research project

    NASA Astrophysics Data System (ADS)

    Camargo, R. S.; Olivares, I. R. B.

    2016-07-01

    The lack of quality assurance and quality control in academic activities has been highlighted by the inability to demonstrate reproducibility. This paper aims to apply a quality tool called the Analytical Quality Assurance Cycle to a specific research project, supported by a verification programme for equipment and an adapted Quality Management System based on international standards, in order to provide traceability for the data generated.

  19. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    SciTech Connect

    Preisler, Jan

    1997-01-01

    In this work, we use lasers to enhance the possibilities of laser desorption methods and to optimize the coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, a 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan plumes desorbed at atmospheric pressure via absorption. All absorbing species, including neutral molecules, are monitored. Interesting features, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot, are observed. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals the negative influence of particle spallation on the MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of the insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In the CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone along the length of the capillary, excited by a 488-nm Ar-ion laser. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7. The increase of p

  20. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    PubMed Central

    2012-01-01

    Background There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working on public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods EPIPOI is freely available software developed in Matlab (The Mathworks Inc) that runs both on PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses including the comparison of spatial patterns in temporal parameters. Results EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts from didactic use in public health workshops to the main analytical tool in published research. Conclusions EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit by enabling a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics. PMID:23153033
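
    The trend and seasonality parameters EPIPOI extracts can be approximated by ordinary least squares on a linear trend plus annual and semiannual harmonics. A sketch on simulated weekly incidence data (this mimics the general approach, not EPIPOI's own code):

      # Harmonic-regression decomposition of a weekly incidence series.
      import numpy as np

      rng = np.random.default_rng(2)
      t = np.arange(520) / 52.0                     # 10 years of weekly data, in years
      series = (5 + 0.2 * t + 3 * np.sin(2 * np.pi * t + 0.8)
                + np.sin(4 * np.pi * t) + rng.normal(0, 0.5, t.size))

      X = np.column_stack([np.ones_like(t), t,
                           np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
                           np.sin(4 * np.pi * t), np.cos(4 * np.pi * t)])
      beta, *_ = np.linalg.lstsq(X, series, rcond=None)

      amp = np.hypot(beta[2], beta[3])                        # annual amplitude
      phi = np.arctan2(beta[3], beta[2])                      # annual phase
      peak_week = ((0.25 - phi / (2 * np.pi)) % 1.0) * 52     # timing of seasonal peak
      print(f"trend {beta[1]:.2f}/yr, annual amplitude {amp:.2f}, peak ~ week {peak_week:.0f}")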

  1. Bringing Research Tools into the Classroom

    ERIC Educational Resources Information Center

    Shubert, Charles; Ceraj, Ivica; Riley, Justin

    2009-01-01

    The advancement of computer technology used for research is creating the need to change the way classes are taught in higher education. "Bringing Research Tools into the Classroom" has become a major focus of the work of the Office of Educational Innovation and Technology (OEIT) for the Dean of Undergraduate Education (DUE) at the Massachusetts…

  2. The RESET tephra database and associated analytical tools

    NASA Astrophysics Data System (ADS)

    Bronk Ramsey, Christopher; Housley, Rupert A.; Lane, Christine S.; Smith, Victoria C.; Pollard, A. Mark

    2015-06-01

    An open-access database has been set up to support the research project studying the 'Response of Humans to Abrupt Environmental Transitions' (RESET). The main methodology underlying this project was to use tephra layers to tie together and synchronise the chronologies of stratigraphic records at archaeological and environmental sites. The database has information on occurrences, and chemical compositions, of glass shards from tephra and cryptotephra deposits found across Europe. The data includes both information from the RESET project itself and from the published literature. With over 12,000 major element analyses and over 3000 trace element analyses on glass shards, relevant to 80 late Quaternary eruptions, the RESET project has generated an important archive of data. When added to the published information, the database described here has a total of more than 22,000 major element analyses and nearly 4000 trace element analyses on glass from over 240 eruptions. In addition to the database and its associated data, new methods of data analysis for assessing correlations have been developed as part of the project. In particular an approach using multi-dimensional kernel density estimates to evaluate the likelihood of tephra compositions matching is described here and tested on data generated as part of the RESET project.
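
    The kernel-density matching idea can be sketched in a few lines: build a multivariate KDE over a reference eruption's glass-shard compositions and ask where a candidate shard's density falls relative to the reference population. The compositions below are synthetic, use only two oxides, and are not RESET data:

      # KDE-based scoring of candidate shards against one reference eruption.
      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(3)
      # reference eruption: 60 shards of (SiO2, K2O) in wt%, correlated scatter
      ref = rng.multivariate_normal([73.5, 4.1], [[0.20, 0.02], [0.02, 0.05]], size=60)

      kde = gaussian_kde(ref.T)                     # density model of the reference
      candidates = np.array([[73.3, 4.0],           # plausible match
                             [70.1, 2.9]]).T        # compositional outlier
      ref_dens = kde(ref.T)

      for dens in kde(candidates):
          frac = (ref_dens < dens).mean()           # rank vs. reference densities
          print(f"density {dens:.3f}: exceeds {frac:.0%} of reference shards")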

  3. The use of meta-analytical tools in risk assessment for food safety.

    PubMed

    Gonzales-Barron, Ursula; Butler, Francis

    2011-06-01

    This communication deals with the use of meta-analysis as a valuable tool for the synthesis of food safety research and in quantitative risk assessment modelling. A common methodology for conducting meta-analysis (i.e., systematic review and data extraction, parameterisation of effect size, estimation of overall effect size, assessment of heterogeneity, and presentation of results) is explained by reviewing two meta-analyses derived from separate sets of primary studies of Salmonella in pork. Integrating different primary studies, the first meta-analysis elucidated for the first time a relationship between the proportion of Salmonella-carrier slaughter pigs entering the slaughter lines and the resulting proportion of contaminated carcasses at the point of evisceration, a finding that the individual studies on their own could not reveal. On the other hand, the second application showed that meta-analysis can be used to estimate the overall effect of a critical process stage (chilling) on the incidence of the pathogen under study. The derived relationship between variables and the probabilistic distribution are illustrations of the valuable quantitative information synthesised by the meta-analytical tools, which can be incorporated into risk assessment modelling. Strengths and weaknesses of meta-analysis within the context of food safety are also discussed.
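
    The "overall effect size" and "heterogeneity" steps mentioned above are commonly implemented with a random-effects model. A compact sketch of the DerSimonian-Laird estimator on invented per-study effect sizes (not the studies analyzed in the paper):

      # Random-effects pooling of study effects (DerSimonian-Laird).
      import numpy as np

      y = np.array([0.42, 0.58, 0.31, 0.66, 0.49])   # per-study effect sizes (invented)
      v = np.array([0.02, 0.05, 0.03, 0.04, 0.06])   # within-study variances (invented)

      w = 1 / v
      mu_fixed = np.sum(w * y) / np.sum(w)
      q = np.sum(w * (y - mu_fixed)**2)                        # Cochran's Q
      c = np.sum(w) - np.sum(w**2) / np.sum(w)
      tau2 = max(0.0, (q - (len(y) - 1)) / c)                  # between-study variance

      w_star = 1 / (v + tau2)
      mu = np.sum(w_star * y) / np.sum(w_star)
      se = np.sqrt(1 / np.sum(w_star))
      print(f"pooled effect {mu:.3f} +/- {1.96 * se:.3f} (95% CI), tau^2 = {tau2:.3f}")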

  4. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    USGS Publications Warehouse

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.
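
    The "relative importance of information" assessed in such case studies is often quantified as the expected value of perfect information (EVPI): how much better the optimal decision would do, on average, if a key uncertainty were resolved before acting. A toy numeric sketch with invented payoffs (not the actual LCC analyses):

      # Expected value of perfect information, two actions x two hypotheses.
      import numpy as np

      payoff = np.array([[10.0, 2.0],    # action A under hypotheses H1, H2 (invented)
                         [6.0,  7.0]])   # action B
      p = np.array([0.5, 0.5])           # current belief over the hypotheses

      ev_current = (payoff @ p).max()                  # act now, under uncertainty
      ev_perfect = (payoff.max(axis=0) * p).sum()      # learn the truth, then act
      print(f"EVPI = {ev_perfect - ev_current:.2f}")   # worth of resolving uncertainty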

  5. New Software Framework to Share Research Tools

    NASA Astrophysics Data System (ADS)

    Milner, Kevin; Becker, Thorsten W.; Boschi, Lapo; Sain, Jared; Schorlemmer, Danijel; Waterhouse, Hannah

    2009-03-01

    Solid Earth Teaching and Research Environment (SEATREE) is a modular and user-friendly software to facilitate the use of solid Earth research tools in the classroom and for interdisciplinary research collaboration. The software provides a stand-alone open-source package that allows users to operate in a “black box” mode, which hides implementation details, while also allowing them to dig deeper into the underlying source code. The overlying user interfaces are written in the Python programming language using a modern, object-oriented design, including graphical user interactions. SEATREE, which provides an interface to a range of new and existing lower level programs that can be written in any computer programming language, may in the long run contribute to new ways of sharing scientific research. By sharing both data and modeling tools in a consistent framework, published (numerical) experiments can be made truly reproducible again.

  6. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 3 2014-07-01 2014-07-01 false Revisions to models and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... Management District, and other non-Federal sponsors shall rely on the best available science including...

  7. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced: a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  8. The Use of Economic Analytical Tools in Quantifying and Measuring Educational Benefits and Costs.

    ERIC Educational Resources Information Center

    Holleman, I. Thomas, Jr.

    The general objective of this study was to devise quantitative guidelines that school officials can accurately follow in using benefit-cost analysis, cost-effectiveness analysis, ratio analysis, and other similar economic analytical tools in their particular local situations. Specifically, the objectives were to determine guidelines for the…

  9. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    ERIC Educational Resources Information Center

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  10. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less known techniques that may also prove useful.

  11. Measuring the Bright Side of Being Blue: A New Tool for Assessing Analytical Rumination in Depression

    PubMed Central

    Barbic, Skye P.; Durisko, Zachary; Andrews, Paul W.

    2014-01-01

    Background Diagnosis and management of depression occurs frequently in the primary care setting. Current diagnostic and treatment management practices across clinical populations focus on eliminating signs and symptoms of depression. However, there is debate that some interventions may pathologize normal, adaptive responses to stressors. Analytical rumination (AR) is an example of an adaptive response of depression that is characterized by enhanced cognitive function to help an individual focus on, analyze, and solve problems. To date, research on AR has been hampered by the lack of theoretically-derived and psychometrically sound instruments. This study developed and tested a clinically meaningful measure of AR. Methods Using expert panels and an extensive literature review, we developed a conceptual framework for AR and 22 candidate items. Items were field tested with 579 young adults, 140 of whom completed the items at a second time point. We used Rasch measurement methods to construct and test the item set, and traditional psychometric analyses to compare items to existing rating scales. Results Data were high quality (<1% missing; high reliability: Cronbach's alpha = 0.92, test-retest intraclass correlations >0.81; evidence for divergent validity). Evidence of misfit for 2 items suggested that a 20-item scale with 4-point response categories best captured the concept of AR, fitting the Rasch model (χ2 = 95.26; df = 76, p = 0.07), with high reliability (rp = 0.86), ordered response scale structure, and no item bias (gender, age, time). Conclusion Our study provides evidence for a 20-item Analytical Rumination Questionnaire (ARQ) that can be used to quantify AR in adults who experience symptoms of depression. The ARQ is psychometrically robust and a clinically useful tool for the assessment and improvement of depression in the primary care setting. Future work is needed to establish the validity of this measure in people with major

  12. Designing a Collaborative Visual Analytics Tool for Social and Technological Change Prediction.

    SciTech Connect

    Wong, Pak C.; Leung, Lai-Yung R.; Lu, Ning; Scott, Michael J.; Mackey, Patrick S.; Foote, Harlan P.; Correia, James; Taylor, Zachary T.; Xu, Jianhua; Unwin, Stephen D.; Sanfilippo, Antonio P.

    2009-09-01

    We describe our ongoing efforts to design and develop a collaborative visual analytics tool to interactively model social and technological change of our society in a future setting. The work involves an interdisciplinary team of scientists from atmospheric physics, electrical engineering, building engineering, social sciences, economics, public policy, and national security. The goal of the collaborative tool is to predict the impact of global climate change on the U.S. power grids and its implications for society and national security. These future scenarios provide critical assessment and information necessary for policymakers and stakeholders to help formulate a coherent, unified strategy toward shaping a safe and secure society. The paper introduces the problem background and related work, explains the motivation and rationale behind our design approach, presents our collaborative visual analytics tool and usage examples, and finally shares the development challenge and lessons learned from our investigation.

  13. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first one with metrological variables and the second one with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
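
    At the core of the HDT is a simple dominance relation: one procedure sits above another only if it is at least as good on every criterion and strictly better on at least one; pairs with no such relation remain incomparable, which is what makes the order partial. A toy sketch (procedure names and scores invented, all criteria scaled so that larger is better):

      # Dominance relation underlying a Hasse diagram.
      procedures = {
          "GC-MS":       (0.9, 0.6, 0.4),   # (accuracy, greenness, speed)
          "GC-MS (old)": (0.8, 0.5, 0.4),
          "HPLC-FLD":    (0.8, 0.7, 0.6),
          "ELISA":       (0.6, 0.9, 0.9),
      }

      def dominates(a, b):
          return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

      for na, va in procedures.items():
          for nb, vb in procedures.items():
              if na != nb and dominates(va, vb):
                  print(f"{na} dominates {nb}")
      # pairs never printed in either direction are incomparable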

  14. Operations other than war: Requirements for analysis tools research report

    SciTech Connect

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  15. Oceanographic visualization interactive research tool (OVIRT)

    NASA Astrophysics Data System (ADS)

    Moorhead, Robert J., II; Hamann, Bernd; Everitt, Cass; Jones, Stephen C.; McAllister, Joel; Barlow, Jonathan

    1994-04-01

    The Oceanographic Visualization Interactive Research Tool (OVIRT) was developed to explore the utility of scalar field volume rendering in visualizing environmental ocean data and to extend some of the classical 2D oceanographic displays into a 3D visualization environment. It has five major visualization tools: cutting planes, minicubes, isosurfaces, sonic-surfaces, and direct volume rendering. The cutting planes tool provides three orthogonal cutting planes which can be interactively moved through the volume. The minicubes routine renders small cubes whose faces are shaded according to function value. The isosurface tool is conceptually similar to the well-known marching cubes technique. The sonic surfaces are an extension of the 2D surfaces which have been classically used to display acoustic propagation paths and inflection lines within the ocean. The surfaces delineate the extent and axis of the shallow and deep sound channels. The direct volume rendering (DVR) techniques give a global view of the data. Other features include the ability to overlay the shoreline and inlay the bathymetry. There are multiple colormaps, an automatic histogramming feature, a macro and scripting capability, a picking function, and the ability to display animations of DVR imagery. There is a network feature to allow computationally expensive functions to be executed on remote machines.
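
    The cutting-planes and histogram features map directly onto array operations over a gridded scalar field. A minimal numpy sketch of those two operations, assuming a regular (x, y, z) grid; the names and synthetic field are illustrative, not OVIRT's interface:

    ```python
    import numpy as np

    # Synthetic ocean scalar field, e.g., temperature on a regular (x, y, z) grid.
    nx, ny, nz = 64, 64, 32
    x, y, z = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny),
                          np.linspace(0, 1, nz), indexing="ij")
    temp = 20.0 - 15.0 * z + 2.0 * np.sin(4 * x) * np.cos(3 * y)  # warm surface, cold deep

    def cutting_plane(vol, axis, index):
        """Orthogonal cutting plane: a 2D slice normal to the given axis."""
        return np.take(vol, index, axis=axis)

    plane = cutting_plane(temp, axis=2, index=nz // 2)   # mid-depth slice
    print("plane shape:", plane.shape, "mean:", plane.mean().round(2))

    # 'Automatic histogramming': bin the full volume, e.g., to drive a colormap.
    counts, edges = np.histogram(temp, bins=16)
    print("modal bin:", edges[counts.argmax()].round(2), "-",
          edges[counts.argmax() + 1].round(2))
    ```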

  16. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    NASA Astrophysics Data System (ADS)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2016-06-01

    Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning with cubic boron nitride (CBN) tools have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces from oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear are covered in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that the appropriate model can be selected according to user requirements in hard turning.
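
    Of the models reviewed, Usui's wear equation has a compact closed form, commonly written as dW/dt = A·σt·Vs·exp(−B/T), with σt the normal stress on the tool face, Vs the sliding velocity, T the absolute interface temperature, and A, B empirical constants. A worked sketch with illustrative (not fitted) constants, showing the exponential temperature sensitivity:

    ```python
    import numpy as np

    A, B = 1.0e-8, 5.0e3           # empirical constants (illustrative only)

    def usui_wear_rate(sigma_t, v_s, T):
        """Usui wear rate dW/dt = A * sigma_t * v_s * exp(-B / T).
        sigma_t: normal stress [Pa], v_s: sliding velocity [m/s], T: temperature [K]."""
        return A * sigma_t * v_s * np.exp(-B / T)

    sigma_t, v_s = 1.2e9, 2.5       # hard turning: high stress, moderate sliding speed
    for T in (900.0, 1100.0, 1300.0):
        print(f"T = {T:.0f} K -> wear rate = {usui_wear_rate(sigma_t, v_s, T):.3e}")
    ```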

  17. Group analytic psychotherapy (im)possibilities to research

    PubMed Central

    Vlastelica, Mirela

    2011-01-01

    In the course of group analytic psychotherapy, as we discovered the power of its therapeutic effects, the need arose to research group analytic psychotherapy itself. Psychotherapeutic work in general, and group psychotherapy in particular, are hard to measure and place within objective frames. Research, i.e., the measurement of change in psychotherapy, is a complex task, and there is considerable disagreement about it. For a long time, the empirical-descriptive method was the only way of conducting research in the field of group psychotherapy. Problems of research in group psychotherapy in general, and in group analytic psychotherapy in particular, can be viewed first of all as problems of methodology, especially due to the unrepeatability of the therapeutic process. The basic polemic about measuring change in psychotherapy is whether change should be measured through overt behaviour or evaluated more finely by monitoring inner psychological dimensions. Following up therapy results, besides providing additional information on the patient's improvement, strengthens the psychotherapist's self-respect, as well as his respectability and credibility as a scientist. PMID:25478094

  18. METABOLOMICS AS A DIAGNOSTIC TOOL FOR SMALL FISH TOXICOLOGY RESEARCH

    EPA Science Inventory

    Metabolomics involves the application of advanced analytical and statistical tools to profile changes in levels of endogenous metabolites in tissues and biofluids resulting from disease onset or stress. While certain metabolites are being specifically targeted in these studies, w...

  19. An Analytical Thermal Model for Autonomous Soaring Research

    NASA Technical Reports Server (NTRS)

    Allen, Michael

    2006-01-01

    A viewgraph presentation describing an analytical thermal model used to enable research on autonomous soaring for a small UAV aircraft is given. The topics include: 1) Purpose; 2) Approach; 3) SURFRAD Data; 4) Convective Layer Thickness; 5) Surface Heat Budget; 6) Surface Virtual Potential Temperature Flux; 7) Convective Scaling Velocity; 8) Other Calculations; 9) Yearly trends; 10) Scale Factors; 11) Scale Factor Test Matrix; 12) Statistical Model; 13) Updraft Strength Calculation; 14) Updraft Diameter; 15) Updraft Shape; 16) Smoothed Updraft Shape; 17) Updraft Spacing; 18) Environment Sink; 19) Updraft Lifespan; 20) Autonomous Soaring Research; 21) Planned Flight Test; and 22) Mixing Ratio.

  20. Units of analysis in task-analytic research.

    PubMed

    Haring, T G; Kennedy, C H

    1988-01-01

    We develop and discuss four criteria for evaluating the appropriateness of units of analysis for task-analytic research and suggest potential alternatives to the units of analysis currently used. Of the six solutions discussed, the most commonly used unit of analysis in current behavior analytic work, percentage correct, meets only one of the four criteria. Five alternative units of analysis are presented and evaluated: (a) percentage of opportunities to perform meeting criterion, (b) trials to criteria, (c) cumulative competent performances, (d) percentage correct with competent performance coded, and (e) percentage correct with competent performance coded and a grid showing performance on individual steps of the task analysis. Of the solutions evaluated, only one--percentage correct with competent performance coded and a task analysis grid--met all four criteria.
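
    Several of the proposed units fall out of the same trial record. A minimal sketch computing three of them from a synthetic session, where "competent performance" is operationalized (hypothetically) as correct and unprompted:

    ```python
    # Each trial: (correct, prompted); data is synthetic.
    trials = [(False, True), (True, True), (True, False), (False, False),
              (True, False), (True, False), (True, False), (True, False)]

    correct = [c for c, _ in trials]
    competent = [c and not p for c, p in trials]

    pct_correct = 100 * sum(correct) / len(trials)
    pct_competent = 100 * sum(competent) / len(trials)

    # Trials to criterion: first trial ending a run of 3 consecutive competent trials.
    criterion_run, run, trials_to_criterion = 3, 0, None
    for i, comp in enumerate(competent, start=1):
        run = run + 1 if comp else 0
        if run >= criterion_run:
            trials_to_criterion = i
            break

    print(f"% correct: {pct_correct:.1f}")
    print(f"% correct with competent performance coded: {pct_competent:.1f}")
    print(f"trials to criterion: {trials_to_criterion}")
    ```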

  1. Tools and collaborative environments for bioinformatics research

    PubMed Central

    Giugno, Rosalba; Pulvirenti, Alfredo

    2011-01-01

    Advanced research requires intensive interaction among a multitude of actors, often possessing different expertise and usually working at a distance from each other. The field of collaborative research aims to establish suitable models and technologies to properly support these interactions. In this article, we first present the reasons for an interest of Bioinformatics in this context by also suggesting some research domains that could benefit from collaborative research. We then review the principles and some of the most relevant applications of social networking, with special attention to networks supporting scientific collaboration, by also highlighting some critical issues, such as identification of users and standardization of formats. We then introduce some systems for collaborative document creation, including wiki systems and tools for ontology development, and review some of the most interesting biological wikis. We also review the principles of Collaborative Development Environments for software and show some examples in Bioinformatics. Finally, we present the principles and some examples of Learning Management Systems. In conclusion, we try to devise some of the goals to be achieved in the short term for the exploitation of these technologies. PMID:21984743

  2. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  3. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.
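
    One of the data products mentioned, the rank-based probability-of-detection curve, is easy to make concrete: rank detections by confidence and sweep a threshold down the list. A minimal sketch on synthetic detector scores (illustrative, not the paper's evaluation harness):

    ```python
    import numpy as np

    # Synthetic detector output: one confidence score per event, plus ground truth.
    rng = np.random.default_rng(5)
    truth = rng.random(500) < 0.3                        # 30% of events are real targets
    scores = np.where(truth, rng.normal(0.65, 0.15, 500),
                      rng.normal(0.45, 0.15, 500))       # targets score higher on average

    # Rank-based probability of detection: sweep a threshold down the ranked list.
    order = np.argsort(-scores)
    pd_at_rank = np.cumsum(truth[order]) / truth.sum()   # P(detect) within top-k
    far_at_rank = np.cumsum(~truth[order]) / (~truth).sum()

    for k in (25, 100, 250):
        print(f"top {k:3d}: Pd = {pd_at_rank[k-1]:.2f}, FAR = {far_at_rank[k-1]:.2f}")
    ```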

  4. Analytical prediction of chatter stability for variable pitch and variable helix milling tools

    NASA Astrophysics Data System (ADS)

    Sims, N. D.; Mann, B.; Huyanan, S.

    2008-11-01

    Regenerative chatter is a self-excited vibration that can occur during milling and other machining processes. It leads to a poor surface finish, premature tool wear, and potential damage to the machine or tool. Variable pitch and variable helix milling tools have been previously proposed to avoid the onset of regenerative chatter. Although variable pitch tools have been considered in some detail in previous research, this has generally focussed on behaviour at high radial immersions. In contrast there has been very little work focussed on predicting the stability of variable helix tools. In the present study, three solution processes are proposed for predicting the stability of variable pitch or helix milling tools. The first is a semi-discretisation formulation that performs spatial and temporal discretisation of the tool. Unlike previously published methods this can predict the stability of variable pitch or variable helix tools, at low or high radial immersions. The second is a time-averaged semi-discretisation formulation that assumes time-averaged cutting force coefficients. Unlike previous work, this can predict stability of variable helix tools at high radial immersion. The third is a temporal-finite element formulation that can predict the stability of variable pitch tools with a constant uniform helix angle, at low radial immersion. The model predictions are compared to previously published work on variable pitch tools, along with time-domain model simulations. Good agreement is found with both previously published results and the time-domain model. Furthermore, cyclic-fold bifurcations were found to exist for both variable pitch and variable helix tools at lower radial immersions.
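
    The semi-discretisation idea can be shown on the simplest regenerative model: a 1-DOF cutter with a single constant delay (a plain turning model, not the paper's variable-pitch/helix formulation, which needs multiple delays and time-periodic coefficients). A sketch with hypothetical modal parameters; stability is read off the spectral radius of the assembled transition matrix:

    ```python
    import numpy as np
    from scipy.linalg import expm

    # 1-DOF regenerative turning model (illustrative parameters, not fitted):
    #   m x'' + c x' + (k + Kf*b) x = Kf * b * x(t - tau)
    m, c, k = 2.0, 80.0, 2.0e6       # mass [kg], damping [Ns/m], stiffness [N/m]
    Kf = 2.0e8                       # specific cutting-force coefficient [N/m^2]
    N = 60                           # intervals the delay period is discretised into

    def spectral_radius(b, tau):
        """Largest |eigenvalue| of the transition matrix over one delay period."""
        dt = tau / N
        A = np.array([[0.0, 1.0],
                      [-(k + Kf * b) / m, -c / m]])
        B = np.array([0.0, Kf * b / m])
        P = expm(A * dt)
        R = np.linalg.solve(A, P - np.eye(2)) @ B      # zero-order hold on x(t - tau)
        D = np.zeros((N + 2, N + 2))
        D[:2, :2] = P                                  # free vibration over one step
        D[:2, -1] = R                                  # forcing by the oldest sample
        D[2, 0] = 1.0                                  # push current x into the history
        for j in range(3, N + 2):
            D[j, j - 1] = 1.0                          # shift the delayed samples
        return np.abs(np.linalg.eigvals(D)).max()

    tau = 0.01                                         # one spindle revolution [s]
    for b in np.linspace(5e-5, 2e-3, 40):              # depth-of-cut sweep [m]
        if spectral_radius(b, tau) > 1.0:
            print(f"chatter boundary near b = {b * 1e3:.2f} mm")
            break
    ```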

  5. Contextual and Analytic Qualities of Research Methods Exemplified in Research on Teaching

    ERIC Educational Resources Information Center

    Svensson, Lennart; Doumas, Kyriaki

    2013-01-01

    The aim of the present article is to discuss contextual and analytic qualities of research methods. The arguments are specified in relation to research on teaching. A specific investigation is used as an example to illustrate the general methodological approach. It is argued that research methods should be carefully grounded in an understanding of…

  6. Analytical tools for the analysis of β-carotene and its degradation products

    PubMed Central

    Stutz, H.; Bresgen, N.; Eckl, P. M.

    2015-01-01

    β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has centered attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Besides other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene, which is beneficial for radical scavenging but is also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies for standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography ((U)HPLC), and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method

  7. Analytical Ultracentrifugation as a Tool to Study Nonspecific Protein–DNA Interactions

    PubMed Central

    Yang, Teng-Chieh; Catalano, Carlos Enrique; Maluf, Nasib Karl

    2016-01-01

    Analytical ultracentrifugation (AUC) is a powerful tool that can provide thermodynamic information on associating systems. Here, we discuss how to use the two fundamental AUC applications, sedimentation velocity (SV), and sedimentation equilibrium (SE), to study nonspecific protein–nucleic acid interactions, with a special emphasis on how to analyze the experimental data to extract thermodynamic information. We discuss three specific applications of this approach: (i) determination of nonspecific binding stoichiometry of E. coli integration host factor protein to dsDNA, (ii) characterization of nonspecific binding properties of Adenoviral IVa2 protein to dsDNA using SE-AUC, and (iii) analysis of the competition between specific and nonspecific DNA-binding interactions observed for E. coli integration host factor protein assembly on dsDNA. These approaches provide powerful tools that allow thermodynamic interrogation and thus a mechanistic understanding of how proteins bind nucleic acids by both specific and nonspecific interactions. PMID:26412658
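
    SV experiments rest on the Svedberg relation s = M(1 − v̄ρ)/(N_A·f). A worked sketch for a hypothetical 50 kDa globular protein, taking the friction coefficient from Stokes' law (f = 6πηr) and the radius from the anhydrous molecular volume, so the result is an idealized estimate rather than a hydrated one:

    ```python
    import math

    M = 50.0           # molar mass [kg/mol] (a hypothetical 50 kDa protein)
    vbar = 7.3e-4      # partial specific volume [m^3/kg]
    rho = 998.0        # solvent density [kg/m^3]
    eta = 1.002e-3     # solvent viscosity [Pa s]
    NA = 6.022e23

    # Radius of an anhydrous sphere with the protein's molecular volume.
    vol = M * vbar / NA                       # volume per molecule [m^3]
    r = (3 * vol / (4 * math.pi)) ** (1 / 3)
    f = 6 * math.pi * eta * r                 # Stokes friction coefficient

    s = M / NA * (1 - vbar * rho) / f         # sedimentation coefficient [s]
    print(f"r = {r * 1e9:.2f} nm, s = {s / 1e-13:.1f} S")   # 1 Svedberg = 1e-13 s
    ```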

  8. Telerehabilitation: Policy Issues and Research Tools

    PubMed Central

    Seelman, Katherine D.; Hartman, Linda M.

    2009-01-01

    The importance of public policy as a complementary framework for telehealth, telemedicine, and by association telerehabilitation, has been recognized by a number of experts. The purpose of this paper is to review the literature on telerehabilitation (TR) policy and research methodology issues in order to report on the current state of the science and make recommendations about future research needs. An extensive literature search was implemented using search terms grouped into the main topics of telerehabilitation, policy, population of users, and policy-specific issues such as cost and reimbursement. The availability of rigorous and valid evidence-based cost studies emerged as a major challenge to the field. Existing cost studies provided evidence that telehomecare may be a promising application area for TR. Cost studies also indicated that telepsychiatry is a promising telepractice area. The literature did not reference the International Classification on Functioning, Disability and Health (ICF). Rigorous and comprehensive TR assessment and evaluation tools for outcome studies are paramount to generating confidence among providers, payers, clinicians and end users. In order to evaluate consumer satisfaction and participation, assessment criteria must include medical, functional and quality-of-life items such as assistive technology and environmental factors. PMID:25945162

  9. Process analytical tools for monitoring, understanding, and control of pharmaceutical fluidized bed granulation: A review.

    PubMed

    Burggraeve, Anneleen; Monteyne, Tinne; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2013-01-01

    Fluidized bed granulation is a widely applied wet granulation technique in the pharmaceutical industry to produce solid dosage forms. The process involves the spraying of a binder liquid onto fluidizing powder particles. As a result, the (wetted) particles collide with each other and form larger permanent aggregates (granules). After spraying the required amount of granulation liquid, the wet granules are rapidly dried in the fluid bed granulator. Since the FDA launched its Process Analytical Technology initiative (and even before), a wide range of analytical process sensors has been used for real-time monitoring and control of fluid bed granulation processes. By applying various data analysis techniques to the multitude of data collected from the process analyzers implemented in fluid bed granulators, a deeper understanding of the process has been achieved. This review gives an overview of the process analytical technologies used during fluid bed granulation to monitor and control the process. The fundamentals of the mechanisms contributing to wet granule growth and the characteristics of fluid bed granulation processing are briefly discussed. This is followed by a detailed overview of the in-line applied process analyzers, contributing to improved fluid bed granulation understanding, modeling, control, and endpoint detection. Analysis and modeling tools enabling the extraction of the relevant information from the complex data collected during granulation and the control of the process are highlighted.

  10. Healthcare, molecular tools and applied genome research.

    PubMed

    Groves, M

    2000-11-01

    Biotechnology 2000 offered a rare opportunity for scientists from academia and industry to present and discuss data in fields as diverse as environmental biotechnology and applied genome research. The healthcare section of the meeting encompassed a number of gene therapy delivery systems that are successfully treating genetic disorders. Beta-thalassemia is being corrected in mice by continuous erythropoietin delivery from engineered muscle cells, and by naked DNA electrotransfer into muscles, as described by Dr JM Heard (Institut Pasteur, Paris, France). Dr Reszka (Max-Delbrueck-Centrum fuer Molekulare Medizin, Berlin, Germany), meanwhile, described a treatment for liver metastasis in the form of a drug carrier embolization system, DCES (Max-Delbrueck-Centrum fuer Molekulare Medizin), composed of surface-modified liposomes and a substance for chemo-occlusion, which drastically reduces the blood supply to the tumor and promotes apoptosis, necrosis and antiangiogenesis. In the molecular tools section, Willem Stemmer (Maxygen Inc, Redwood City, CA, USA) gave an insight into the importance of techniques such as molecular breeding (DNA shuffling) for the evolution of molecules with improved function, across a range of fields including pharmaceuticals, vaccines, agriculture and chemicals. Technologies such as ribosome display, which can incorporate the evolution and specific enrichment of proteins/peptides in cycles of selection, could play an enormous role in the production of novel therapeutics and diagnostics in future years, as explained by Andreas Plückthun (Institute of Biochemistry, University of Zurich, Switzerland). In applied genome research, technologies such as the 'in vitro expression cloning' described by Dr Zwick (Promega Corp, Madison, WI, USA) are providing functional analysis for the overwhelming flow of data emerging from high-throughput sequencing of genomes and from high-density gene expression microarrays (DNA chips). The

  11. Some Tooling for Manufacturing Research Reactor Fuel Plates

    SciTech Connect

    Knight, R.W.

    1999-10-03

    This paper will discuss some of the tooling necessary to manufacture aluminum-based research reactor fuel plates. Most of this tooling is intended for use in a high-production facility. Some of the tools shown have manufactured more than 150,000 pieces. The only maintenance has been sharpening. With careful design, tools can be made to accommodate the manufacture of several different fuel elements, thus, reducing tooling costs and maintaining tools that the operators are trained to use. An important feature is to design the tools using materials with good lasting quality. Good tools can increase return on investment.

  12. VAO Tools Enhance CANDELS Research Productivity

    NASA Astrophysics Data System (ADS)

    Greene, Gretchen; Donley, J.; Rodney, S.; LAZIO, J.; Koekemoer, A. M.; Busko, I.; Hanisch, R. J.; VAO Team; CANDELS Team

    2013-01-01

    The formation of galaxies and their co-evolution with black holes through cosmic time are prominent areas in current extragalactic astronomy. New methods in science research are building upon collaborations between scientists and archive data centers which span large volumes of multi-wavelength and heterogeneous data. A successful example of this form of teamwork is demonstrated by the CANDELS (Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey) and the Virtual Astronomical Observatory (VAO) collaboration. The CANDELS project archive data provider services are registered and discoverable in the VAO through an innovative web based Data Discovery Tool, providing a drill down capability and cross-referencing with other co-spatially located astronomical catalogs, images and spectra. The CANDELS team is working together with the VAO to define new methods for analyzing Spectral Energy Distributions of galaxies containing active galactic nuclei, and helping to evolve advanced catalog matching methods for exploring images of variable depths, wavelengths and resolution. Through the publication of VOEvents, the CANDELS project is publishing data streams for newly discovered supernovae that are bright enough to be followed from the ground.
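
    The catalog cross-matching mentioned here is commonly done with astropy's sky-coordinate matching; a minimal sketch with made-up coordinates (not the CANDELS/VAO pipeline itself):

    ```python
    import numpy as np
    import astropy.units as u
    from astropy.coordinates import SkyCoord

    rng = np.random.default_rng(1)

    # Two hypothetical catalogs in the same field (degrees); values are synthetic.
    ra = 150.0 + 0.1 * rng.random(200)
    dec = 2.0 + 0.1 * rng.random(200)
    cat_deep = SkyCoord(ra=ra * u.deg, dec=dec * u.deg)
    cat_shallow = SkyCoord(ra=(ra[:50] + 2e-5) * u.deg, dec=dec[:50] * u.deg)

    # Nearest-neighbour match on the sky, then a separation cut.
    idx, sep2d, _ = cat_shallow.match_to_catalog_sky(cat_deep)
    good = sep2d < 0.5 * u.arcsec
    print(f"matched {good.sum()} of {len(cat_shallow)} sources")
    ```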

  13. The Core Analytics of Randomized Experiments for Social Research. MDRC Working Papers on Research Methodology

    ERIC Educational Resources Information Center

    Bloom, Howard S.

    2006-01-01

    This chapter examines the core analytic elements of randomized experiments for social research. Its goal is to provide a compact discussion for faculty members, graduate students, and applied researchers of the design and analysis of randomized experiments for measuring the impacts of social or educational interventions. Design issues considered…
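
    The core analytic element of a two-arm randomized experiment is the difference in mean outcomes with its standard error. A minimal sketch on synthetic data (illustrative, not drawn from the chapter):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    control = rng.normal(50.0, 10.0, size=400)         # outcome scores, synthetic
    treated = rng.normal(53.0, 10.0, size=400)         # true impact = 3 points

    impact = treated.mean() - control.mean()
    se = np.sqrt(treated.var(ddof=1) / treated.size +
                 control.var(ddof=1) / control.size)
    lo, hi = impact - 1.96 * se, impact + 1.96 * se    # approximate 95% CI

    print(f"impact = {impact:.2f}, SE = {se:.2f}, 95% CI = [{lo:.2f}, {hi:.2f}]")
    ```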

  14. Molecular modelling: An analytical tool with a predictive character for investigating reactivity in molten salt media.

    NASA Astrophysics Data System (ADS)

    Picard, Gérard S.; Bouyer, Frédéric C.

    1995-04-01

    Possibilities offered by Molecular Modelling for studying homogeneous and interfacial processes and reactions in melts are discussed. A few typical illustrative examples covering some of the main research fields of molten salt chemistry and electrochemistry are given. Quantum chemistry calculations, Molecular Dynamics and Monte Carlo methods appear to be fantastic tools for analyzing and predicting reactivity in molten salts.

  15. ROLE OF ANALYTICAL CHEMISTRY IN ENVIRONMENTAL RISK MANAGEMENT RESEARCH

    EPA Science Inventory

    Analytical chemistry is an important tier of environmental protection and has been traditionally linked to compliance and/or exposure monitoring activities for environmental contaminants. The adoption of the risk management paradigm has led to special challenges for analytical ch...

  16. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Technical Reports Server (NTRS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address the scientific community's needs in utilizing large quantities of data. BDA analyzes different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to algorithms of many kinds (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the Earth Sciences. The lessons learned and experiences given are based on a survey of use cases, with detailed insights into a few of them.
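
    As a concrete instance of the classification techniques the interest group surveys, a minimal scikit-learn sketch training a support vector machine on synthetic features (stand-ins for, say, per-pixel spectral bands):

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    # Synthetic stand-in for labeled earth-science samples (e.g., land-cover classes).
    X, y = make_classification(n_samples=2000, n_features=8, n_informative=5,
                               n_classes=3, n_clusters_per_class=1, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                        random_state=0)

    clf = SVC(kernel="rbf", C=1.0, gamma="scale")   # RBF-kernel support vector machine
    clf.fit(X_train, y_train)
    print(f"test accuracy: {accuracy_score(y_test, clf.predict(X_test)):.3f}")
    ```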

  18. Revisiting the use of 'place' as an analytic tool for elucidating geographic issues central to Canadian rural palliative care.

    PubMed

    Giesbrecht, Melissa; Crooks, Valorie A; Castleden, Heather; Schuurman, Nadine; Skinner, Mark W; Williams, Allison M

    2016-09-01

    In 2010, Castleden and colleagues published a paper in this journal using the concept of 'place' as an analytic tool to understand the nature of palliative care provision in a rural region in British Columbia, Canada. This publication was based upon pilot data collected for a larger research project that has since been completed. With the addition of 40 semi-structured interviews with users and providers of palliative care in four other rural communities located across Canada, we revisit Castleden and colleagues' (2010) original framework. Applying the concept of place to the full dataset confirmed the previously published findings, but also revealed two new place-based dimensions related to experiences of rural palliative care in Canada: (1) borders and boundaries; and (2) 'making' place for palliative care progress. These new findings offer a refined understanding of the complex interconnections between various dimensions of place and palliative care in rural Canada. PMID:27521815

  19. Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy

    PubMed Central

    Giraudeau, Patrick; Frydman, Lucio

    2016-01-01

    Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan manner. A decade ago a so-called "ultrafast" (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlations, in a single scan. During the intervening years the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool with an expanding scope of applications. The present review summarizes the principles and the main developments that have contributed to the success of this approach, and focuses on applications recently demonstrated in various areas of analytical chemistry, from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  20. An analytical tool-box for comprehensive biochemical, structural and transcriptome evaluation of oral biofilms mediated by mutans streptococci.

    PubMed

    Klein, Marlise I; Xiao, Jin; Heydorn, Arne; Koo, Hyun

    2011-01-25

    Biofilms are highly dynamic, organized and structured communities of microbial cells enmeshed in an extracellular matrix of variable density and composition (1, 2). In general, biofilms develop from initial microbial attachment on a surface, followed by formation of cell clusters (or microcolonies) and further development and stabilization of the microcolonies, which occur in a complex extracellular matrix. The majority of biofilm matrices harbor exopolysaccharides (EPS), and dental biofilms are no exception, especially those associated with caries disease, which are mostly mediated by mutans streptococci (3). The EPS are synthesized by microorganisms (S. mutans being a key contributor) by means of extracellular enzymes, such as glucosyltransferases, using sucrose primarily as substrate (3). Studies of biofilms formed on tooth surfaces are particularly challenging owing to their constant exposure to environmental challenges associated with complex diet-host-microbial interactions occurring in the oral cavity. A better understanding of the dynamic changes in the structural organization and composition of the matrix, and in the physiology and transcriptome/proteome profile of biofilm cells in response to these complex interactions, would further advance current knowledge of how oral biofilms modulate pathogenicity. Therefore, we have developed an analytical tool-box to facilitate biofilm analysis at the structural, biochemical and molecular levels by combining commonly available and novel techniques with custom-made software for data analysis. Standard analytical (colorimetric assays, RT-qPCR and microarrays) and novel fluorescence techniques (for simultaneous labeling of bacteria and EPS) were integrated with specific software for data analysis to address the complex nature of oral biofilm research. The tool-box comprises 4 distinct but interconnected steps (Figure 1): 1) Bioassays, 2) Raw Data Input, 3) Data Processing, and 4) Data Analysis. We used our in vitro biofilm model and
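
    For the RT-qPCR step in the tool-box, relative expression is typically reported with the 2^(−ΔΔCt) method; a minimal sketch with made-up Ct values (not the authors' software):

    ```python
    def fold_change(ct_target_trt, ct_ref_trt, ct_target_ctl, ct_ref_ctl):
        """Relative expression by the 2^(-ddCt) method.
        dCt = Ct(target) - Ct(reference); ddCt = dCt(treated) - dCt(control)."""
        ddct = (ct_target_trt - ct_ref_trt) - (ct_target_ctl - ct_ref_ctl)
        return 2.0 ** (-ddct)

    # Hypothetical Ct values: a glucosyltransferase gene in sucrose-grown vs
    # glucose-grown biofilms, normalized to a 16S rRNA reference.
    print(f"fold change: {fold_change(21.0, 14.0, 24.5, 14.2):.1f}x")
    ```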

  1. A Multi-facetted Visual Analytics Tool for Exploratory Analysis of Human Brain and Function Datasets

    PubMed Central

    Angulo, Diego A.; Schneider, Cyril; Oliver, James H.; Charpak, Nathalie; Hernandez, Jose T.

    2016-01-01

    Brain research typically requires large amounts of data from different sources, often of a different nature. The use of different software tools adapted to the nature of each data source can make research work cumbersome and time consuming. It follows that data is often not used to its fullest potential, limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-facetted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process of individuals or groups. BRAVIZ facilitates exploration of trends or relationships to gain an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data. PMID:27601990

  5. Evaluating the Development of Science Research Skills in Work-Integrated Learning through the Use of Workplace Science Tools

    ERIC Educational Resources Information Center

    McCurdy, Susan M.; Zegwaard, Karsten E.; Dalgety, Jacinta

    2013-01-01

    Concept understanding, the development of analytical skills and a research mind set are explored through the use of academic tools common in a tertiary science education and relevant work-integrated learning (WIL) experiences. The use and development of the tools; laboratory book, technical report, and literature review are examined by way of…

  6. Stable isotope ratio analysis: A potential analytical tool for the authentication of South African lamb meat.

    PubMed

    Erasmus, Sara Wilhelmina; Muller, Magdalena; van der Rijst, Marieta; Hoffman, Louwrens Christiaan

    2016-02-01

    Stable isotope ratios ((13)C/(12)C and (15)N/(14)N) of South African Dorper lambs from farms with different vegetation types were measured by isotope ratio mass spectrometry (IRMS) to evaluate the technique as a tool for the authentication of origin and feeding regime. Homogenised and defatted meat of the Longissimus lumborum (LL) muscle of lambs from seven different farms was assessed. The δ(13)C values were affected by the origin of the meat, mainly reflecting the diet. The Rûens and Free State farms had the lowest (p ⩽ 0.05) δ(15)N values, followed by the Northern Cape farms, with Hantam Karoo/Calvinia having the highest δ(15)N values. Discriminant analysis of the δ(13)C and δ(15)N values showed promising results for the use of IRMS as a reliable analytical tool for lamb meat authentication. The results suggest that diet, linked to origin, is an important factor to consider in classifying South African lamb by region of origin.
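
    Isotope ratios are reported in delta notation, δ = (R_sample/R_standard − 1) × 1000 ‰, and classification by origin then reduces to discriminant analysis on the (δ13C, δ15N) pairs. A minimal sketch with synthetic values (the VPDB 13C/12C reference ratio is standard; the farm data is made up):

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    R_VPDB = 0.0112372                       # 13C/12C of the VPDB standard

    def delta(r_sample, r_standard):
        """Delta notation in per mil: (R_sample / R_standard - 1) * 1000."""
        return (r_sample / r_standard - 1.0) * 1000.0

    print(f"d13C = {delta(0.0110150, R_VPDB):.1f} per mil")  # a plausible meat value

    # Synthetic (d13C, d15N) pairs for lambs from two hypothetical regions.
    rng = np.random.default_rng(7)
    karoo = rng.normal([-21.0, 9.0], 0.8, size=(30, 2))
    ruens = rng.normal([-24.0, 5.5], 0.8, size=(30, 2))
    X = np.vstack([karoo, ruens])
    y = np.array([0] * 30 + [1] * 30)

    lda = LinearDiscriminantAnalysis().fit(X, y)
    print(f"resubstitution accuracy: {lda.score(X, y):.2f}")
    ```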

  7. Streamlining Research by Using Existing Tools

    PubMed Central

    Greene, Sarah M.; Baldwin, Laura-Mae; Dolor, Rowena J.; Thompson, Ella; Neale, Anne Victoria

    2011-01-01

    Over the past two decades, the health research enterprise has matured rapidly, and many recognize an urgent need to translate pertinent research results into practice, to help improve the quality, accessibility, and affordability of U.S. health care. Streamlining research operations would speed translation, particularly for multi-site collaborations. However, the culture of research discourages reusing or adapting existing resources or study materials. Too often, researchers start studies and multi-site collaborations from scratch—reinventing the wheel. Our team developed a compendium of resources to address inefficiencies and researchers’ unmet needs and compiled them in a research toolkit website (www.ResearchToolkit.org). Through our work, we identified philosophical and operational issues related to disseminating the toolkit to the research community. We explore these issues here, with implications for the nation’s investment in biomedical research. PMID:21884513

  8. Applying decision-making tools to national e-waste recycling policy: an example of Analytic Hierarchy Process.

    PubMed

    Lin, Chun-Hsu; Wen, Lihchyi; Tsai, Yue-Mi

    2010-05-01

    As policy making is in essence a process of discussion, decision-making tools have in many cases been proposed to resolve the differences of opinion among the different parties. In our project that sought to promote a country's performance in recycling, we used the Analytic Hierarchy Process (AHP) to evaluate the possibilities and determine the priority of the addition of new mandatory recycled waste, also referred to as Due Recycled Wastes, from candidate waste appliances. The evaluation process started with the collection of data based on telephone interviews and field investigations to understand the behavior of consumers as well as their overall opinions regarding the disposal of certain waste appliances. With the data serving as background information, the research team then implemented the Analytic Hierarchy Process using the information that formed an incomplete hierarchy structure in order to determine the priority for recycling. Since the number of objects to be evaluated exceeded the number that the AHP researchers had suggested, we reclassified the objects into four groups and added one more level of pair-wise comparisons, which substantially reduced the inconsistency in the judgment of the AHP participants. The project was found to serve as a flexible and achievable application of AHP to the environmental policy-making process. In addition, based on the project's outcomes derived from the project as a whole, the research team drew conclusions regarding the government's need to take back 15 of the items evaluated, and suggested instruments that could be used or recycling regulations that could be changed in the future. Further analysis on the top three items recommended by the results of the evaluation for recycling, namely, Compact Disks, Cellular Phones and Computer Keyboards, was then conducted to clarify their concrete feasibility. After the trial period for recycling ordered by the Taiwan Environmental Protection Administration, only Computer
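
    The mechanics behind the AHP step are compact: a reciprocal pairwise-comparison matrix, priorities from its principal eigenvector, and a consistency ratio to police the judgments. A minimal sketch with hypothetical criteria comparisons, not the project's actual data:

    ```python
    import numpy as np

    # Hypothetical pairwise comparisons for three criteria (Saaty 1-9 scale),
    # e.g., recyclability vs waste volume vs collection cost.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                              # priority vector

    n = A.shape[0]
    lam_max = eigvals.real[k]
    CI = (lam_max - n) / (n - 1)              # consistency index
    RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]       # Saaty's random index
    print(f"priorities: {np.round(w, 3)}, CR = {CI / RI:.3f}")  # CR < 0.1 is acceptable
    ```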

  10. Microfluidic tools for cell biological research

    PubMed Central

    Velve-Casquillas, Guilhem; Le Berre, Maël; Piel, Matthieu; Tran, Phong T.

    2010-01-01

    Summary Microfluidic technology is creating powerful tools for cell biologists to control the complete cellular microenvironment, leading to new questions and new discoveries. We review here the basic concepts and methodologies in designing microfluidic devices, and their diverse cell biological applications. PMID:21152269

  11. Tools for Ephemeral Gully Erosion Process Research

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Techniques to quantify ephemeral gully erosion have been identified by the USDA Natural Resources Conservation Service (NRCS) as one of the gaps in current erosion assessment tools. One reason that may have contributed to this technology gap is the difficulty of quantifying changes in channel geometry to assess...

  12. Participatory Research: A Tool for Extension Educators

    ERIC Educational Resources Information Center

    Tritz, Julie

    2014-01-01

    Given their positions in communities across the United States, Extension educators are poised to have meaningful partnerships with the communities they serve. This article presents a case for the use of participatory research, which is a departure from more conventional forms of research based on objectivity, researcher distance, and social…

  13. Research issues in sustainable consumption: toward an analytical framework for materials and the environment.

    PubMed

    Thomas, Valerie M; Graedel, T E

    2003-12-01

    We define key research questions as a stimulus to research in the area of industrial ecology. The first group of questions addresses analytical support for green engineering and environmental policy. They relate to (i) tools for green engineering, (ii) improvements in life cycle assessment, (iii) aggregation of environmental impacts, and (iv) effectiveness of a range of innovative policy approaches. The second group of questions addresses the dynamics of technology, economics, and environmental impacts. They relate to (v) the environmental impacts of material and energy consumption, (vi) the potential for material efficiency, (vii) the relation of technological and economic development to changes in consumption patterns, and (viii) the potential for technology to overcome environmental impacts and constraints. Altogether, the questions create an intellectual agenda for industrial ecology and integrate the technological and social aspects of sustainability.

  14. Narratives and Activity Theory as Reflective Tools in Action Research

    ERIC Educational Resources Information Center

    Stuart, Kaz

    2012-01-01

    Narratives and activity theory are useful as socially constructed data collection tools that allow a researcher access to the social, cultural and historical meanings that research participants place on events in their lives. This case study shows how these tools were used to promote reflection within a cultural-historical activity theoretically…

  15. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    SciTech Connect

    Bjoerklund, Anna

    2012-01-15

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. - Research highlights: ► LCA was explored as an analytical tool in an SEA process for municipal energy planning. ► The process also integrated LCA with scenario planning and public participation. ► Benefits of using LCA were a systematic framework and a wider systems perspective. ► Integration of the tools required some methodological challenges to be solved. ► This proved an innovative approach to defining alternatives and the scope of assessment.
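
    In its computational form, LCA reduces to two linear-algebra steps: scale the unit processes to meet the functional unit (s = A⁻¹f), then aggregate their environmental interventions (g = Bs). A minimal sketch with a made-up two-process energy system, not the case study's inventory:

    ```python
    import numpy as np

    # Technology matrix A: columns = unit processes (heat plant, transport),
    # rows = product flows (heat [MJ], transport [tkm]). Off-diagonals are
    # the inputs each process draws from the other. Values are illustrative.
    A = np.array([[1.0, -0.05],      # heat output; transport consumes some heat
                  [-0.002, 1.0]])    # heat plant needs fuel deliveries [tkm/MJ]

    # Intervention matrix B: rows = emissions (CO2 [kg], NOx [kg]) per unit process.
    B = np.array([[0.08, 0.12],
                  [1e-4, 8e-4]])

    f = np.array([1000.0, 0.0])      # functional unit: deliver 1000 MJ of heat

    s = np.linalg.solve(A, f)        # scaling factors for each process
    g = B @ s                        # life cycle inventory for the functional unit
    print(f"process scaling: {np.round(s, 2)}")
    print(f"CO2 = {g[0]:.1f} kg, NOx = {g[1] * 1000:.1f} g")
    ```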

  16. Research education: findings of a study of teaching-learning research using multiple analytical perspectives.

    PubMed

    Vandermause, Roxanne; Barbosa-Leiker, Celestina; Fritz, Roschelle

    2014-12-01

    This multimethod, qualitative study provides results for educators of nursing doctoral students to consider. Combining the expertise of an empirical analytical researcher (who uses statistical methods) and an interpretive phenomenological researcher (who uses hermeneutic methods), a course was designed that would place doctoral students in the midst of multiparadigmatic discussions while they learned fundamental research methods. Field notes and iterative analytical discussions led to patterns and themes that highlight the value of this innovative pedagogical application. Using content analysis and interpretive phenomenological approaches, and together with one of the students, we analyzed data from field notes recorded in real time over the period the course was offered. This article describes the course and the study analysis, and offers the pedagogical experience as transformative. A link to a sample syllabus is included in the article. The results encourage educators of doctoral nursing students to focus educational practice on multiple methodological perspectives. PMID:25406843

  18. The Child Diary as a Research Tool

    ERIC Educational Resources Information Center

    Lamsa, Tiina; Ronka, Anna; Poikonen, Pirjo-Liisa; Malinen, Kaisa

    2012-01-01

    The aim of this article is to introduce the use of the child diary as a method in daily diary research. The child diary, a structured booklet in which children's parents and day-care personnel (N = 54 children) reported their observations, is evaluated by describing the research process and detailing its structure. The participants reported the…

  19. Equity Audit: A Teacher Leadership Tool for Nurturing Teacher Research

    ERIC Educational Resources Information Center

    View, Jenice L.; DeMulder, Elizabeth; Stribling, Stacia; Dodman, Stephanie; Ra, Sophia; Hall, Beth; Swalwell, Katy

    2016-01-01

    This is a three-part essay featuring six teacher educators and one classroom teacher researcher. Part one describes faculty efforts to build curriculum for teacher research, scaffold the research process, and analyze outcomes. Part two shares one teacher researcher's experience using an equity audit tool in several contexts: her teaching practice,…

  20. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    PubMed

    Offroy, Marc; Duponchel, Ludovic

    2016-03-01

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and where we do not necessarily know which coordinates are the interesting ones. Big data in our labs of biology, analytical chemistry or physical chemistry is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and valorize such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and can even be more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (high noise level, with/without spectral preprocessing, wavelength shift, different spectral resolutions, missing data). PMID:26873463
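
    The record motivates TDA only in general terms. Its simplest invariant, zero-dimensional persistence (the scales at which connected components of a point cloud merge), reduces to single-linkage clustering and can be sketched in a few lines; the following is an illustration of that idea in Python/NumPy, not the authors' implementation.

        import numpy as np

        def zero_dim_persistence(points):
            """Death scales of H0 classes in a Vietoris-Rips filtration.

            Kruskal-style single linkage: scan pairwise distances in increasing
            order and record the scale at which two components merge (one dies).
            """
            n = len(points)
            dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
            edges = sorted((dist[i, j], i, j) for i in range(n) for j in range(i + 1, n))
            parent = list(range(n))

            def find(x):
                while parent[x] != x:
                    parent[x] = parent[parent[x]]  # path halving
                    x = parent[x]
                return x

            deaths = []
            for d, i, j in edges:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj
                    deaths.append(d)          # a component dies at scale d
                    if len(deaths) == n - 1:  # everything is now connected
                        break
            return deaths

        # Two well-separated clusters: all deaths are small except one, whose
        # value records the scale at which the clusters finally merge.
        rng = np.random.default_rng(0)
        pts = np.vstack([rng.random((20, 2)), rng.random((20, 2)) + 10.0])
        print(sorted(zero_dim_persistence(pts))[-1])  # ~ inter-cluster gap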

  1. Desktop modeling as a management tool for budgeting, forecasting, and reporting in an analytical laboratory

    SciTech Connect

    Hodge, C.A.

    1995-07-01

    Managers are often required to quickly and accurately estimate resource needs. At times, additional work can be absorbed without additional resources. At other times, threshold resource boundaries are exceeded, requiring an additional quantum of a specific resource. Cost-savings estimates, resulting from a reduction in efforts, are also increasingly becoming a requirement of today's managers. The modeling effort described in this paper was designed to estimate instrumentation and manpower resource needs for an analytical laboratory. It was written using only simple spreadsheet software. Analysis can be readily performed with a minimum of input and results obtained in a matter of minutes. This model has been tuned with many years of empirical data, yielding a high degree of capability. The model was expanded to meet other needs. It can be used to justify capital expenditure when the ultimate result is cost savings; to examine procedures and operations for efficiency increases; and for reporting and regulatory compliance. This paper demonstrates that accurate and credible estimates of resource needs can be readily obtained with a minimum of effort or specialized knowledge, employing only tools that are readily available in today's business environment.
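
    The record does not disclose the model's formulas, but the threshold behaviour it describes (workload absorbed until a resource boundary is crossed, then a whole extra unit required) is easy to illustrate. The capacity figures below are invented placeholders, not Hodge's tuned values.

        import math

        # Placeholder capacities -- a real model would tune these with
        # the laboratory's own empirical data.
        SAMPLES_PER_INSTRUMENT_YEAR = 4000   # throughput of one instrument
        ANALYST_HOURS_PER_SAMPLE = 1.5       # hands-on effort per sample
        PRODUCTIVE_HOURS_PER_FTE = 1600      # available hours per analyst-year

        def resources_needed(projected_samples):
            """Instruments and analysts come in whole units, so demand
            crosses threshold boundaries in discrete jumps."""
            instruments = math.ceil(projected_samples / SAMPLES_PER_INSTRUMENT_YEAR)
            analysts = math.ceil(projected_samples * ANALYST_HOURS_PER_SAMPLE
                                 / PRODUCTIVE_HOURS_PER_FTE)
            return instruments, analysts

        print(resources_needed(3900))  # (1, 4): load absorbed by one instrument
        print(resources_needed(4100))  # (2, 4): a small increase forces a new one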

  3. Sugar Maple Pigments Through the Fall and the Role of Anthocyanin as an Analytical Tool

    NASA Astrophysics Data System (ADS)

    Lindgren, E.; Rock, B.; Middleton, E.; Aber, J.

    2008-12-01

    Sugar maple habitat is projected to almost disappear in future climate scenarios, and many institutions report that these trees are already in decline. The ability to detect sugar maple health remotely could prove a useful analytical means of monitoring changes in phenology. Anthocyanin, a red pigment found in sugar maples, is thought to be a universal indicator of plant stress. It is very prominent in the spring during the first flush of leaves, as well as in the fall as leaves senesce. Determining an anthocyanin index that could be used with satellite systems will provide a greater understanding of tree phenology and the distribution of plant stress, both over large areas and over time. The utilization of anthocyanin for one of its functions, prevention of oxidative stress, may fluctuate in response to changing climatic conditions that occur during senescence or vary from year to year. By monitoring changes in pigment levels and antioxidant capacity through the fall, one may be able to draw conclusions about the ability to detect anthocyanin remotely from space-based systems, and possibly determine a more specific function for anthocyanin during fall senescence. These results could then be applied to track changes in tree stress.
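
    The abstract does not specify the index's form. One published formulation is the anthocyanin reflectance index of Gitelson and colleagues, ARI = (1/R550) − (1/R700); the sketch below uses that formulation as an assumption, purely for illustration.

        def anthocyanin_reflectance_index(r550, r700):
            """ARI = (1/R550) - (1/R700), with reflectances as fractions in (0, 1].

            R550 sits in the anthocyanin absorption region and R700 just outside
            it, so a higher ARI suggests a higher anthocyanin content.
            """
            return 1.0 / r550 - 1.0 / r700

        # A senescing leaf reflecting 8% at 550 nm and 20% at 700 nm:
        print(anthocyanin_reflectance_index(0.08, 0.20))  # 7.5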

  4. Volume, Variety and Veracity of Big Data Analytics in NASA's Giovanni Tool

    NASA Astrophysics Data System (ADS)

    Lynnes, C.; Hegde, M.; Smit, C.; Pan, J.; Bryant, K.; Chidambaram, C.; Zhao, P.

    2013-12-01

    Earth Observation data have posed challenges to NASA users ever since the launch of several satellites around the turn of the century, generating volumes now measured in petabytes, a volume growth further increased by models assimilating the satellite data. One important approach to bringing Big Data Analytic capabilities to bear on the Volume of data has been the provision of server-side analysis capabilities. For instance, the Geospatial Interactive Online Visualization ANd aNalysis (Giovanni) tool provides a web interface to large volumes of gridded data from several EOSDIS data centers. Giovanni's main objective is to allow the user to explore its data holdings using various forms of visualization and data summarization or aggregation algorithms, thus allowing the user to examine statistics and pictures for the overall data, while eventually acquiring only the most useful data. Thus much of the preprocessing and data reduction aspects can take place on the server, delivering manageable information quantities to the user. In addition to Volume, Giovanni uses open standards to tackle the Variety aspect of Big Data, incorporating data stored in several formats, from several data centers, and making them available in a uniform data format and structure to both the Giovanni algorithms and the end user. The Veracity aspect of Big Data, perhaps the stickiest of wickets, is enhanced through features that enable reproducibility (provenance and URL-driven workflows), and by a Help Desk staffed by scientists with expertise in the science data.
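
    As an illustration of the server-side data reduction described (not Giovanni's actual code), the statistic below collapses an entire latitude-longitude grid to one number with cosine-of-latitude area weighting before anything is shipped to the user.

        import numpy as np

        def area_weighted_mean(field, lats):
            """Spatial mean of a (lat, lon) grid, weighting each row by
            cos(latitude) so shrinking polar cells are not over-counted."""
            weights = np.cos(np.deg2rad(np.asarray(lats, dtype=float)))
            return float(np.average(
                field, weights=np.broadcast_to(weights[:, None], field.shape)))

        lats = np.linspace(-89.5, 89.5, 180)      # 1-degree grid
        field = np.random.rand(180, 360)          # stand-in for a real variable
        print(area_weighted_mean(field, lats))    # single summary value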

  5. Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.

    PubMed

    Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif

    2014-12-01

    The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low-vacuum SEM. It provides multiscale and multimodal analyses such as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy), as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in an SEM can be influenced by several SEM-related phenomena, this paper first presents a comparison of this new tool with a conventional micro-Raman spectrometer. Then, some possible artefacts are documented, which are due to the impact of electron-beam-induced contamination or cathodoluminescence contribution to the Raman spectra, especially with geological samples. These effects are easily overcome by changing or adapting the Raman spectrometer and SEM settings and methodology. The adverse effect of cathodoluminescence is removed by using an SEM beam shutter during Raman acquisition. In contrast, this interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In the second part, this study highlights the interest and efficiency of the coupling in characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in geosciences.

  6. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....120 Public Contracts and Property Management Federal Property Management Regulations System (Continued... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be... models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to...

  7. Nano-Scale Secondary Ion Mass Spectrometry - A new analytical tool in biogeochemistry and soil ecology

    SciTech Connect

    Herrmann, A M; Ritz, K; Nunan, N; Clode, P L; Pett-Ridge, J; Kilburn, M R; Murphy, D V; O'Donnell, A G; Stockdale, E A

    2006-10-18

    Soils are structurally heterogeneous across a wide range of spatio-temporal scales. Consequently, external environmental conditions do not have a uniform effect throughout the soil, resulting in a large diversity of micro-habitats. It has been suggested that soil function can be studied without explicit consideration of such fine detail, but recent research has indicated that the micro-scale distribution of organisms may be of importance for a mechanistic understanding of many soil functions. Due to a lack of techniques with adequate sensitivity for data collection at appropriate scales, the question 'How important are various soil processes acting at different scales for ecological function?' is challenging to answer. The nano-scale secondary ion mass spectrometer (NanoSIMS) represents the latest generation of ion microprobes, which link high-resolution microscopy with isotopic analysis. The main advantage of NanoSIMS over other secondary ion mass spectrometers is the ability to operate at high mass resolution whilst maintaining both excellent signal transmission and spatial resolution (~50 nm). NanoSIMS has been used previously in studies focusing on presolar materials from meteorites, in material science, biology, geology and mineralogy. Recently, the potential of NanoSIMS as a new tool in the study of biophysical interfaces in soils has been demonstrated. This paper describes the principles of NanoSIMS and discusses the potential of this tool to contribute to the field of biogeochemistry and soil ecology. Practical considerations (sample size and preparation, simultaneous collection of isotopes, mass resolution, isobaric interference and quantification of the isotopes of interest) are discussed. Adequate sample preparation to avoid biases in the interpretation of NanoSIMS data due to artifacts, and the identification of regions of interest, are the main concerns in using NanoSIMS as a new tool in biogeochemistry and soil ecology. Finally, we review the areas of

  8. Using Wordle as a Supplementary Research Tool

    ERIC Educational Resources Information Center

    McNaught, Carmel; Lam, Paul

    2010-01-01

    A word cloud is a special visualization of text in which the more frequently used words are effectively highlighted by occupying more prominence in the representation. We have used Wordle to produce word-cloud analyses of the spoken and written responses of informants in two research projects. The product demonstrates a fast and visually rich way…
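
    The computation behind any word cloud is a frequency count over tokens, with prominence scaled to frequency. A minimal sketch of that core step (the stopword list is a placeholder; Wordle's internals are not reproduced here):

        import re
        from collections import Counter

        STOPWORDS = {"the", "and", "of", "a", "to", "in", "is", "that"}  # placeholder

        def word_weights(text, top_n=50):
            """Return the top_n (word, count) pairs; a renderer would map
            count to font size to give frequent words more prominence."""
            tokens = re.findall(r"[a-z']+", text.lower())
            counts = Counter(t for t in tokens if t not in STOPWORDS)
            return counts.most_common(top_n)

        print(word_weights("the data the tools the data analysis"))
        # [('data', 2), ('tools', 1), ('analysis', 1)]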

  9. Visualization tools for comprehensive test ban treaty research

    SciTech Connect

    Edwards, T.L.; Harris, J.M.; Simons, R.W.

    1997-08-01

    This paper focuses on tools used in Data Visualization efforts at Sandia National Laboratories under the Department of Energy CTBT R&D program. These tools provide interactive techniques for the examination and interpretation of scientific data, and can be used for many types of CTBT research and development projects. We will discuss the benefits and drawbacks of using the tools to display and analyze CTBT scientific data. While the tools may be used for everyday applications, our discussion will focus on the use of these tools for visualization of data used in research and verification of new theories. Our examples focus on uses with seismic data, but the tools may also be used for other types of data sets. 5 refs., 6 figs., 1 tab.

  10. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    PubMed

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects for health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most employed nonstarch carbohydrates added or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most employed analytical techniques for the characterization of those molecules. Mass spectrometry is also of great help, in particular matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and

  11. Methodology and tools for astronomical research

    NASA Astrophysics Data System (ADS)

    Giacconi, Riccardo

    1997-03-01

    We are fortunate to live in a new heroic period of astronomy. In the second half of this century our understanding of the universe has undergone an exciting and profound change. This has been brought about by a number of factors: the development of physics, the discoveries of astronomy, the extension of observations to all wavelengths from radio to X-rays, and finally the development of computers. These new findings and tools have permitted us to elaborate new theories and models of the universe as a whole. In my own mind I see three great themes for the next century of astronomy. The first is the quest for physics. The second is the quest for origins. The third is what I could call the quest for living space. To pursue these themes we study the universe in the entire electromagnetic spectrum of wavelengths, with ever larger telescopes and ever more refined detectors and instruments, as we heard at this conference. The new facilities are producing and will produce ever larger quantities of data, in such amounts that the information cannot be received, calibrated, analyzed or even understood in traditional ways.

  12. Effect-Based Screening Methods for Water Quality Characterization Will Augment Conventional Analyte-by-Analyte Chemical Methods in Research As Well As Regulatory Monitoring

    EPA Science Inventory

    Conventional approaches to water quality characterization can provide data on individual chemical components of each water sample. This analyte-by-analyte approach currently serves many useful research and compliance monitoring needs. However these approaches, which require a ...

  13. Simulation tools for robotics research and assessment

    NASA Astrophysics Data System (ADS)

    Fields, MaryAnne; Brewer, Ralph; Edge, Harris L.; Pusey, Jason L.; Weller, Ed; Patel, Dilip G.; DiBerardino, Charles A.

    2016-05-01

    The Robotics Collaborative Technology Alliance (RCTA) program focuses on four overlapping technology areas: Perception, Intelligence, Human-Robot Interaction (HRI), and Dexterous Manipulation and Unique Mobility (DMUM). In addition, the RCTA program has a requirement to assess progress of this research in standalone as well as integrated form. Since the research is evolving and the robotic platforms with unique mobility and dexterous manipulation are in the early development stage and very expensive, an alternate approach is needed for efficient assessment. Simulation of robotic systems, platforms, sensors, and algorithms, is an attractive alternative to expensive field-based testing. Simulation can provide insight during development and debugging unavailable by many other means. This paper explores the maturity of robotic simulation systems for applications to real-world problems in robotic systems research. Open source (such as Gazebo and Moby), commercial (Simulink, Actin, LMS), government (ANVEL/VANE), and the RCTA-developed RIVET simulation environments are examined with respect to their application in the robotic research domains of Perception, Intelligence, HRI, and DMUM. Tradeoffs for applications to representative problems from each domain are presented, along with known deficiencies and disadvantages. In particular, no single robotic simulation environment adequately covers the needs of the robotic researcher in all of the domains. Simulation for DMUM poses unique constraints on the development of physics-based computational models of the robot, the environment and objects within the environment, and the interactions between them. Most current robot simulations focus on quasi-static systems, but dynamic robotic motion places an increased emphasis on the accuracy of the computational models. In order to understand the interaction of dynamic multi-body systems, such as limbed robots, with the environment, it may be necessary to build component

  14. Participant-Centric Initiatives: Tools to Facilitate Engagement In Research.

    PubMed

    Anderson, Nicholas; Bragg, Caleb; Hartzler, Andrea; Edwards, Kelly

    2012-12-01

    Clinical genomic research faces increasing challenges in establishing participant privacy and consent processes that facilitate meaningful choice and communication capacity for longitudinal and secondary research uses. There are an evolving range of participant-centric initiatives that combine web-based informatics tools with new models of engagement and research collaboration. These emerging initiatives may become valuable approaches to support large-scale and longitudinal research studies. We highlight and discuss four types of emerging initiatives for engaging and sustaining participation in research.

  15. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  16. Improving Teaching with Collaborative Action Research: An ASCD Action Tool

    ERIC Educational Resources Information Center

    Cunningham, Diane

    2011-01-01

    Once you've established a professional learning community (PLC), you need to get this ASCD (Association for Supervision and Curriculum Development) action tool to ensure that your PLC stays focused on addressing teaching methods and student learning problems. This ASCD action tool explains how your PLC can use collaborative action research to…

  17. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining connection to, and communicating the uncertainty in, the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation
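
    A minimal sketch of the underlying idea, at a single fixed cell size (the published VGM varies cell size with sample density, and this is not the authors' MapReduce code): each cell carries its mean together with an uncertainty proxy built from sample variance and count.

        import numpy as np

        def gridded_summary(x, y, z, cell=1.0):
            """Bin scattered samples into square cells; report per-cell mean,
            sample variance and count so uncertainty travels with the value.
            A fuller VGM would coarsen cells where counts fall too low."""
            cells = {}
            for xi, yi, zi in zip(x, y, z):
                cells.setdefault((int(xi // cell), int(yi // cell)), []).append(zi)
            return {
                key: (float(np.mean(v)),
                      float(np.var(v, ddof=1)) if len(v) > 1 else float("nan"),
                      len(v))
                for key, v in cells.items()
            }

        rng = np.random.default_rng(1)
        x, y = rng.uniform(0, 3, 200), rng.uniform(0, 3, 200)
        z = x + rng.normal(0, 0.2, 200)           # noisy trend in x
        for key, (mean, var, n) in sorted(gridded_summary(x, y, z).items()):
            print(key, round(mean, 2), round(var, 3), n)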

  18. Antifungal proteins from moulds: analytical tools and potential application to dry-ripened foods.

    PubMed

    Delgado, Josué; Owens, Rebecca A; Doyle, Sean; Asensio, Miguel A; Núñez, Félix

    2016-08-01

    Moulds growing on the surface of dry-ripened foods contribute to their sensory qualities, but some of them are able to produce mycotoxins that pose a hazard to consumers. Small cysteine-rich antifungal proteins (AFPs) from moulds are highly stable to pH and proteolysis and exhibit a broad inhibition spectrum against filamentous fungi, providing new opportunities to control hazardous moulds in fermented foods. The analytical tools for characterizing the cellular targets and affected pathways are reviewed. Strategies currently employed to study these mechanisms of action include 'omics' approaches that have come to the forefront in recent years, developing in tandem with genome sequencing of relevant organisms. These techniques contribute to a better understanding of the response of moulds against AFPs, allowing the design of complementary strategies to maximize their effect or overcome the limitations of using AFPs on foods. AFPs alter chitin biosynthesis, and some fungi react by inducing the cell wall integrity (CWI) pathway. However, moulds able to increase chitin content at the cell wall, by upregulating proteins in either the CWI or the calmodulin-calcineurin signalling pathway, will resist AFPs. Similarly, AFPs increase the intracellular levels of reactive oxygen species (ROS), and moulds that increase the G-protein complex β subunit CpcB and/or enzymes to efficiently produce glutathione may evade apoptosis. Unknown aspects that need to be addressed include the interaction with mycotoxin production by less sensitive toxigenic moulds. However, significant steps have been taken to encourage the use of AFPs in intermediate-moisture foods, particularly for mould-ripened cheese and meat products. PMID:27394712

  19. Analytical Implications of Using Practice Theory in Workplace Information Literacy Research

    ERIC Educational Resources Information Center

    Moring, Camilla; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace…

  20. Aligning Web-Based Tools to the Research Process Cycle: A Resource for Collaborative Research Projects

    ERIC Educational Resources Information Center

    Price, Geoffrey P.; Wright, Vivian H.

    2012-01-01

    Using John Creswell's Research Process Cycle as a framework, this article describes various web-based collaborative technologies useful for enhancing the organization and efficiency of educational research. Visualization tools (Cacoo) assist researchers in identifying a research problem. Resource storage tools (Delicious, Mendeley, EasyBib)…

  1. Innovations in scholarly communication - global survey on research tool usage.

    PubMed

    Kramer, Bianca; Bosman, Jeroen

    2016-01-01

    Many new websites and online tools have come into existence to support scholarly communication in all phases of the research workflow. To what extent researchers are using these and more traditional tools has been largely unknown. This 2015-2016 survey aimed to fill that gap. Its results may help decision making by stakeholders supporting researchers and may also help researchers wishing to reflect on their own online workflows. In addition, information on tools usage can inform studies of changing research workflows. The online survey employed an open, non-probability sample. A largely self-selected group of 20663 researchers, librarians, editors, publishers and other groups involved in research took the survey, which was available in seven languages. The survey was open from May 10, 2015 to February 10, 2016. It captured information on tool usage for 17 research activities, stance towards open access and open science, and expectations of the most important development in scholarly communication. Respondents' demographics included research roles, country of affiliation, research discipline and year of first publication. PMID:27429740

  3. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    ERIC Educational Resources Information Center

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  4. Experimental and Analytical Research on Fracture Processes in Rock

    SciTech Connect

    Herbert H. Einstein; Jay Miller; Bruno Silva

    2009-02-27

    Experimental studies on fracture propagation and coalescence were conducted which, together with previous tests by this group on gypsum and marble, provide information on fracturing. Specifically, different fracture geometries were tested, which together with the different material properties will provide the basis for analytical/numerical modeling. Initial steps on the models were made, as were initial investigations on the effect of pressurized water on fracture coalescence.

  5. CDPP tools: Promoting research and education with AMDA, 3DView and the propagation tool in space physics

    NASA Astrophysics Data System (ADS)

    Genot, Vincent; Cecconi, Baptiste

    The CDPP (Centre de Données de la Physique des Plasmas, http://cdpp.eu/), the French data center for plasma physics, has been engaged for more than a decade in the archiving and dissemination of plasma data products from space missions and ground observatories. Besides these activities, the CDPP developed an online analysis tool, AMDA (http://amda.cdpp.eu/). It enables in-depth analysis of large amounts of space physics, planetary and model data through dedicated functionalities such as visualization, data mining and cataloguing. It is used (about 250 connections per month) by scientists for their own research, but also by graduate students in the classroom and for dedicated projects. AMDA is ideally complemented by two companion tools also developed at the CDPP: 3DView (http://3dview.cdpp.eu/), which provides immersive data visualisations in planetary environments, and the Propagation Tool (http://propagationtool.cdpp.eu/), which enables tracking of solar perturbations in the heliosphere with different analytical models and white-light imaging techniques. This presentation will focus on some scientific cases combining the use of the three tools.

  6. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    PubMed

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis. PMID:27391182

  7. The WWW Cabinet of Curiosities: A Serendipitous Research Tool

    ERIC Educational Resources Information Center

    Arnold, Josie

    2012-01-01

    This paper proposes that the WWW is able to be fruitfully understood as a research tool when we utilise the metaphor of the cabinet of curiosities, the wunderkammer. It unpeels some of the research attributes of the metaphor as it reveals the multiplicity of connectivity on the web that provides serendipitous interactions between unexpected…

  9. Spec Tool; an online education and research resource

    NASA Astrophysics Data System (ADS)

    Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.

    2016-06-01

    Education and public outreach (EPO) activities related to remote sensing, space, planetary and geo-physics sciences have been developed widely in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geo-scientific and technologic disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. Because suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs, and enables a better 'hands-on' understanding of the theory of spectroscopy and imaging spectroscopy. This tool is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. It enables visualization of spectral signatures from the USGS spectral library as well as additional spectra collected in the EPIF, such as spectra of dunes in southern Israel and Turkmenistan. For researchers and educators, the tool also allows loading locally collected samples for further analysis.
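
    Of the algorithms the record lists, spectral angle mapping is the most compact to state: two spectra are compared as vectors, and the angle between them (insensitive to overall brightness) measures similarity. A generic sketch, not the Spec tool's source:

        import numpy as np

        def spectral_angle(a, b):
            """Angle in radians between two spectra; 0 means identical shape,
            and scaling a spectrum by a constant leaves the angle unchanged."""
            a = np.asarray(a, dtype=float)
            b = np.asarray(b, dtype=float)
            cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
            return float(np.arccos(np.clip(cos, -1.0, 1.0)))

        reference = [0.12, 0.35, 0.50, 0.44]        # e.g. a library signature
        measured = [0.06, 0.18, 0.25, 0.22]         # same shape, half as bright
        print(spectral_angle(reference, measured))  # ~0: a match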

  10. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    PubMed Central

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from “data-to-knowledge-to-innovation,” a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including the GeneCards®—the human gene database; the MalaCards—the human diseases database; and the PathCards—the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®—the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene–tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell “cards” in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics

  11. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including the GeneCards® (the human gene database); the MalaCards (the human diseases database); and the PathCards (the biological pathways database). Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery® (the embryonic development and stem cells database), which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics
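
    GeneAnalytics' scoring algorithms are proprietary, but gene set analysis tools of this kind conventionally rate the overlap between a query gene list and an annotated set with a hypergeometric tail probability. The sketch below shows that conventional calculation as an assumption about the general approach, not GeneAnalytics' method:

        from scipy.stats import hypergeom

        def enrichment_pvalue(universe, set_size, query_size, overlap):
            """P(overlap >= observed) when query_size genes are drawn at random
            from a universe containing set_size genes of the annotated set."""
            return float(hypergeom.sf(overlap - 1, universe, set_size, query_size))

        # 100-gene query hitting 8 members of a 40-gene pathway in a
        # 20,000-gene universe (expected overlap: 0.2 genes).
        print(enrichment_pvalue(20000, 40, 100, 8))  # vanishingly small p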

  13. Finding Collaborators: Toward Interactive Discovery Tools for Research Network Systems

    PubMed Central

    Schleyer, Titus K; Becich, Michael J; Hochheiser, Harry

    2014-01-01

    Background Research networking systems hold great promise for helping biomedical scientists identify collaborators with the expertise needed to build interdisciplinary teams. Although efforts to date have focused primarily on collecting and aggregating information, less attention has been paid to the design of end-user tools for using these collections to identify collaborators. To be effective, collaborator search tools must provide researchers with easy access to information relevant to their collaboration needs. Objective The aim was to study user requirements and preferences for research networking system collaborator search tools and to design and evaluate a functional prototype. Methods Paper prototypes exploring possible interface designs were presented to 18 participants in semistructured interviews aimed at eliciting collaborator search needs. Interview data were coded and analyzed to identify recurrent themes and related software requirements. Analysis results and elements from paper prototypes were used to design a Web-based prototype using the D3 JavaScript library and VIVO data. Preliminary usability studies asked 20 participants to use the tool and to provide feedback through semistructured interviews and completion of the System Usability Scale (SUS). Results Initial interviews identified consensus regarding several novel requirements for collaborator search tools, including chronological display of publication and research funding information, the need for conjunctive keyword searches, and tools for tracking candidate collaborators. Participant responses were positive (SUS score: mean 76.4%, SD 13.9). Opportunities for improving the interface design were identified. Conclusions Interactive, timeline-based displays that support comparison of researcher productivity in funding and publication have the potential to effectively support searching for collaborators. Further refinement and longitudinal studies may be needed to better understand the

  14. Survey design research: a tool for answering nursing research questions.

    PubMed

    Siedlecki, Sandra L; Butler, Robert S; Burchill, Christian N

    2015-01-01

    The clinical nurse specialist is in a unique position to identify and study clinical problems in need of answers, but lack of time and resources may discourage nurses from conducting research. However, some research methods can be used by the clinical nurse specialist that are not time-intensive or cost prohibitive. The purpose of this article is to explain the utility of survey methodology for answering a number of nursing research questions. The article covers survey content, reliability and validity issues, sample size considerations, and methods of survey delivery.
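
    On the sample size considerations the article covers, the standard planning formula for estimating a single proportion is n = z²p(1−p)/e², maximized at p = 0.5. A small helper (textbook formula, not code from the article):

        import math

        def sample_size_proportion(p=0.5, margin=0.05, z=1.96):
            """Respondents needed to estimate a proportion p to within
            +/- margin at ~95% confidence (z = 1.96); p = 0.5 is worst case."""
            return math.ceil(z ** 2 * p * (1.0 - p) / margin ** 2)

        print(sample_size_proportion())             # 385
        print(sample_size_proportion(margin=0.03))  # 1068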

  15. Comprehensive data resources and analytical tools for pathological association of aminoacyl tRNA synthetases with cancer

    PubMed Central

    Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon

    2015-01-01

    Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze aminoacylation of tRNAs during protein synthesis. Despite their housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, there is a lack of data resources and analytical tools that can be used to examine disease associations of ARS/AIMPs. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARS/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation and phosphorylation data of ARS/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploration of disease association of ARS/AIMPs, identification of disease-associated ARS/AIMP interactors and reconstruction of ARS-dependent disease-perturbed network models. Therefore, IDA provides both comprehensive data resources and analytical tools for understanding potential roles of ARS/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651

  16. Identifying and Tracing Persistent Identifiers of Research Resources : Automation, Metrics and Analytics

    NASA Astrophysics Data System (ADS)

    Maull, K. E.; Hart, D.; Mayernik, M. S.

    2015-12-01

    Formal and informal citations and acknowledgements for research infrastructures, such as data collections, software packages, and facilities, are an increasingly important function of attribution in scholarly literature. While such citations provide the appropriate links, even if informally, to their origins, they are often done so inconsistently, making such citations hard to analyze. While significant progress has been made in the past few years in the development of recommendations, policies, and procedures for creating and promoting citable identifiers, progress has been mixed in tracking how data sets and other digital infrastructures have actually been identified and cited in the literature. Understanding the full extent and value of research infrastructures through the lens of scholarly literature requires significant resources, and thus, we argue must rely on automated approaches that mine and track persistent identifiers to scientific resources. Such automated approaches, however, face a number of unique challenges, from the inconsistent and informal referencing practices of authors, to unavailable, embargoed or hard-to-obtain full-text resources for text analytics, to inconsistent and capricious impact metrics. This presentation will discuss work to develop and evaluate tools for automating the tracing of research resource identification and referencing in the research literature via persistent citable identifiers. Despite the impediments, automated processes are of considerable importance in enabling these traceability efforts to scale, as the numbers of identifiers being created for unique scientific resources continues to grow rapidly. Such efforts, if successful, should improve the ability to answer meaningful questions about research resources as they continue to grow as a target of advanced analyses in research metrics.
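
    One concrete building block of such automation is recognizing identifiers in harvested text at all. The pattern below follows a commonly cited Crossref recommendation for matching modern DOIs; it is an illustration, not the authors' pipeline, and no single regex catches every legacy DOI.

        import re

        # Matches the vast majority of modern (post-2009) DOIs.
        DOI_PATTERN = re.compile(r"\b10\.\d{4,9}/[-._;()/:a-zA-Z0-9]+")

        def find_dois(text):
            """Extract DOI-like strings, trimming trailing sentence punctuation."""
            return [m.rstrip(".,;") for m in DOI_PATTERN.findall(text)]

        sample = "Data are archived as doi:10.5065/D6WD3XH5 and described in 10.1002/2014GL061741."
        print(find_dois(sample))
        # ['10.5065/D6WD3XH5', '10.1002/2014GL061741']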

  17. AN ANALYTICAL APPROACH TO RESEARCH ON INSTRUCTIONAL METHODS.

    ERIC Educational Resources Information Center

    GAGE, N.L.

    The approach used at Stanford University for research on teaching is discussed, and the author explains the concepts of "technical skills," "microteaching," and "microcriteria" that were the basis of the development of this approach to research and of Stanford's secondary-teacher education program. The author presents a basic distinction between…

  18. Dataset-Driven Research to Support Learning and Knowledge Analytics

    ERIC Educational Resources Information Center

    Verbert, Katrien; Manouselis, Nikos; Drachsler, Hendrik; Duval, Erik

    2012-01-01

    In various research areas, the availability of open datasets is considered as key for research and application purposes. These datasets are used as benchmarks to develop new algorithms and to compare them to other algorithms in given settings. Finding such available datasets for experimentation can be a challenging task in technology enhanced…

  19. The Blackbird Whistling or Just After? Vygotsky's Tool and Sign as an Analytic for Writing

    ERIC Educational Resources Information Center

    Imbrenda, Jon-Philip

    2016-01-01

    Based on Vygotsky's theory of the interplay of the tool and sign functions of language, this study presents a textual analysis of a corpus of student-authored texts to illuminate aspects of development evidenced through the dialectical tension of tool and sign. Data were drawn from a series of reflective memos I authored during a seminar for new…

  20. Analytical ultracentrifugation: A versatile tool for the characterisation of macromolecular complexes in solution.

    PubMed

    Patel, Trushar R; Winzor, Donald J; Scott, David J

    2016-02-15

    Analytical ultracentrifugation, an early technique developed for characterizing quantitatively the solution properties of macromolecules, remains a powerful aid to structural biologists in their quest to understand the formation of biologically important protein complexes at the molecular level. Treatment of the basic tenets of the sedimentation velocity and sedimentation equilibrium variants of analytical ultracentrifugation is followed by considerations of the roles that it, in conjunction with other physicochemical procedures, has played in resolving problems encountered in the delineation of complex formation for three biological systems - the cytoplasmic dynein complex, mitogen-activated protein kinase (ERK2) self-interaction, and the terminal catalytic complex in selenocysteine synthesis. PMID:26555086
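
    In the sedimentation velocity variant, the sedimentation coefficient s follows from the motion of the boundary position r_b(t), since d(ln r_b)/dt = sω². A minimal fitting sketch of that textbook relation (generic, not taken from the review):

        import numpy as np

        def sedimentation_coefficient(times_s, radii_cm, rpm):
            """Fit ln r_b(t) = ln r_0 + s * omega^2 * t and return s in
            svedbergs (1 S = 1e-13 s)."""
            omega = 2.0 * np.pi * rpm / 60.0                    # rad/s
            slope = np.polyfit(np.asarray(times_s, dtype=float),
                               np.log(np.asarray(radii_cm, dtype=float)), 1)[0]
            return slope / omega ** 2 / 1.0e-13

        # Synthetic boundary positions for a 4 S particle at 40,000 rpm.
        t = np.linspace(0.0, 3600.0, 7)
        omega2 = (2.0 * np.pi * 40000.0 / 60.0) ** 2
        r = 6.0 * np.exp(4.0e-13 * omega2 * t)
        print(sedimentation_coefficient(t, r, 40000.0))  # ~4.0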

  1. An Analytic Tool to Investigate the Effect of Binder on the Sensitivity of HMX-Based Plastic Bonded Explosives in the Skid Test

    SciTech Connect

    Hayden, D. W.

    2004-11-01

    This project will develop an analytical tool to calculate the performance of HMX-based PBXs in the skid test. The skid test is used as a means to measure sensitivity for large charges in handling situations. Each series of skid tests requires dozens of drops of large billets. It is proposed that the reaction (or lack of one) of PBXs in the skid test is governed by the mechanical properties of the binder. If true, one might be able to develop an analytical tool to estimate skid-test behavior for new PBX formulations. Others over the past 50 years have tried to develop similar models. This project will research and summarize the works of others and couple the work of three of them into an analytical tool that can be run on a PC to calculate drop height of HMX-based PBXs. Detonation due to dropping a billet is argued to be a dynamic thermal event. To avoid detonation, the heat created by friction at impact must be conducted into the charge or the target faster than the chemical kinetics can create additional energy. The methodology will involve numerically solving the Frank-Kamenetskii equation in one dimension. The analytical problem needs to be bounded in terms of how much heat is introduced to the billet and for how long. Assuming an inelastic collision with no rebound, the billet will be in contact with the target for a short duration determined by the equations of motion. For the purposes of the calculations, it will be assumed that if a detonation is to occur, it will transpire within that time. The surface temperature will be raised according to the friction created, using the equations of motion of dropping the billet on a rigid surface. The study will connect the works of Charles Anderson, Alan Randolph, Larry Hatler, Alfonse Popolato, and Charles Mader into a single PC-based analytic tool. Anderson's equations of motion will be used to calculate the temperature rise upon impact; the time this temperature is maintained (contact time) will be obtained from the work of
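
    A minimal sketch of the proposed numerical core: an explicit finite-difference solve of the one-dimensional Frank-Kamenetskii balance ∂T/∂t = α ∂²T/∂x² + (QA/ρc)·exp(−E/RT), with frictional heating idealized as a fixed hot contact surface. Every parameter below is an invented, order-of-magnitude placeholder, not vetted HMX data.

        import numpy as np

        ALPHA = 2.0e-7    # thermal diffusivity, m^2/s (placeholder)
        QA = 1.0e15       # (Q/(rho*c)) * A, self-heating prefactor in K/s (placeholder)
        E_OVER_R = 2.2e4  # activation temperature E/R, K (placeholder)

        def frank_kamenetskii_1d(T_contact=700.0, T0=300.0, L=5.0e-3,
                                 nx=101, t_end=0.5):
            """Explicit FTCS march of dT/dt = ALPHA*T_xx + QA*exp(-E_OVER_R/T)
            on [0, L]: Dirichlet hot face at x=0, insulated face at x=L.
            Thermal runaway would appear as unbounded interior growth."""
            dx = L / (nx - 1)
            dt = 0.4 * dx * dx / ALPHA        # stable step for the diffusion term
            T = np.full(nx, T0)
            T[0] = T_contact
            for _ in range(int(t_end / dt)):
                T_xx = (np.roll(T, -1) - 2.0 * T + np.roll(T, 1)) / dx ** 2
                T = T + dt * (ALPHA * T_xx + QA * np.exp(-E_OVER_R / T))
                T[0] = T_contact              # heated contact surface
                T[-1] = T[-2]                 # insulated interior boundary
            return T

        profile = frank_kamenetskii_1d()
        print(profile[:4])   # thermal wave penetrating from the hot face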

  2. A clinical research analytics toolkit for cohort study.

    PubMed

    Yu, Yiqin; Zhu, Yu; Sun, Xingzhi; Tao, Ying; Zhang, Shuo; Xu, Linhao; Pan, Yue

    2012-01-01

    This paper presents a clinical informatics toolkit that can assist physicians to conduct cohort studies effectively and efficiently. The toolkit has three key features: 1) support of procedures defined in epidemiology, 2) recommendation of statistical methods in data analysis, and 3) automatic generation of research reports. On one hand, our system can help physicians control research quality by leveraging the integrated knowledge of epidemiology and medical statistics; on the other hand, it can improve productivity by reducing the complexities for physicians during their cohort studies.

  3. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    PubMed Central

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  4. Trends in Behavior-Analytic Gambling Research and Treatment.

    PubMed

    Dixon, Mark R; Whiting, Seth W; Gunnarsson, Karl F; Daar, Jacob H; Rowsey, Kyle E

    2015-10-01

    The purpose of the present review was to analyze research outcomes for all gambling studies reported in the behavior analysis literature. We used the search term "gambling" to identify articles that were published in behaviorally oriented journals between the years 1992 and 2012 and categorized the content of each article as empirical or conceptual. Next, we examined and categorized the empirical articles by inclusion of an experimental manipulation and treatment to alleviate at least some aspect of pathological gambling, participant population used, type of gambling task employed in the research, whether the participants in the study actually gambled, and the behavioral phenomena of interest. The results show that the rate of publication of gambling research has increased in the last 6 years, and a vast majority of articles are empirical. Of the empirical articles, examinations of treatment techniques or methods are scarce; slot machine play is the most represented form of gambling, and slightly greater than half of the research included compensation based on gambling outcomes within experiments. We discuss implications and future directions based on these observations of the published literature. PMID:27606170

  5. International News Communication Research: A Meta-Analytic Assessment.

    ERIC Educational Resources Information Center

    Tsang, Kuo-jen

    A survey of "Journalism Quarterly," "Gazette," "Public Opinion Quarterly," "Journal of Broadcasting," and "Journal of Communication" reveals that the early research on international news flow or coverage emphasized two aspects of news: (1) how the United States was portrayed in the media of other nations, and (2) what the effect of American society…

  6. Using High-Tech Tools for Student Research.

    ERIC Educational Resources Information Center

    Plati, Thomas

    1988-01-01

    Discusses incorporating high technology research tools into the curriculum for grades 5 through 12 in Shrewsbury, Massachusetts, public schools. The use of CD-ROM and online databases is described, teacher training is discussed, and steps to integrate this new technology are listed, including budget proposals and evaluation. (LRW)

  7. Measurement and Research Tools. Symposium 37. [AHRD Conference, 2001].

    ERIC Educational Resources Information Center

    2001

    This symposium on measurement and research tools consists of three presentations. "An Examination of the Multiple Intelligences Developmental Assessment Scales (MIDAS)" (Albert Wiswell et al.) explores MIDAS's psychometric saliency. Findings indicate that this instrument represents an incomplete attempt to develop a valid assessment of multiple…

  8. A Tool for Mapping Research Skills in Undergraduate Curricula

    ERIC Educational Resources Information Center

    Fraser, Gillian A.; Crook, Anne C.; Park, Julian R.

    2007-01-01

    There has been considerable interest recently in the teaching of skills to undergraduate students. However, existing methods for collating data on how much, where and when students are taught and assessed skills have often been shown to be time-consuming and ineffective. Here, we outline an electronic research skills audit tool that has been…

  9. Database Advisor: A New Tool for K-12 Research Projects.

    ERIC Educational Resources Information Center

    Berteaux, Susan S.; Strong, Sandra S.

    The Database Advisor (DBA) is a tool designed to guide users to the most appropriate World Wide Web-based databases for their research. Developed in 1997 by the Science Libraries at the University of California, San Diego (UCSD), DBA is a Web-based front-end to bibliographic and full-text databases to which UCSD has remote access. DBA allows the…

  10. Evaluating the Performance of Calculus Classes Using Operational Research Tools.

    ERIC Educational Resources Information Center

    Soares de Mello, Joao Carlos C. B.; Lins, Marcos P. E.; Soares de Mello, Maria Helena C.; Gomes, Eliane G.

    2002-01-01

    Compares the efficiency of calculus classes and evaluates two kinds of classes: traditional classes and those that use computational methods in teaching. Applies quantitative evaluation methods using two operational research tools: multicriteria decision aid methods (mainly the MACBETH approach) and data envelopment analysis. (Author/YDS)
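
    A compact sketch of the second of these tools: input-oriented CCR data envelopment analysis posed as a linear program. The class data below are invented for illustration; the inputs and outputs are hypothetical stand-ins, not the paper's variables.

      import numpy as np
      from scipy.optimize import linprog

      # One row per class (DMU): inputs (e.g. contact hours, entry grade) and
      # outputs (e.g. pass rate, mean final mark). Values are invented.
      X = np.array([[40.0, 6.2], [36.0, 5.8], [44.0, 6.5]])     # inputs
      Y = np.array([[0.70, 62.0], [0.65, 58.0], [0.80, 66.0]])  # outputs

      def ccr_efficiency(o):
          """Multiplier form: max u.y_o  s.t.  v.x_o = 1,
             u.y_j - v.x_j <= 0 for all DMUs j,  u, v >= 0."""
          n_out, n_in = Y.shape[1], X.shape[1]
          c = np.concatenate([-Y[o], np.zeros(n_in)])   # linprog minimizes
          A_ub = np.hstack([Y, -X])                     # u.y_j - v.x_j <= 0
          b_ub = np.zeros(len(X))
          A_eq = np.concatenate([np.zeros(n_out), X[o]]).reshape(1, -1)
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                        bounds=(0, None))
          return -res.fun

      for o in range(len(X)):
          print(f"class {o}: CCR efficiency = {ccr_efficiency(o):.3f}")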

  11. Facilitating Metacognitive Talk: A Research and Learning Tool

    ERIC Educational Resources Information Center

    Wall, Kate; Higgins, Steve

    2006-01-01

    This paper describes a research tool which aims to gather data about pupils' views of learning and teaching, with a particular focus on their thinking about their learning (metacognition). The approach has proved to be an adaptable and effective technique to examine different learning contexts from the pupils' perspective, while also acting as an…

  12. Research for research: tools for knowledge discovery and visualization.

    PubMed Central

    Van Mulligen, Erik M.; Van Der Eijk, Christiaan; Kors, Jan A.; Schijvenaars, Bob J. A.; Mons, Barend

    2002-01-01

    This paper describes a method to construct from a set of documents a spatial representation that can be used for information retrieval and knowledge discovery. The proposed method has been implemented in a prototype system and allows the researcher to browse, interactively and in real time, a network of relationships obtained from a set of full-text articles. These relationships are combined with the potential relationships between concepts as defined in the UMLS semantic network. The browser allows the user to select a seed term and find all related concepts, to find a path between concepts (hypothesis testing), and to retrieve the references to documents or database entries that support the relationship between concepts. PMID:12463942
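
    A toy sketch of the three browsing operations described (neighbours of a seed term, a path between concepts, and retrieval of supporting documents), using networkx; the concepts, edge costs, and document IDs are all invented for the example.

      import networkx as nx

      g = nx.Graph()
      g.add_edge("gene X", "protein Y", cost=0.2, docs=["PMID:111", "PMID:222"])
      g.add_edge("protein Y", "pathway Z", cost=0.4, docs=["PMID:333"])
      g.add_edge("pathway Z", "disease D", cost=0.1, docs=["PMID:444"])
      g.add_edge("gene X", "disease D", cost=0.9, docs=["PMID:555"])

      # all concepts directly related to a seed term
      print(sorted(g.neighbors("gene X")))

      # "hypothesis testing": cheapest chain of relationships between two concepts
      path = nx.shortest_path(g, "gene X", "disease D", weight="cost")
      print(path)   # ['gene X', 'protein Y', 'pathway Z', 'disease D']

      # references supporting each relationship along the path
      print([g.edges[u, v]["docs"] for u, v in zip(path, path[1:])])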

  13. Focus group discussion: a tool for health and medical research.

    PubMed

    Wong, L P

    2008-03-01

    Focus group discussion is a research methodology in which a small group of participants gathers to discuss a specified topic or issue to generate data. The main characteristic of a focus group is the interaction between the moderator and the group, as well as the interaction between group members. The objective is to give the researcher an understanding of the participants' perspectives on the topic under discussion. Focus groups are rapidly gaining popularity in health and medical research. This paper presents a general introduction to the use of focus groups as a research tool within the context of health research, with the intention of promoting their use among healthcare researchers. A detailed methodology for the conduct of focus groups and the analysis of focus group data is discussed. The potentials and limitations of this qualitative research technique are also highlighted.

  14. Visual analytical tool for evaluation of 10-year perioperative transfusion practice at a children's hospital.

    PubMed

    Gálvez, Jorge A; Ahumada, Luis; Simpao, Allan F; Lin, Elaina E; Bonafide, Christopher P; Choudhry, Dhruv; England, William R; Jawad, Abbas F; Friedman, David; Sesok-Pizzini, Debora A; Rehman, Mohamed A

    2014-01-01

    Children are a vulnerable population in the operating room, and are particularly at risk of complications from unanticipated hemorrhage. The decision to prepare blood products prior to surgery varies depending on the personal experience of the clinician caring for the patient. We present the first application of a data visualization technique to study large datasets in the context of blood product transfusions at a tertiary pediatric hospital. The visual analytical interface allows real-time interaction with datasets from 230 000 procedure records. Clinicians can use the visual analytical interface to analyze blood product usage based on procedure- and patient-specific factors, and then use that information to guide policies for ordering blood products.
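
    A minimal mock-up of the kind of question such an interface answers, sketched with pandas; the procedure names, column names, and values are hypothetical, standing in for the 230 000-record dataset described above.

      import pandas as pd

      records = pd.DataFrame({
          "procedure": ["craniotomy", "craniotomy", "appendectomy",
                        "scoliosis repair", "scoliosis repair", "scoliosis repair"],
          "rbc_units_transfused": [1, 0, 0, 2, 1, 0],
      })
      summary = (records
                 .assign(transfused=records["rbc_units_transfused"] > 0)
                 .groupby("procedure")
                 .agg(cases=("transfused", "size"),
                      transfusion_rate=("transfused", "mean"),
                      mean_units=("rbc_units_transfused", "mean")))
      print(summary)   # per-procedure case counts and transfusion rates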

  15. Biosensors as new analytical tool for detection of Genetically Modified Organisms (GMOs).

    PubMed

    Minunni, M; Tombelli, S; Mariotti, E; Mascini, M; Mascini, M

    2001-04-01

    Three different biosensors for the detection of Genetically Modified Organisms (GMOs) are presented. The sensing principle is based on the affinity interaction between nucleic acids: the probe is immobilised on the sensor surface and the target analyte is free in solution. The immobilised probes are specific for the sequences most commonly inserted in GMOs: the promoter P35S and the terminator TNOS. Electrochemical methods with screen-printed electrodes, and piezoelectric and optical (SPR) transduction principles, were applied.

  16. Benchmarking biology research organizations using a new, dedicated tool.

    PubMed

    van Harten, Willem H; van Bokhorst, Leonard; van Luenen, Henri G A M

    2010-02-01

    International competition forces fundamental research organizations to assess their relative performance. We present a benchmark tool for scientific research organizations where, contrary to existing models, the group leader is placed in a central position within the organization. We used it in a pilot benchmark study involving six research institutions. Our study shows that data collection and data comparison based on this new tool can be achieved. It proved possible to compare relative performance and organizational characteristics and to generate suggestions for improvement for most participants. However, strict definitions of the parameters used for the benchmark and a thorough insight into the organization of each of the benchmark partners is required to produce comparable data and draw firm conclusions.

  17. Technical phosphoproteomic and bioinformatic tools useful in cancer research

    PubMed Central

    2011-01-01

    Reversible protein phosphorylation is one of the most important forms of cellular regulation, so phosphoproteomic analysis of protein phosphorylation in cells is a powerful tool to evaluate cell functional status. The importance of protein kinase-regulated signal transduction pathways in human cancer has led to the development of drugs that inhibit protein kinases at the apex or intermediary levels of these pathways. Phosphoproteomic analysis of these signalling pathways will provide important insights into their operation and connectivity, facilitating identification of the best targets for cancer therapies. Enrichment of phosphorylated proteins or peptides from tissue or bodily-fluid samples is required. The application of technologies such as phosphopeptide enrichment and mass spectrometry (MS), coupled with bioinformatic tools, is crucial for identifying and quantifying protein phosphorylation sites in this clinically relevant research. A combination of different phosphopeptide enrichments, quantitative techniques and bioinformatic tools is necessary to obtain good phospho-regulation data and good structural analysis in protein studies. The most current and useful proteomic and bioinformatic techniques are explained with research examples. Our aim in this article is to aid cancer research by detailing these proteomic and bioinformatic tools. PMID:21967744

  18. Technical phosphoproteomic and bioinformatic tools useful in cancer research.

    PubMed

    López, Elena; Wesselink, Jan-Jaap; López, Isabel; Mendieta, Jesús; Gómez-Puertas, Paulino; Muñoz, Sarbelio Rodríguez

    2011-01-01

    Reversible protein phosphorylation is one of the most important forms of cellular regulation, so phosphoproteomic analysis of protein phosphorylation in cells is a powerful tool to evaluate cell functional status. The importance of protein kinase-regulated signal transduction pathways in human cancer has led to the development of drugs that inhibit protein kinases at the apex or intermediary levels of these pathways. Phosphoproteomic analysis of these signalling pathways will provide important insights into their operation and connectivity, facilitating identification of the best targets for cancer therapies. Enrichment of phosphorylated proteins or peptides from tissue or bodily-fluid samples is required. The application of technologies such as phosphopeptide enrichment and mass spectrometry (MS), coupled with bioinformatic tools, is crucial for identifying and quantifying protein phosphorylation sites in this clinically relevant research. A combination of different phosphopeptide enrichments, quantitative techniques and bioinformatic tools is necessary to obtain good phospho-regulation data and good structural analysis in protein studies. The most current and useful proteomic and bioinformatic techniques are explained with research examples. Our aim in this article is to aid cancer research by detailing these proteomic and bioinformatic tools. PMID:21967744

  19. The use of analytical surface tools in the fundamental study of wear. [atomic nature of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    Various techniques and surface tools available for the study of the atomic nature of the wear of materials are reviewed. These include chemical etching, x-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear-surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  20. Infrared Spectroscopy as a Versatile Analytical Tool for the Quantitative Determination of Antioxidants in Agricultural Products, Foods and Plants

    PubMed Central

    Cozzolino, Daniel

    2015-01-01

    Spectroscopic methods provide very useful qualitative and quantitative information about the biochemistry and chemistry of antioxidants. Near infrared (NIR) and mid infrared (MIR) spectroscopy are considered powerful, fast, accurate and non-destructive analytical tools that can serve as a replacement for traditional chemical analysis. In recent years, several reports in the literature have demonstrated the usefulness of these methods in the analysis of antioxidants in different organic matrices. This article reviews recent applications of infrared (NIR and MIR) spectroscopy to the analysis of antioxidant compounds in a wide range of samples such as agricultural products, foods and plants. PMID:26783838

  1. Energy-dispersive X-ray fluorescence systems as analytical tool for assessment of contaminated soils.

    PubMed

    Vanhoof, Chris; Corthouts, Valère; Tirez, Kristof

    2004-04-01

    To determine the heavy metal content of soil samples at contaminated locations, a static and time-consuming procedure is used in most cases: soil samples are collected and analyzed in the laboratory at high quality and high analytical cost. Demand is growing from government and consultants for a more dynamic approach, and from customers for analyses performed in the field with immediate feedback of the analytical results. Field analyses are advisable especially during the follow-up of remediation projects or when determining the sampling strategy. For this purpose four types of ED-XRF systems, ranging from portable up to high-performance laboratory systems, have been evaluated. The evaluation criteria are based on the performance characteristics of the ED-XRF systems, such as limit of detection, accuracy, and measurement uncertainty, on one hand, and on the influence of sample pretreatment on the obtained results on the other. The study proved that the field-portable system and the benchtop system, placed in a mobile van, can be applied as field techniques, yielding semi-quantitative analytical results. A limited homogenization of the analyzed sample significantly increases the representativeness of the soil sample. The ED-XRF systems can be differentiated by their limits of detection, which are a factor of 10 to 20 higher for the portable system. The accuracy of the results and the measurement uncertainty also improved using the benchtop system. Therefore, the selection criteria for the applicability of both field systems are based on the required detection level and the required accuracy of the results.

  2. INTRODUCTION TO ATTILA (ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS): AN ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, wat...

  3. Analytical Tools To Distinguish the Effects of Localization Error, Confinement, and Medium Elasticity on the Velocity Autocorrelation Function

    PubMed Central

    Weber, Stephanie C.; Thompson, Michael A.; Moerner, W.E.; Spakowitz, Andrew J.; Theriot, Julie A.

    2012-01-01

    Single particle tracking is a powerful technique for investigating the dynamic behavior of biological molecules. However, many of the analytical tools are prone to generate results that can lead to mistaken interpretations of the underlying transport process. Here, we explore the effects of localization error and confinement on the velocity autocorrelation function, Cυ. We show that calculation of Cυ across a range of discretizations can distinguish the effects of localization error, confinement, and medium elasticity. Thus, under certain regimes, Cυ can be used as a diagnostic tool to identify the underlying mechanism of anomalous diffusion. Finally, we apply our analysis to experimental data sets of chromosomal loci and RNA-protein particles in Escherichia coli. PMID:22713559
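
    A short numpy sketch of the diagnostic described: velocities are estimated over a window of delta frames, and the normalized velocity autocorrelation is computed for several discretizations. The synthetic track (Brownian steps plus localization error) and all parameter values are illustrative only.

      import numpy as np

      def velocity_autocorrelation(x, dt, delta, max_lag):
          """Normalized C_v(tau) with velocities estimated over `delta` frames."""
          v = (x[delta:] - x[:-delta]) / (delta * dt)   # finite-difference velocities
          cv = np.array([np.mean(v[:len(v) - lag] * v[lag:])
                         for lag in range(max_lag)])
          return np.arange(max_lag) * dt, cv / cv[0]

      rng = np.random.default_rng(0)
      true_pos = np.cumsum(rng.normal(0.0, 0.05, 5000))   # pure Brownian track
      x = true_pos + rng.normal(0.0, 0.02, 5000)          # add localization error
      for delta in (1, 2, 5, 10):
          tau, cv = velocity_autocorrelation(x, dt=0.1, delta=delta, max_lag=20)
          # localization error produces a negative dip at lag 1 for small delta
          print(f"delta={delta:2d}  C_v at lag 1: {cv[1]:+.3f}")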

  4. Analytic model for academic research productivity having factors, interactions and implications

    PubMed Central

    2011-01-01

    Financial support is dear in academia and will tighten further. How can the research mission be accomplished within new restraints? A model is presented for evaluating source components of academic research productivity. It comprises six factors: funding; investigator quality; efficiency of the research institution; the research mix of novelty, incremental advancement, and confirmatory studies; analytic accuracy; and passion. Their interactions produce output and patterned influences between factors. Strategies for optimizing output are enabled. PMID:22130145

  5. Reconceptualizing vulnerability: deconstruction and reconstruction as a postmodern feminist analytical research method.

    PubMed

    Glass, Nel; Davis, Kierrynn

    2004-01-01

    Nursing research informed by postmodern feminist perspectives has prompted many debates in recent times. While this is so, nurse researchers who have been tempted to break new ground have had few examples of appropriate analytical methods for a research design informed by the above perspectives. This article presents a deconstructive/reconstructive secondary analysis of a postmodern feminist ethnography in order to provide an analytical exemplar. In doing so, previous notions of vulnerability as a negative state have been challenged and reconstructed. PMID:15206680

  6. Microfluidics as a tool for C. elegans research.

    PubMed

    San-Miguel, Adriana; Lu, Hang

    2013-09-24

    Microfluidics has emerged as a set of powerful tools that have greatly advanced some areas of biological research, including research using C. elegans. The use of microfluidics has enabled many experiments that are otherwise impossible with conventional methods. Today there are many examples that demonstrate the main advantages of using microfluidics for C. elegans research, achieving precise environmental conditions and facilitating worm handling. Examples range from behavioral analysis under precise chemical or odor stimulation, locomotion studies in well-defined structural surroundings, and even long-term culture on chip. Moreover, microfluidics has enabled coupling worm handling and imaging thus facilitating genetic screens, optogenetic studies, and laser ablation experiments. In this article, we review some of the applications of microfluidics for C. elegans research and provide guides for the design, fabrication, and use of microfluidic devices for C. elegans research studies.

  7. Integrating research training into residency: tools of human investigation.

    PubMed

    Oxnard, Geoffrey R; Zinkus, Tanya Milosh; Bazari, Hasan; Wolf, Myles

    2009-09-01

    Although the need for new physician-clinical scientists has never been greater, significant obstacles deter young physicians from careers in clinical research. Local and federal programs have sought to stimulate interest in clinical research among young physicians, medical students, and even undergraduates, but few formal programs have specifically focused on stimulating interest among residents in training. The recent implementation of strict duty hours regulations has provided residents with additional time to focus on career choices, and this has created an opportunity for training programs to offer new educational initiatives during residency. The authors present Tools of Human Investigation (THI), a two-week rotation offered during the second year of residency. The goals of THI are to provide seminar-based exposure to research methodologies, to impart the tools required to critically appraise the scientific literature, and to provide a small-group forum for career discussions. These three goals are achieved by drawing on a group of research faculty to lead sessions that combine didactics with career development guidance. A course like THI is one innovative way to stimulate interest in human research during residency that could help bridge the discontinuity between the research explorations promoted during medical school and the rigorous expectations of fellowship.

  8. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based Graduate-Level Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Toh, Chee-Seng

    2007-01-01

    A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.

  9. Electrochemical treatment of olive mill wastewater: treatment extent and effluent phenolic compounds monitoring using some uncommon analytical tools.

    PubMed

    Belaid, Chokri; Khadraoui, Moncef; Mseddii, Salma; Kallel, Monem; Elleuch, Boubaker; Fauvarque, Jean François

    2013-01-01

    Problems related to industrial effluents can be divided into two parts: (1) their toxicity, associated with their chemical content, which should be removed before discharging the wastewater into the receiving media; and (2) the difficulty of characterising and monitoring the pollution, caused by the complexity of these matrices. This investigation deals with both aspects: an electrochemical treatment of olive mill wastewater (OMW) on platinized expanded-titanium electrodes in a modified Grignard reactor for toxicity removal, and the exploration of some specific analytical tools to monitor the elimination of phenolic compounds from the effluent. The results showed that electrochemical oxidation is able to remove or mitigate the OMW pollution. Indeed, 87% of the OMW color was removed and all aromatic compounds disappeared from the solution by anodic oxidation. Moreover, the chemical oxygen demand (COD) and the total organic carbon (TOC) were reduced by 55%. On the other hand, UV-visible spectrophotometry, gas chromatography/mass spectrometry, cyclic voltammetry and 13C nuclear magnetic resonance (NMR) showed that the treatment efficiently eliminates phenolic compounds from OMW. It was concluded that electrochemical oxidation in a modified Grignard reactor is a promising process for the destruction of all phenolic compounds present in OMW. Among the monitoring analytical tools applied, cyclic voltammetry and 13C NMR are introduced here for the first time to follow the progress of OMW treatment, and they gave close insight into the disappearance of the polyphenols.

  10. Analytical continuation in physical geodesy constructed by means of tools and formulas related to an ellipsoid of revolution

    NASA Astrophysics Data System (ADS)

    Holota, Petr; Nesvadba, Otakar

    2014-05-01

    In physical geodesy, mathematical tools applied for solving problems of potential theory are often essentially associated with the concept of the so-called spherical approximation (interpreted as a mapping). The same holds true for the method of analytical (harmonic) continuation, which is frequently considered a means of converting ground gravity anomalies or disturbances to corresponding values on a level surface close to the original boundary. In the development and implementation of this technique, the key role is played by the representation of a harmonic function by means of the famous Poisson formula and by the construction of a radial derivative operator on the basis of this formula. In this contribution an attempt is made to avoid the spherical approximation mentioned above and to develop mathematical tools that allow implementation of the concept of analytical continuation also in a more general case, in particular for converting ground gravity anomalies or disturbances to corresponding values on the surface of an oblate ellipsoid of revolution. The respective integral kernels are constructed with the aid of series of ellipsoidal harmonics and their summation; the mathematical nature of the boundary data is also discussed in more detail.
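
    For orientation, the classical spherical building block that this contribution generalizes to the ellipsoid is the exterior Poisson integral, stated here in generic textbook notation (this rendering is not quoted from the paper itself):

      \[
        T(r,\Omega) = \frac{R\,(r^{2}-R^{2})}{4\pi}
        \int_{\sigma} \frac{T(R,\Omega')}{\ell^{3}}\, d\sigma' ,
        \qquad
        \ell = \sqrt{r^{2}+R^{2}-2rR\cos\psi},
      \]
      % psi: spherical distance between the computation and integration points.
      % The radial derivative operator used for analytical continuation follows
      % by differentiating this kernel with respect to r.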

  11. Towards a minimally invasive sampling tool for high resolution tissue analytical mapping

    NASA Astrophysics Data System (ADS)

    Gottardi, R.

    2015-09-01

    Multiple spatial mapping techniques of biological tissues have been proposed over the years, but all present limitations either in terms of resolution, analytical capacity or invasiveness. Ren et al (2015 Nanotechnology 26 284001) propose in their most recent work the use of a picosecond infrared laser (PIRL) under conditions of ultrafast desorption by impulsive vibrational excitation (DIVE) to extract small amounts of cellular and molecular components, conserving their viability, structure and activity. The PIRL DIVE technique would then work as a nanobiopsy with minimal damage to the surrounding tissues, which could potentially be applied for high resolution local structural characterization of tissues in health and disease with the spatial limit determined by the laser focus.

  12. ELISA and GC-MS as Teaching Tools in the Undergraduate Environmental Analytical Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Wilson, Ruth I.; Mathers, Dan T.; Mabury, Scott A.; Jorgensen, Greg M.

    2000-12-01

    An undergraduate experiment for the analysis of potential water pollutants is described. Students are exposed to two complementary techniques, ELISA and GC-MS, for the analysis of a water sample containing atrazine, desethylatrazine, and simazine. Atrazine was chosen as the target analyte because of its wide usage in North America and its utility for students to predict environmental degradation products. The water sample is concentrated using solid-phase extraction for GC-MS, or diluted and analyzed using a competitive ELISA test kit for atrazine. The nature of the water sample is such that students generally find that ELISA gives an artificially high value for the concentration of atrazine. Students gain an appreciation for problems associated with measuring pollutants in the aqueous environment: sensitivity, accuracy, precision, and ease of analysis. This undergraduate laboratory provides an opportunity for students to learn several new analysis and sample preparation techniques and to critically evaluate these methods in terms of when they are most useful.

  13. Tools for Large-Scale Data Analytic Examination of Relational and Epistemic Networks in Engineering Education

    ERIC Educational Resources Information Center

    Madhavan, Krishna; Johri, Aditya; Xian, Hanjun; Wang, G. Alan; Liu, Xiaomo

    2014-01-01

    The proliferation of digital information technologies and related infrastructure has given rise to novel ways of capturing, storing and analyzing data. In this paper, we describe the research and development of an information system called Interactive Knowledge Networks for Engineering Education Research (iKNEER). This system utilizes a framework…

  14. Genephony: a knowledge management tool for genome-wide research

    PubMed Central

    Nuzzo, Angelo; Riva, Alberto

    2009-01-01

    Background One of the consequences of the rapid and widespread adoption of high-throughput experimental technologies is an exponential increase in the amount of data produced by genome-wide experiments. Researchers increasingly need to handle very large volumes of heterogeneous data, including both the data generated by their own experiments and the data retrieved from publicly available repositories of genomic knowledge. Integration, exploration, manipulation and interpretation of data and information therefore need to become as automated as possible, since their scale and breadth are, in general, beyond the limits of what individual researchers and the basic data management tools in normal use can handle. This paper describes Genephony, a tool we are developing to address these challenges. Results We describe how Genephony can be used to manage large datasets of genomic information, integrating them with existing knowledge repositories. We illustrate its functionalities with an example of a complex annotation task, in which a set of SNPs coming from a genotyping experiment is annotated with genes known to be associated with a phenotype of interest. We show how, thanks to the modular architecture of Genephony and its user-friendly interface, this task can be performed in a few simple steps. Conclusion Genephony is an online tool for the manipulation of large datasets of genomic information. It can be used as a browser for genomic data, as a high-throughput annotation tool, and as a knowledge discovery tool. It is designed to be easy to use, flexible and extensible. Its knowledge management engine provides fine-grained control over individual data elements, as well as efficient operations on large datasets. PMID:19728881

  15. FOSS Tools for Research Infrastructures - A Success Story?

    NASA Astrophysics Data System (ADS)

    Stender, V.; Schroeder, M.; Wächter, J.

    2015-12-01

    Established initiatives and mandated organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. The basic idea behind these infrastructures is the provision of services that support scientists in searching, visualizing and accessing data, in collaborating and exchanging information, and in publishing data and other results. The management of research data in particular is gaining more and more importance. In the geosciences these developments have to be merged with the enhanced data management approaches of Spatial Data Infrastructures (SDI). The Centre for GeoInformationTechnology (CeGIT) at the GFZ German Research Centre for Geosciences aims to establish SDI concepts and standards as an integral part of research infrastructure architectures. In different projects, solutions to manage research data for land and water management or environmental monitoring have been developed based on a framework consisting of Free and Open Source Software (FOSS) components. The framework provides basic components supporting the import and storage of data, discovery and visualization, as well as data documentation (metadata). In our contribution, we present the data management solutions developed in three projects, Central Asian Water (CAWa), Sustainable Management of River Oases (SuMaRiO) and Terrestrial Environmental Observatories (TERENO), in which FOSS components form the backbone of the data management platform. The multiple use and validation of these tools helped to establish a standardized architectural blueprint serving as a contribution to research infrastructures. We examine the question of whether FOSS tools are really a sustainable choice and whether the increased maintenance effort is justified. Finally, it should help to answer the question of whether the use of FOSS for Research Infrastructures is a

  16. Mineotaur: a tool for high-content microscopy screen sharing and visual analytics.

    PubMed

    Antal, Bálint; Chessel, Anatole; Carazo Salas, Rafael E

    2015-01-01

    High-throughput/high-content microscopy-based screens are powerful tools for functional genomics, yielding intracellular information down to the level of single cells for thousands of genotypic conditions. However, accessing their data requires specialized knowledge, and most often the data are no longer analyzed after initial publication. We describe Mineotaur (http://www.mineotaur.org), an open-source, downloadable web application that allows easy online sharing and interactive visualisation of large screen datasets, facilitating their dissemination and further analysis and enhancing their impact.

  17. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study investigated first the homogenization kinetics of the different systems to focus then on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was equally sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for development and production of pharmaceutical suspensions in the future.

  18. Magnetic optical sensor particles: a flexible analytical tool for microfluidic devices.

    PubMed

    Ungerböck, Birgit; Fellinger, Siegfried; Sulzer, Philipp; Abel, Tobias; Mayr, Torsten

    2014-05-21

    In this study we evaluate magnetic optical sensor particles (MOSePs) with incorporated sensing functionalities regarding their applicability in microfluidic devices. MOSePs can be separated from the surrounding solution to form in situ sensor spots within microfluidic channels, while read-out is accomplished outside the chip. These magnetic sensor spots exhibit benefits of sensor layers (high brightness and convenient usage) combined with the advantages of dispersed sensor particles (ease of integration). The accumulation characteristics of MOSePs with different diameters were investigated as well as the in situ sensor spot stability at varying flow rates. Magnetic sensor spots were stable at flow rates specific to microfluidic applications. Furthermore, MOSePs were optimized regarding fiber optic and imaging read-out systems, and different referencing schemes were critically discussed on the example of oxygen sensors. While the fiber optic sensing system delivered precise and accurate results for measurement in microfluidic channels, limitations due to analyte consumption were found for microscopic oxygen imaging. A compensation strategy is provided, which utilizes simple pre-conditioning by exposure to light. Finally, new application possibilities were addressed, being enabled by the use of MOSePs. They can be used for microscopic oxygen imaging in any chip with optically transparent covers, can serve as flexible sensor spots to monitor enzymatic activity or can be applied to form fixed sensor spots inside microfluidic structures, which would be inaccessible to integration of sensor layers.

  19. Tools for the quantitative analysis of sedimentation boundaries detected by fluorescence optical analytical ultracentrifugation.

    PubMed

    Zhao, Huaying; Casillas, Ernesto; Shroff, Hari; Patterson, George H; Schuck, Peter

    2013-01-01

    Fluorescence optical detection in sedimentation velocity analytical ultracentrifugation allows the study of macromolecules at nanomolar concentrations and below. This has significant promise, for example, for the study of systems of high-affinity protein interactions. Here we describe adaptations of the direct boundary modeling analysis approach implemented in the software SEDFIT that were developed to accommodate unique characteristics of the confocal fluorescence detection system. These include spatial gradients of signal intensity due to scanner movements out of the plane of rotation, temporal intensity drifts due to instability of the laser and fluorophores, and masking of the finite excitation and detection cone by the sample holder. In an extensive series of experiments with enhanced green fluorescent protein ranging from low nanomolar to low micromolar concentrations, we show that the experimental data provide sufficient information to determine the parameters required for first-order approximation of the impact of these effects on the recorded data. Systematic deviations of fluorescence optical sedimentation velocity data analyzed using conventional sedimentation models developed for absorbance and interference optics are largely removed after these adaptations, resulting in excellent fits that highlight the high precision of fluorescence sedimentation velocity data, thus allowing a more detailed quantitative interpretation of the signal boundaries that is otherwise not possible for this system.

  20. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    NASA Astrophysics Data System (ADS)

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-09-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  1. Simulated Interactive Research Experiments as Educational Tools for Advanced Science.

    PubMed

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M; Hopf, Martin; Arndt, Markus

    2015-09-15

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  2. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    PubMed Central

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  3. A graphical tool for an analytical approach of scattering photons by the Compton effect

    NASA Astrophysics Data System (ADS)

    Scannavino, Francisco A.; Cruvinel, Paulo E.

    2012-05-01

    The photons scattered by the Compton effect can be used to characterize the physical properties of a given sample, owing to the influence that the electron density exerts on the number of scattered photons. However, scattering measurements involve experimental and physical factors that must be carefully analyzed to predict uncertainty in the detection of Compton photons. This paper presents a method for the optimization of the geometrical parameters of an experimental arrangement for Compton scattering analysis, based on their relations with the energy and incident flux of the X-ray photons. In addition, the tool enables the statistical analysis of the information displayed and includes the coefficient of variation (CV) measurement for a comparative evaluation of the physical parameters of the model established for the simulation.
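
    A small sketch of the physical relations such a tool builds on: the Compton energy-shift formula and the (unnormalized) Klein-Nishina angular distribution. The 59.5 keV source energy is only an illustrative choice; no parameters are taken from the paper.

      import numpy as np

      ME_C2 = 511.0   # electron rest energy, keV

      def scattered_energy(e_kev, theta):
          """Photon energy after Compton scattering through angle theta (rad)."""
          return e_kev / (1.0 + (e_kev / ME_C2) * (1.0 - np.cos(theta)))

      def klein_nishina(e_kev, theta):
          """Relative differential cross section dsigma/dOmega (arbitrary units)."""
          r = scattered_energy(e_kev, theta) / e_kev
          return 0.5 * r**2 * (r + 1.0 / r - np.sin(theta)**2)

      for deg in (30, 60, 90, 120):
          th = np.radians(deg)
          print(f"theta={deg:3d} deg   E'={scattered_energy(59.5, th):6.2f} keV"
                f"   KN={klein_nishina(59.5, th):.3f}")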

  4. Ethics: the risk management tool in clinical research.

    PubMed

    Wadlund, Jill; Platt, Leslie A

    2002-01-01

    Scientific discovery and knowledge expansion in the post-genome era hold great promise for new medical technologies and cellular-based therapies with multiple applications that will save and enhance lives. Human beings have long hoped to unlock the mysteries of the molecular basis of life, and our society is now on the verge of doing so. But new scientific and technological breakthroughs often come with some risks attached. Research--especially clinical trials and research involving human participants--must be conducted in accordance with the highest ethical and scientific principles. Yet, as the number and complexity of clinical trials increase, so do pressures for new revenue sources and shorter product development cycles, which could have an adverse impact on patient safety. This article explores the use of risk management tools in clinical research.

  5. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation.

    PubMed

    Bergeron, Dominic; Tremblay, A-M S

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ^{2} with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
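
    For reference, the maximum-entropy objective the abstract refers to can be stated compactly; the notation (kernel K, default model D, noise sigma) is generic, not the paper's own symbols.

      \[
        Q[A] = \tfrac{1}{2}\,\chi^{2}[A] - \alpha\, S[A],
        \qquad
        \chi^{2}[A] = \sum_{n} \frac{\bigl(G_{n} - \sum_{m} K_{nm} A_{m}\bigr)^{2}}{\sigma_{n}^{2}},
      \]
      \[
        S[A] = \sum_{m} \Bigl( A_{m} - D_{m} - A_{m} \ln \frac{A_{m}}{D_{m}} \Bigr),
      \]
      % alpha is chosen at the crossover between the noise-fitting and
      % information-fitting regimes read off from chi^2(alpha), per point (2).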

  6. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation.

    PubMed

    Bergeron, Dominic; Tremblay, A-M S

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ^{2} with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software. PMID:27627408

  7. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    NASA Astrophysics Data System (ADS)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ² with respect to α, and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.

  8. Critical Race Theory and Interest Convergence as Analytic Tools in Teacher Education Policies and Practices

    ERIC Educational Resources Information Center

    Milner, H. Richard, IV

    2008-01-01

    In "The Report of the AERA Panel on Research and Teacher Education," Cochran-Smith and Zeichner's (2005) review of studies in the field of teacher education revealed that many studies lacked theoretical and conceptual grounding. The author argues that Derrick Bell's (1980) interest convergence, a principle of critical race theory, can be used as…

  9. Big Data & Learning Analytics: A Potential Way to Optimize eLearning Technological Tools

    ERIC Educational Resources Information Center

    García, Olga Arranz; Secades, Vidal Alonso

    2013-01-01

    In the information age, one of the most influential institutions is education. The recent emergence of MOOCS [Massively Open Online Courses] is a sample of the new expectations that are offered to university students. Basing decisions on data and evidence seems obvious, and indeed, research indicates that data-driven decision-making improves…

  10. Scientific research tools as an aid to Antarctic logistics

    NASA Astrophysics Data System (ADS)

    Dinn, Michael; Rose, Mike; Smith, Andrew; Fleming, Andrew; Garrod, Simon

    2013-04-01

    Logistics have always been a vital part of polar exploration and research. The more efficient those logistics can be made, the greater the likelihood that research programmes will be delivered on time, safely, and to maximum scientific effectiveness. Over the last decade, the potential for symbiosis between logistics and some of the scientific research methods themselves has increased remarkably; suites of scientific tools can help to optimise logistic efforts, thereby enhancing the effectiveness of further scientific activity. We present one recent example of input to logistics from scientific activities, in support of the NERC iSTAR Programme, a major ice sheet research effort in West Antarctica. We used data output from a number of research tools, spanning a range of techniques and international agencies, to support the deployment of a tractor-traverse system into a remote area of mainland Antarctica. The tractor system was deployed from RRS Ernest Shackleton onto the Abbot Ice Shelf, then driven inland to the research area on Pine Island Glacier. Data from NASA ICEBRIDGE were used to determine the ice-front freeboard and surface gradients for the traverse route off the ice shelf and onwards into the continent. Quickbird high-resolution satellite imagery provided clear images of the route track and some insight into snow-surface roughness. Polarview satellite data gave sea-ice information for the Amundsen Sea, both multi-annual historical characteristics and real-time information during deployment. Likewise, meteorological data contributed historical information and were used during deployment. Finally, during the tractors' inland journey, ground-based high-frequency radar was used to determine a safe, crevasse-free route.

  11. The Research Tools of the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Hanisch, Robert J.; Berriman, G. B.; Lazio, T. J.; Project, VAO

    2013-01-01

    Astronomy is being transformed by the vast quantities of data, models, and simulations that are becoming available to astronomers at an ever-accelerating rate. The U.S. Virtual Astronomical Observatory (VAO) has been funded to provide an operational facility that is intended to be a resource for discovery and access of data, and to provide science services that use these data. Over the course of the past year, the VAO has been developing and releasing for community use five science tools: 1) "Iris", for dynamically building and analyzing spectral energy distributions, 2) a web-based data discovery tool that allows astronomers to identify and retrieve catalog, image, and spectral data on sources of interest, 3) a scalable cross-comparison service that allows astronomers to conduct pair-wise positional matches between very large catalogs stored remotely as well as between remote and local catalogs, 4) time series tools that allow astronomers to compute periodograms of the public data held at the NASA Star and Exoplanet Database (NStED) and the Harvard Time Series Center, and 5) a VO-aware release of the Image Reduction and Analysis Facility (IRAF) that provides transparent access to VO-available data collections and is SAMP-enabled, so that IRAF users can easily use tools such as Aladin and Topcat in conjunction with IRAF tasks. Additional VAO services will be built to make it easy for researchers to provide access to their data in VO-compliant ways, to build VO-enabled custom applications in Python, and to respond generally to the growing size and complexity of astronomy data. Acknowledgements: The Virtual Astronomical Observatory (VAO) is managed by the VAO, LLC, a non-profit company established as a partnership of the Associated Universities, Inc. and the Association of Universities for Research in Astronomy, Inc. The VAO is sponsored by the National Science Foundation and the National Aeronautics and Space Administration.
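
    An in-memory sketch of the kind of pair-wise positional match the cross-comparison service performs, using astropy; the two synthetic catalogs and the 2-arcsecond match radius are illustrative, whereas the real service scales to large remote catalogs.

      import numpy as np
      import astropy.units as u
      from astropy.coordinates import SkyCoord

      rng = np.random.default_rng(1)
      cat1 = SkyCoord(ra=rng.uniform(0, 10, 1000) * u.deg,
                      dec=rng.uniform(-5, 5, 1000) * u.deg)
      cat2 = SkyCoord(ra=rng.uniform(0, 10, 1200) * u.deg,
                      dec=rng.uniform(-5, 5, 1200) * u.deg)

      idx, sep2d, _ = cat1.match_to_catalog_sky(cat2)  # nearest neighbour in cat2
      matched = sep2d < 2.0 * u.arcsec                 # keep close pairs only
      print(f"{matched.sum()} of {len(cat1)} sources matched within 2 arcsec")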

  12. Exploring positioning as an analytical tool for understanding becoming mathematics teachers' identities

    NASA Astrophysics Data System (ADS)

    Skog, Kicki; Andersson, Annica

    2015-03-01

    The aim of this article is to explore how a sociopolitical analysis can contribute to a deeper understanding of critical aspects for becoming primary mathematics teachers' identities during teacher education. The question we ask is the following: How may power relations in university settings affect becoming mathematics teachers' subject positioning? We elaborate on the elusive and interrelated concepts of identity, positioning and power, seen as dynamic and changeable. As these concepts represent three interconnected parts of research analysis in an on-going larger project, data from different sources will be used in this illustration. In this paper, we clarify the theoretical stance, ground the concepts historically and strive to connect them to research analysis. In this way, we show that power relations and subject positioning in social settings are critical aspects that need to be taken seriously into account if we aim to understand becoming teachers' identities.

  13. The Mosquito Online Advanced Analytic Service: a case study for school research projects in Thailand.

    PubMed

    Wongkoon, Siriwan; Jaroensutasinee, Mullica; Jaroensutasinee, Krisanadej

    2013-07-01

    The Mosquito Online Advanced Analytic Service (MOAAS) provides an essential tool for querying, analyzing, and visualizing patterns of mosquito larval distribution in Thailand. The MOAAS was developed using Structured Query Language (SQL) technology as a web-based tool for data entry and data access, webMathematica technology for data analysis and data visualization, and Google Earth and Google Maps for Geographic Information System (GIS) visualization. Fifteen selected schools in Thailand provided test data for MOAAS. Users performed data entry through the web service; data analysis and visualization with webMathematica; and visualization with bar charts, mosquito larval indices, and three-dimensional (3D) bar charts overlaid on Google Earth and Google Maps. The 3D bar charts of the number of mosquito larvae were displayed along with spatial information. The mosquito larvae information may be useful for dengue control efforts and for health service communities planning and carrying out operational activities.
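
    The larval indices mentioned above are simple survey ratios. As a brief illustration (the conventional Stegomyia index definitions, computed here in Python with invented survey numbers; MOAAS itself is SQL/webMathematica-based):

        # Standard mosquito larval (Stegomyia) indices reported by services
        # like MOAAS. Survey counts below are invented placeholders.
        def house_index(houses_positive, houses_inspected):
            return 100.0 * houses_positive / houses_inspected      # % of houses with larvae

        def container_index(containers_positive, containers_inspected):
            return 100.0 * containers_positive / containers_inspected

        def breteau_index(containers_positive, houses_inspected):
            return 100.0 * containers_positive / houses_inspected  # positive containers per 100 houses

        print(house_index(12, 150), container_index(20, 480), breteau_index(20, 150))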

  15. Designing and implementing full immersion simulation as a research tool.

    PubMed

    Munroe, Belinda; Buckley, Thomas; Curtis, Kate; Morris, Richard

    2016-05-01

    Simulation is a valuable research tool used to evaluate the clinical performance of devices, people and systems. The simulated setting may address concerns unique to complex clinical environments such as the Emergency Department, which make the conduct of research challenging. There is limited evidence available to inform the development of simulated clinical scenarios for the purpose of evaluating practice in research studies, with the majority of the literature focused on designing simulated clinical scenarios for education and training. Distinct differences exist in scenario design when scenarios are implemented in education compared with clinical research studies. Simulated scenarios used to assess practice in clinical research must not include any purposeful or planned teaching and must be developed with a high degree of validity and reliability. A new scenario design template was devised to develop two standardised simulated clinical scenarios for the evaluation of a new assessment framework for emergency nurses. The scenario development and validation processes undertaken are described and provide an evidence-informed guide to scenario development for future clinical research studies. PMID:26917415

  16. Analytical aerodynamic model of a high alpha research vehicle wind-tunnel model

    NASA Technical Reports Server (NTRS)

    Cao, Jichang; Garrett, Frederick, Jr.; Hoffman, Eric; Stalford, Harold

    1990-01-01

    A 6 DOF analytical aerodynamic model of a high alpha research vehicle is derived. The derivation is based on wind-tunnel model data valid in the altitude-Mach flight envelope centered at 15,000 ft altitude and 0.6 Mach number, with a Mach range between 0.3 and 0.9. The analytical models of the aerodynamic coefficients are nonlinear functions of alpha with all control variables and other states fixed. Interpolation is required between the parameterized nonlinear functions. The lift and pitching moment coefficients have unsteady flow parts due to the time rate of change of angle-of-attack (alpha dot). The analytical models are plotted and compared with their corresponding wind-tunnel data. Piloted simulated maneuvers of the wind-tunnel model are used to evaluate the analytical model. The maneuvers considered are pitch-ups, 360 degree loaded and unloaded rolls, turn reversals, split S's, and level turns. The evaluation finds that (1) the analytical model is a good representation at Mach 0.6, (2) the longitudinal part is good for the Mach range 0.3 to 0.9, and (3) the lateral part is good for Mach numbers between 0.6 and 0.9. The computer simulations show that the storage requirement of the analytical model is about one tenth that of the wind-tunnel model and it runs twice as fast.
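
    The abstract does not reproduce the fitted equations, but a lift coefficient with a quasi-steady part plus an unsteady alpha-dot contribution conventionally has the structure below (a sketch of the standard nondimensional form, not the paper's actual fit; \bar{c} is the mean aerodynamic chord, V the airspeed, \delta the control deflections):

        C_L(\alpha,\delta,M,\dot{\alpha}) \;=\; C_{L,\mathrm{static}}(\alpha,\delta,M) \;+\; C_{L_{\dot{\alpha}}}(\alpha)\,\frac{\dot{\alpha}\,\bar{c}}{2V}

    with an analogous \dot{\alpha} term added to the pitching moment coefficient C_m.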

  17. New Tools for New Research in Psychiatry: A Scalable and Customizable Platform to Empower Data Driven Smartphone Research

    PubMed Central

    Torous, John; Kiang, Mathew V; Lorme, Jeanette

    2016-01-01

    Background A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-quality data. Objective Our aim is not the creation of yet another app per se but rather the establishment of a platform to collect research-quality smartphone raw sensor and usage pattern data. Our ultimate goal is to develop statistical, mathematical, and computational methodology to enable us and others to extract biomedical and clinical insights from smartphone data. Methods We report on the development and early testing of Beiwe, a research platform featuring a study portal, smartphone app, database, and data modeling and analysis tools designed and developed specifically for transparent, customizable, and reproducible biomedical research use, in particular for the study of psychiatric and neurological disorders. We also outline a proposed study using the platform for patients with schizophrenia. Results We demonstrate the passive data capabilities of the Beiwe platform and early results of its analytical capabilities. Conclusions Smartphone sensors and phone usage patterns, when coupled with appropriate statistical learning tools, are able to capture various social and behavioral manifestations of illnesses, in naturalistic settings, as lived and experienced by patients. The ubiquity of smartphones makes this type of moment-by-moment quantification of disease phenotypes highly scalable and, when integrated within a transparent research platform, presents tremendous opportunities for research, discovery, and patient health. PMID:27150677

  18. Island Explorations: Discovering Effects of Environmental Research-Based Lab Activities on Analytical Chemistry Students

    ERIC Educational Resources Information Center

    Tomasik, Janice Hall; LeCaptain, Dale; Murphy, Sarah; Martin, Mary; Knight, Rachel M.; Harke, Maureen A.; Burke, Ryan; Beck, Kara; Acevedo-Polakovich, I. David

    2014-01-01

    Motivating students in analytical chemistry can be challenging, in part because of the complexity and breadth of topics involved. Some methods that help encourage students and convey real-world relevancy of the material include incorporating environmental issues, research-based lab experiments, and service learning projects. In this paper, we…

  19. The Effects of Incentives on Workplace Performance: A Meta-Analytic Review of Research Studies

    ERIC Educational Resources Information Center

    Condly, Steven J.; Clark, Richard E.; Stolovitch, Harold D.

    2003-01-01

    A meta-analytic review of all adequately designed field and laboratory research on the use of incentives to motivate performance is reported. Of approximately 600 studies, 45 qualified. The overall average effect of all incentive programs in all work settings and on all work tasks was a 22% gain in performance. Team-directed incentives had a…

  20. Integrative Mixed Methods Data Analytic Strategies in Research on School Success in Challenging Circumstances

    ERIC Educational Resources Information Center

    Jang, Eunice E.; McDougall, Douglas E.; Pollon, Dawn; Herbert, Monique; Russell, Pia

    2008-01-01

    There are both conceptual and practical challenges in dealing with data from mixed methods research studies. There is a need for discussion about various integrative strategies for mixed methods data analyses. This article illustrates integrative analytic strategies for a mixed methods study focusing on improving urban schools facing challenging…

  1. Instruments Used in Doctoral Dissertations in Educational Sciences in Turkey: Quality of Research and Analytical Errors

    ERIC Educational Resources Information Center

    Karadag, Engin

    2011-01-01

    The aim of this study was to define the level of quality and types of analytical errors for measurement instruments used [i.e., interview forms, achievement tests and scales] in doctoral dissertations produced in educational sciences in Turkey. The study was designed to determine the levels of factors concerning quality in research methods and the…

  2. Vaccinia Virus: A Tool for Research and Vaccine Development

    NASA Astrophysics Data System (ADS)

    Moss, Bernard

    1991-06-01

    Vaccinia virus is no longer needed for smallpox immunization, but now serves as a useful vector for expressing genes within the cytoplasm of eukaryotic cells. As a research tool, recombinant vaccinia viruses are used to synthesize biologically active proteins and analyze structure-function relations, determine the targets of humoral- and cell-mediated immunity, and investigate the immune responses needed for protection against specific infectious diseases. When more data on safety and efficacy are available, recombinant vaccinia and related poxviruses may be candidates for live vaccines and for cancer immunotherapy.

  3. Tissue fluid pressures - From basic research tools to clinical applications

    NASA Technical Reports Server (NTRS)

    Hargens, Alan R.; Akeson, Wayne H.; Mubarak, Scott J.; Owen, Charles A.; Gershuni, David H.

    1989-01-01

    This paper describes clinical applications of two basic research tools developed and refined in the past 20 years: the wick catheter (for measuring tissue fluid pressure) and the colloid osmometer (for measuring osmotic pressure). Applications of the osmometer include estimations of the reduced osmotic pressure of sickle-cell hemoglobin with deoxygenation, and of reduced swelling pressure of human nucleus pulposus with hydration or upon action of certain enzymes. Clinical uses of the wick-catheter technique include an improvement of diagnosis and treatment of acute and chronic compartment syndromes, the elucidation of the tissue pressure thresholds for neuromuscular dysfunction, and the development of a better tourniquet for orthopedics.

  4. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay.

    PubMed

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated using Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). The assay is expected to be of practical applicability because of the relevance of the results.
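
    As a hedged sketch of the RGB-channel analysis step (the paper's own image-processing pipeline is not reproduced here; the file name and crop box are placeholders), mean channel intensities of a photographed strip can be extracted as follows:

        # Average R, G and B values over a region of interest in a
        # photographed test strip. "strip.jpg" and the crop box are
        # placeholders, not values from the paper.
        import numpy as np
        from PIL import Image

        img = Image.open("strip.jpg").convert("RGB")
        roi = np.asarray(img.crop((100, 100, 200, 200)), dtype=float)  # (left, upper, right, lower)

        r_mean, g_mean, b_mean = roi.reshape(-1, 3).mean(axis=0)
        print(f"R={r_mean:.1f} G={g_mean:.1f} B={b_mean:.1f}")
        # Increasing indigo coloration shifts these channel values, which can
        # then be read against a calibration curve of known BChE activities.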

  5. The management and exploitation of naturally light-emitting bacteria as a flexible analytical tool: A tutorial.

    PubMed

    Bolelli, L; Ferri, E N; Girotti, S

    2016-08-31

    Conventional detection of toxic contaminants on surfaces, in food, and in the environment takes time. Current analytical approaches to chemical detection can be of limited utility due to long detection times, high costs, and the need for a laboratory and trained personnel. A non-specific but easy, rapid, and inexpensive screening test can be useful to quickly classify a specimen as toxic or non-toxic, so that appropriate measures can be taken promptly, exactly where required. The bioluminescent bacteria-based tests meet all these characteristics. Bioluminescence methods are extremely attractive because of their high sensitivity, speed, ease of implementation, and statistical significance. They are usually sensitive enough to detect the majority of pollutants toxic to humans and mammals. This tutorial provides practical guidelines for isolating, cultivating, and exploiting marine bioluminescent bacteria as a simple and versatile analytical tool. Although mostly applied to aqueous-phase samples and organic extracts, the test can also be conducted directly on soil and sediment samples so as to reflect the true toxicity due to the bioavailable fraction. Because tests can be performed with freeze-dried cell preparations, they could make a major contribution to field screening activity. They can be easily conducted in a mobile environmental laboratory and may be adaptable to miniaturized field instruments and field test kits. PMID:27506340
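
    The core readout of such tests is luminescence inhibition relative to an untreated control. A minimal sketch of that calculation (intensities in arbitrary luminometer units, numbers invented; the tutorial's full protocol, including corrections for control drift over the incubation time, is not reproduced):

        # Basic luminescence-inhibition readout for bioluminescent-bacteria
        # toxicity screening. Input intensities are invented placeholders.
        def percent_inhibition(sample_intensity, control_intensity):
            return 100.0 * (1.0 - sample_intensity / control_intensity)

        print(percent_inhibition(sample_intensity=4200.0, control_intensity=10000.0))  # 58.0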

  7. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool at the analytical laboratory since they allow sensitive measurements of pollutants to be carried out by means of easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or even none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful both for inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need of using ICP-MS, since this instrument can be replaced by a simple AAS spectrometer which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology, and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  8. Analytic element ground water modeling as a research program (1980 to 2006).

    PubMed

    Kraemer, Stephen R

    2007-01-01

    Scientists and engineers who use the analytic element method (AEM) for solving problems of regional ground water flow may be considered a community, and this community can be studied from the perspective of history and philosophy of science. Applying the methods of the Hungarian philosopher of science Imre Lakatos (1922 to 1974), the AEM "research program" is distinguished by its hard core (theoretical basis), protective belt (auxiliary assumptions), and heuristic (problem solving machinery). AEM has emerged relatively recently in the scientific literature and has a relatively modest number of developers and practitioners compared to the more established finite-element and finite-difference methods. Nonetheless, there is evidence to support the assertion that the AEM research program remains in a progressive phase. The evidence includes an expanding publication record, a growing research strand following Professor Otto Strack's book Groundwater Mechanics (1989), the continued placement of AEM researchers in academia, and the further development of innovative analytical solutions and computational solvers/models.

  9. Development of an analytical tool to study power quality of AC power systems for large spacecraft

    NASA Technical Reports Server (NTRS)

    Kraft, L. Alan; Kankam, M. David

    1991-01-01

    A harmonic power flow program applicable to space power systems with sources of harmonic distortion is described. The algorithm is a modification of the Electric Power Research Institute's HARMFLO program, which assumes a three-phase, balanced AC system with loads of harmonic distortion. The modified power flow program can be used with single-phase AC systems. Early results indicate that the required modifications and the models developed are quite adequate for the analysis of a 20 kHz testbed built by General Dynamics Corporation. This is demonstrated by the acceptable correlation of present results with published data. Although the results are not exact, the discrepancies are relatively small.
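
    As an illustrative aside (not part of HARMFLO), the headline power-quality figure such a study reports, total harmonic distortion, is computed from the harmonic magnitudes as sketched below; the voltage values are invented:

        # Total harmonic distortion (THD) of a voltage waveform from the RMS
        # magnitudes of its harmonics. Values below are invented.
        import math

        def thd(harmonic_rms):
            """harmonic_rms[0] is the fundamental; the rest are harmonics 2, 3, ..."""
            fundamental, *harmonics = harmonic_rms
            return math.sqrt(sum(v * v for v in harmonics)) / fundamental

        print(f"THD = {100 * thd([120.0, 3.1, 2.2, 0.9]):.2f} %")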

  11. Visualising the past: potential applications of Geospatial tools to paleoclimate research

    NASA Astrophysics Data System (ADS)

    Cook, A.; Turney, C. S.

    2012-12-01

    Recent advances in geospatial data acquisition, analysis and web-based data sharing offer new possibilities for understanding and visualising past modes of change. The availability, accessibility and cost-effectiveness of data is better than ever. Researchers can access remotely sensed data including terrain models; use secondary data from large consolidated repositories; make more accurate field measurements and combine data from disparate sources to form a single asset. An increase in the quantity and consistency of data is coupled with subtle yet significant improvements to the way in which geospatial systems manage data interoperability, topological and textual integrity, resulting in more stable analytical and modelling environments. Essentially, researchers now have greater control and more confidence in analytical tools and outputs. Web-based data sharing is growing rapidly, enabling researchers to publish and consume data directly into their spatial systems through OGC-compliant Web Map Services (WMS), Web Feature Services (WFS) and Web Coverage Services (WCS). This has been implemented at institutional, organisational and project scale around the globe. Some institutions have gone one step further and established Spatial Data Infrastructures (SDI) based on Federated Data Structures where the participating data owners retain control over who has access to what. It is important that advances in knowledge are transferred to audiences outside the scientific community in a way that is interesting and meaningful. The visualisation of paleodata through multi-media offers significant opportunities to highlight the parallels and distinctions between past climate dynamics and the challenges of today and tomorrow. Here we present an assessment of key innovations that demonstrate how Geospatial tools can be applied to palaeo-research and used to communicate the results to a diverse array of audiences in the digital age.
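
    As a concrete example of consuming the OGC web services mentioned (WMS in this case), the OWSLib Python package provides a thin client; the endpoint URL and layer name below are placeholders, not a real service:

        # Fetch a rendered map from an OGC-compliant Web Map Service (WMS)
        # with OWSLib. The URL and layer name are placeholders; point them
        # at a real endpoint to run this.
        from owslib.wms import WebMapService

        wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
        print(list(wms.contents))  # layer names advertised by the service

        img = wms.getmap(layers=["paleo:terrain_model"], styles=[""],
                         srs="EPSG:4326", bbox=(110.0, -45.0, 155.0, -10.0),
                         size=(600, 400), format="image/png", transparent=True)
        with open("terrain.png", "wb") as f:
            f.write(img.read())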

  12. Basic statistical tools in research and data analysis

    PubMed Central

    Ali, Zulfiqar; Bhaskar, S Bala

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of sample size estimation, power analysis and statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis. PMID:27729694
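
    To make the parametric/non-parametric distinction concrete, here is a minimal sketch (not from the article) comparing two invented groups with an independent-samples t-test and its rank-based counterpart, the Mann-Whitney U test, using scipy:

        # Parametric vs. non-parametric two-group comparison on invented data.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        group_a = rng.normal(5.0, 1.0, 30)   # e.g., a control group
        group_b = rng.normal(5.6, 1.0, 30)   # e.g., a treatment group

        t_stat, t_p = stats.ttest_ind(group_a, group_b)      # assumes normality
        u_stat, u_p = stats.mannwhitneyu(group_a, group_b)   # rank-based alternative
        print(f"t-test p = {t_p:.4f}; Mann-Whitney p = {u_p:.4f}")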

  13. Femtosecond pulse shaping as analytic tool in mass spectrometry of complex polyatomic systems

    NASA Astrophysics Data System (ADS)

    Laarmann, Tim; Shchatsinin, Ihar; Singh, Pushkar; Zhavoronkov, Nickolai; Schulz, Claus Peter; Hertel, Ingolf Volker

    2008-04-01

    An additional dimension to mass spectrometric studies on building blocks of proteins is discussed in this paper. The present approach is based on tailored femtosecond laser pulses, using the concept of strong-field pulse shaping in an adaptive feedback loop. We show that control strategies making use of coherent properties of the electromagnetic wave allow one to break pre-selected backbone bonds in amino acid complexes that may be regarded as peptide model systems. Studies on different chromophores, such as phenylalanine and alanine, while keeping the backbone structure unchanged, elucidate the effect of the excitation dynamics on the relaxation pathways. The observation of protonated species in the corresponding mass spectra indicates that optimal control of ultrafast laser pulses may even be useful to study intramolecular reactions such as hydrogen- or proton-transfer in particular cases. This opens new perspectives for biophysical and biochemical research, since these photochemical reactions are suggested to explain, e.g., the photostability of DNA.

  14. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    PubMed

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique, involving simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed on previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples such as fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of the Middle Pomerania region in northern Poland can easily be observed on the obtained micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that the micro-TLC-based analytical approach can be applied as an effective method for internal standard (IS) substance searches. Generally, the described methodology can be applied for fast fractionation or screening of the

  16. ARM Climate Research Facility: Outreach Tools and Strategies

    NASA Astrophysics Data System (ADS)

    Roeder, L.; Jundt, R.

    2009-12-01

    Sponsored by the Department of Energy, the ARM Climate Research Facility is a global scientific user facility for the study of climate change. To publicize progress and achievements and to reach new users, the ACRF uses a variety of Web 2.0 tools and strategies that build on the program’s comprehensive and well-established News Center (www.arm.gov/news). These strategies include: an RSS subscription service for specific news categories; an email “newsletter” distribution to the user community that compiles the latest News Center updates into a short summary with links; and a Facebook page that pulls information from the News Center and links to relevant information in other online venues, including those of our collaborators. The ACRF also interacts with users through field campaign blogs, like Discovery Channel’s EarthLive, to share research experiences from the field. Increasingly, field campaign wikis are established to help ACRF researchers collaborate during the planning and implementation phases of their field studies; these include easy-to-use logs and image libraries to help record the campaigns. This vital reference information is used in developing outreach material that is shared in highlights, news, and Facebook. Other Web 2.0 tools that ACRF uses include Google Maps to help users visualize facility locations and aircraft flight patterns. Easy-to-use comment boxes are also available on many of the data-related web pages on www.arm.gov to encourage feedback. To provide additional opportunities for increased interaction with the public and user community, future Web 2.0 plans under consideration for ACRF include: evaluating field campaigns for Twitter and microblogging opportunities, adding public discussion forums to research highlight web pages, moving existing photos into albums on Flickr or Facebook, and building online video archives through YouTube.

  17. Analytical tools for planning cost-effective surveillance in Gambiense sleeping sickness.

    PubMed

    Shaw, A P; Cattand, P

    2001-01-01

    The re-emergence of sleeping sickness as a major health problem in parts of Africa, combined with the new sources of financial support and provision of drugs means that an investigation of the cost-effectiveness of the different approaches is timely. There has been very little work done on the economics of controlling either form of sleeping sickness. This paper builds on work done for WHO by the authors on developing a framework for analysing the cost-effectiveness of different methods for surveillance in gambiense sleeping sickness. The framework has been used to build a spreadsheet which makes it possible to simulate the effects of controlling the disease at different prevalences, for example using mobile teams or various forms of fixed post surveillance and screening different proportions of the population in a year. Prices, control strategies, prevalence, sensitivity and specificity of tests are all variables which can be altered to suit different situations or investigate how different approaches perform. As new research is beginning to produce calculations of the burden of sleeping sickness, in terms of disability-adjusted life years (DALY) potentially averted by controlling the disease, it is possible to combine these DALY estimates with the analyses of cost-effectiveness undertaken in these exercises to look at the cost-utility of the work, both to compare different approaches and demonstrate that controlling sleeping sickness represents good value for money as an investment in health.
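
    The spreadsheet framework described reduces to simple expected-value arithmetic over screening parameters. A hedged sketch of that logic (all population figures, prices and test characteristics below are invented placeholders, not the WHO framework's values):

        # Expected-value skeleton of a screening cost-effectiveness model:
        # cost per true case detected as a function of prevalence, test
        # sensitivity/specificity and unit costs. All numbers are invented.
        def cost_per_case_detected(population, coverage, prevalence,
                                   sensitivity, specificity,
                                   cost_screen, cost_confirm):
            screened = population * coverage
            true_pos = screened * prevalence * sensitivity
            false_pos = screened * (1 - prevalence) * (1 - specificity)
            total_cost = screened * cost_screen + (true_pos + false_pos) * cost_confirm
            return total_cost / true_pos

        print(cost_per_case_detected(population=50_000, coverage=0.8,
                                     prevalence=0.02, sensitivity=0.9,
                                     specificity=0.97, cost_screen=0.5,
                                     cost_confirm=5.0))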

  18. Game analytics for game user research, part 1: a workshop review and case study.

    PubMed

    El-Nasr, Magy Seif; Desurvire, Heather; Aghabeigi, Bardia; Drachen, Anders

    2013-01-01

    The emerging field of game user research (GUR) investigates interaction between players and games and the surrounding context of play. Game user researchers have explored methods from, for example, human-computer interaction, psychology, interaction design, media studies, and the social sciences. They've extended and modified these methods for different types of digital games, such as social games, casual games, and serious games. This article focuses on quantitative analytics of in-game behavioral user data and its emergent use by the GUR community. The article outlines open problems emerging from several GUR workshops. In addition, a case study of a current collaboration between researchers and a game company demonstrates game analytics' use and benefits.

  19. High-resolution entrainment mapping of gastric pacing: a new analytical tool.

    PubMed

    O'Grady, Gregory; Du, Peng; Lammers, Wim J E P; Egbuji, John U; Mithraratne, Pulasthi; Chen, Jiande D Z; Cheng, Leo K; Windsor, John A; Pullan, Andrew J

    2010-02-01

    Gastric pacing has been investigated as a potential treatment for gastroparesis. New pacing protocols are required to improve symptom and motility outcomes; however, research progress has been constrained by a limited understanding of the effects of electrical stimulation on slow-wave activity. This study introduces high-resolution (HR) "entrainment mapping" for the analysis of gastric pacing and presents four demonstrations. Gastric pacing was initiated in a porcine model (typical amplitude 4 mA, pulse width 400 ms, period 17 s). Entrainment mapping was performed using flexible multielectrode arrays (

  20. Modeling as a research tool in poultry science.

    PubMed

    Gous, R M

    2014-01-01

    The World's Poultry Science Association (WPSA) is a long-established and unique organization that strives to advance knowledge and understanding of all aspects of poultry science and the poultry industry. Its 3 main aims are education, organization, and research. The WPSA Keynote Lecture, titled "Modeling as a research tool in poultry science," addresses 2 of these aims, namely, the value of modeling in research and education. The role of scientists is to put forward and then to test theories. These theories, or models, may be simple or highly complex, but they are aimed at improving our understanding of a system or the interaction between systems. In developing a model, the scientist must take into account existing knowledge, and in this process gaps in our knowledge of a system are identified. Useful ideas for research are generated in this way, and experiments may be designed specifically to address these issues. The resultant models become more accurate and more useful, and can be used in education and extension as a means of explaining many of the complex issues that arise in poultry science.

  1. NASA Human Research Wiki - An Online Collaboration Tool

    NASA Technical Reports Server (NTRS)

    Barr, Y. R.; Rasbury, J.; Johnson, J.; Barsten, K.; Saile, L.; Watkins, S. D.

    2011-01-01

    In preparation for exploration-class missions, the Exploration Medical Capability (ExMC) element of NASA's Human Research Program (HRP) has compiled a large evidence base, which previously was available only to persons within the NASA community. The evidence base is comprised of several types of data, for example: information on more than 80 medical conditions which could occur during space flight, derived from several sources (including data on incidence and potential outcomes of these medical conditions, as captured in the Integrated Medical Model's Clinical Finding Forms). In addition, approximately 35 gap reports are included in the evidence base, identifying current understanding of the medical challenges for exploration, as well as any gaps in knowledge and/or technology that would need to be addressed in order to provide adequate medical support for these novel missions. In an effort to make the ExMC information available to the general public and increase collaboration with subject matter experts within and outside of NASA, ExMC has developed an online collaboration tool, very similar to a wiki, titled the NASA Human Research Wiki. The platform chosen for this data sharing, and the potential collaboration it could generate, is a MediaWiki-based application that would house the evidence, allow "read only" access to all visitors to the website, and editorial access to credentialed subject matter experts who have been approved by the Wiki's editorial board. Although traditional wikis allow users to edit information in real time, the NASA Human Research Wiki includes a peer review process to ensure quality and validity of information. The wiki is also intended to be a pathfinder project for other HRP elements that may want to use this type of web-based tool. The wiki website will be released with a subset of the data described and will continue to be populated throughout the year.

  2. Astonishing advances in mouse genetic tools for biomedical research.

    PubMed

    Kaczmarczyk, Lech; Jackson, Walker S

    2015-01-01

    The humble house mouse has long been a workhorse model system in biomedical research. The technology for introducing site-specific genome modifications led to Nobel Prizes for its pioneers and opened a new era of mouse genetics. However, this technology was very time-consuming and technically demanding. As a result, many investigators continued to employ easier genome manipulation methods, though resulting models can suffer from overlooked or underestimated consequences. Another breakthrough, invaluable for the molecular dissection of disease mechanisms, was the invention of high-throughput methods to measure the expression of a plethora of genes in parallel. However, the use of samples containing material from multiple cell types could obfuscate data, and thus interpretations. In this review we highlight some important issues in experimental approaches using mouse models for biomedical research. We then discuss recent technological advances in mouse genetics that are revolutionising human disease research. Mouse genomes are now easily manipulated at precise locations thanks to guided endonucleases, such as transcription activator-like effector nucleases (TALENs) or the CRISPR/Cas9 system, both also having the potential to turn the dream of human gene therapy into reality. Newly developed methods of cell type-specific isolation of transcriptomes from crude tissue homogenates, followed by detection with next generation sequencing (NGS), are vastly improving gene regulation studies. Taken together, these amazing tools simplify the creation of much more accurate mouse models of human disease, and enable the extraction of hitherto unobtainable data. PMID:26513700

  3. Astonishing advances in mouse genetic tools for biomedical research.

    PubMed

    Kaczmarczyk, Lech; Jackson, Walker S

    2015-01-01

    The humble house mouse has long been a workhorse model system in biomedical research. The technology for introducing site-specific genome modifications led to Nobel Prizes for its pioneers and opened a new era of mouse genetics. However, this technology was very time-consuming and technically demanding. As a result, many investigators continued to employ easier genome manipulation methods, though resulting models can suffer from overlooked or underestimated consequences. Another breakthrough, invaluable for the molecular dissection of disease mechanisms, was the invention of high-throughput methods to measure the expression of a plethora of genes in parallel. However, the use of samples containing material from multiple cell types could obfuscate data, and thus interpretations. In this review we highlight some important issues in experimental approaches using mouse models for biomedical research. We then discuss recent technological advances in mouse genetics that are revolutionising human disease research. Mouse genomes are now easily manipulated at precise locations thanks to guided endonucleases, such as transcription activator-like effector nucleases (TALENs) or the CRISPR/Cas9 system, both also having the potential to turn the dream of human gene therapy into reality. Newly developed methods of cell type-specific isolation of transcriptomes from crude tissue homogenates, followed by detection with next generation sequencing (NGS), are vastly improving gene regulation studies. Taken together, these amazing tools simplify the creation of much more accurate mouse models of human disease, and enable the extraction of hitherto unobtainable data.

  4. Conservation of Mass: An Important Tool in Renal Research.

    PubMed

    Sargent, John A

    2016-05-01

    The dialytic treatment of end-stage renal disease (ESRD) patients is based on control of solute concentrations and management of fluid volume. The application of the principle of conservation of mass, or mass balance, is fundamental to the study of such treatment and can be extended to chronic kidney disease (CKD) in general. This review discusses the development and use of mass conservation and transport concepts, incorporated into mathematical models. These concepts, which can be applied to a wide range of solutes of interest, represent a powerful tool for quantitatively guided studies of dialysis issues now and into the future. Incorporating these quantitative concepts in future investigations is key to achieving positive control of known solutes and to analyzing such studies; to relating future research to the known results of prior studies; and to helping in the understanding of the obligatory physiological perturbations that result from dialysis therapy.
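
    Written out, the single-pool form of that mass balance (a textbook sketch, not the review's full variable-volume model) for a solute with generation rate G, dialyzer clearance K and distribution volume V is

        V \frac{dC}{dt} = G - K C,
        \qquad
        C(t) = \frac{G}{K} + \left( C_0 - \frac{G}{K} \right) e^{-K t / V},

    so the concentration decays exponentially toward the steady state G/K from its predialysis value C_0.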

  5. THE LASER AS A POTENTIAL TOOL FOR CELL RESEARCH

    PubMed Central

    Rounds, Donald E.; Olson, Robert S.; Johnson, Fred M.

    1965-01-01

    Freshly prepared hemoglobin solutions were successively irradiated up to five times with 1 MW of green (530 mµ) laser power. Oxygenated hemoglobin showed no detectable change, but the spectral absorption of reduced hemoglobin showed a shift toward the characteristic curve for the oxygenated form. Intact human erythrocytes exposed to a power density of 110 MW/cm2 of green laser radiation showed no appreciable change in diameter or mass, but they became transparent to a wavelength range from 400 to 600 mµ. A similar power density from a ruby laser failed to produce this bleaching effect. This response in the erythrocyte demonstrates a principle which suggests the laser as a tool for cell research: specific molecular components within a cell may be selectively altered by laser irradiation when an appropriate wavelength and a suitable power density are applied. PMID:5857254

  6. Spacelab - New tool for research and investigations in space

    NASA Technical Reports Server (NTRS)

    Gibson, R.; Lord, D. R.

    1976-01-01

    The Orbiter/Spacelab system is regarded as an entity that provides an orbital platform for experiments in a wide variety of scientific and technological fields. The paper presents a brief summary of the Spacelab system and its operation as an introduction to a new tool to be available to an international community of users through the 1980s. Investigations have shown that all the more conventional fields of space research and applications can benefit from the use of Spacelab, such as astronomy, magnetospheric physics, remote sensing, communications, and many others. It is expected that new areas that would gain from using Spacelab will be recognized as its operational life matures. Numerous sketches and photographs supplement the text.

  7. Review and evaluation of electronic health records-driven phenotype algorithm authoring tools for clinical and translational research

    PubMed Central

    Rasmussen, Luke V; Shaw, Pamela L; Jiang, Guoqian; Kiefer, Richard C; Mo, Huan; Pacheco, Jennifer A; Speltz, Peter; Zhu, Qian; Denny, Joshua C; Pathak, Jyotishman; Thompson, William K; Montague, Enid

    2015-01-01

    Objective To review and evaluate available software tools for electronic health record–driven phenotype authoring in order to identify gaps and needs for future development. Materials and Methods Candidate phenotype authoring tools were identified through (1) a literature search in four publication databases (PubMed, Embase, Web of Science, and Scopus) and (2) a web search. A collection of tools was compiled and reviewed after the searches. A survey was designed and distributed to the developers of the reviewed tools to discover their functionalities and features. Results Twenty-four different phenotype authoring tools were identified and reviewed. Developers of 16 of these identified tools completed the evaluation survey (67% response rate). The surveyed tools showed commonalities but also varied in their capabilities in algorithm representation, logic functions, data support and software extensibility, search functions, user interface, and data outputs. Discussion Positive trends identified in the evaluation included: algorithms can be represented in both computable and human-readable formats; and most tools offer a web interface for easy access. However, issues were also identified: many tools lacked advanced logic functions for authoring complex algorithms; the ability to construct queries that leveraged unstructured data was not widely implemented; and many tools had limited support for plug-ins or external analytic software. Conclusions Existing phenotype authoring tools could enable clinical researchers to work with electronic health record data more efficiently, but gaps still exist in terms of the functionalities of such tools. The present work can serve as a reference point for the future development of similar tools. PMID:26224336

  8. Alerting strategies in computerized physician order entry: a novel use of a dashboard-style analytics tool in a children's hospital.

    PubMed

    Reynolds, George; Boyer, Dean; Mackey, Kevin; Povondra, Lynne; Cummings, Allana

    2008-01-01

    Utilizing a commercially available business analytics tool offering dashboard-style graphical indicators and a data warehouse strategy, we have developed an interactive, web-based platform that allows near-real-time analysis of CPOE adoption by hospital area and practitioner specialty. Clinical Decision Support (CDS) metrics include the percentage of alerts that result in a change in clinician decision-making. This tool facilitates adjustments in alert limits in order to reduce alert fatigue.
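
    The CDS metric described is a simple proportion over alert events. A sketch of how it might be computed from an alert log (the column names and values are hypothetical, not the vendor tool's schema):

        # Share of alerts followed by a change in the clinician's order,
        # broken out by hospital area and practitioner specialty.
        # The DataFrame schema here is hypothetical.
        import pandas as pd

        alerts = pd.DataFrame({
            "area":          ["PICU", "PICU", "NICU", "NICU", "NICU"],
            "specialty":     ["peds", "peds", "neo",  "neo",  "neo"],
            "order_changed": [True,   False,  True,   True,   False],
        })

        pct_changed = (alerts.groupby(["area", "specialty"])["order_changed"]
                             .mean() * 100).round(1)
        print(pct_changed)  # persistently low rates would flag alerts for limit tuning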

  9. The GATO gene annotation tool for research laboratories.

    PubMed

    Fujita, A; Massirer, K B; Durham, A M; Ferreira, C E; Sogayar, M C

    2005-11-01

    Large-scale genome projects have generated a rapidly increasing number of DNA sequences. Therefore, development of computational methods to rapidly analyze these sequences is essential for progress in genomic research. Here we present an automatic annotation system for preliminary analysis of DNA sequences. The gene annotation tool (GATO) is a bioinformatics pipeline designed to facilitate routine functional annotation and easy access to annotated genes. It was designed in view of the frequent need of genomic researchers to access data pertaining to a common set of genes. In the GATO system, annotation is generated by querying some of the web-accessible resources and the information is stored in a local database, which keeps a record of all previous annotation results. GATO may be accessed from anywhere over the internet, or may be run locally if a large number of sequences are to be annotated. It is implemented in PHP and Perl and may be run on any suitable web server. Usually, installation and application of annotation systems require experience and are time consuming, but GATO is simple and practical, allowing anyone with basic skills in informatics to use it without any special training. GATO can be downloaded at [http://mariwork.iq.usp.br/gato/]. The minimum free disk space required is 2 MB. PMID:16258624

  10. Soft x-ray microscopy - a powerful analytical tool to image magnetism down to fundamental length and times scales

    SciTech Connect

    Fischer, Peter

    2008-08-01

    The magnetic properties of low-dimensional solid-state matter are of the utmost interest, both scientifically and technologically. In addition to the charge of the electron, which is the basis of current electronics, taking the spin degree of freedom into account opens a new avenue for future spintronics applications. Progress towards a better physical understanding of the mechanisms and principles involved, as well as potential applications of nanomagnetic devices, can only be achieved with advanced analytical tools. Soft X-ray microscopy, providing a spatial resolution approaching 10 nm, a time resolution currently in the sub-ns regime, and inherent elemental sensitivity, is a very promising technique for this. This article reviews recent achievements of magnetic soft X-ray microscopy through selected examples of spin torque phenomena, stochastic behavior on the nanoscale, and spin dynamics in magnetic nanopatterns. The future potential with regard to addressing fundamental magnetic length and time scales, e.g. imaging femtosecond spin dynamics at upcoming X-ray sources, is pointed out.

  11. VISAD: an interactive and visual analytical tool for the detection of behavioral anomalies in maritime traffic data

    NASA Astrophysics Data System (ADS)

    Riveiro, Maria; Falkman, Göran; Ziemke, Tom; Warston, Håkan

    2009-05-01

    Monitoring the surveillance of large sea areas normally involves the analysis of huge quantities of heterogeneous data from multiple sources (radars, cameras, automatic identification systems, reports, etc.). The rapid identification of anomalous behavior or any threat activity in the data is an important objective for enabling homeland security. While it is worth acknowledging that many existing mining applications support identification of anomalous behavior, autonomous anomaly detection systems are rarely used in the real world. There are two main reasons: (1) the detection of anomalous behavior is normally not a well-defined and structured problem and therefore, automatic data mining approaches do not work well and (2) the difficulties that these systems have regarding the representation and employment of the prior knowledge that the users bring to their tasks. In order to overcome these limitations, we believe that human involvement in the entire discovery process is crucial. Using a visual analytics process model as a framework, we present VISAD: an interactive, visual knowledge discovery tool for supporting the detection and identification of anomalous behavior in maritime traffic data. VISAD supports the insertion of human expert knowledge in (1) the preparation of the system, (2) the establishment of the normal picture and (3) in the actual detection of rare events. For each of these three modules, VISAD implements different layers of data mining, visualization and interaction techniques. Thus, the detection procedure becomes transparent to the user, which increases his/her confidence and trust in the system and overall, in the whole discovery process.
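
    The abstract does not disclose VISAD's detection models; as a toy stand-in for the "normal picture, then flag rare events" pattern it describes, a simple z-score screen on vessel speed looks like this (baseline values invented):

        # Toy illustration of baseline-then-flag anomaly detection on vessel
        # speed. This is a plain z-score screen, not VISAD's actual method.
        import numpy as np

        baseline_knots = np.array([11.8, 12.1, 12.5, 11.9, 12.3, 12.0])  # normal traffic
        mu, sigma = baseline_knots.mean(), baseline_knots.std(ddof=1)

        def is_anomalous(speed, threshold=3.0):
            return abs(speed - mu) / sigma > threshold

        for s in (12.2, 2.5, 25.0):   # loitering and racing tracks should flag
            print(s, "anomalous" if is_anomalous(s) else "normal")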

  12. The relevance of attachment research to psychoanalysis and analytic social psychology.

    PubMed

    Bacciagaluppi, M

    1994-01-01

    The extensive empirical research generated by attachment theory is briefly reviewed, with special reference to transgenerational transmission of attachment patterns, internal working models, cross-cultural, and longitudinal studies. It is claimed that attachment theory and research support the alternative psychoanalytic approach initiated by Ferenczi, especially as regards the re-evaluation of real-life traumatic events, the occurrence of personality splits after childhood trauma, and the aggravation of trauma due to its denial by adults. The concepts of transgenerational transmission and of alternative developmental pathways are further contributions to an alternative psychoanalytic framework. Finally, attention is called to the relevance of the cross-cultural studies to Fromm's analytic social psychology.

  13. Research of the Urban-Rural Integration Evaluation Indicator System Based on Analytic Hierarchy Process

    NASA Astrophysics Data System (ADS)

    Zhe, Wang

    Scientifically evaluating the development level of urban-rural integration is a key problem that needs to be solved in urban-rural integration research. Based on an analysis of the many factors influencing urban-rural integration, this article conducts an empirical study of the evaluation indicator system, applying the analytic hierarchy process (AHP). By structuring the judgment matrix and conducting a consistency test, both the eigenvectors corresponding to the judgment matrix and the specific index weights can be obtained.
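
    The eigenvector weighting and consistency test named above can be sketched directly; the 3x3 pairwise judgment matrix below is invented, while the random-index values are Saaty's standard table:

        # AHP: indicator weights from the principal eigenvector of a pairwise
        # judgment matrix, plus Saaty's consistency test. Matrix is invented.
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])          # pairwise comparisons of 3 indicators

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                              # normalized indicator weights

        n = A.shape[0]
        CI = (eigvals.real[k] - n) / (n - 1)      # consistency index
        RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]   # Saaty's random index
        print("weights:", w.round(3), " CR =", round(CI / RI, 3))  # CR < 0.1 is acceptable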

  14. Enabling laboratory EUV research with a compact exposure tool

    NASA Astrophysics Data System (ADS)

    Brose, Sascha; Danylyuk, Serhiy; Tempeler, Jenny; Kim, Hyun-su; Loosen, Peter; Juschkin, Larissa

    2016-03-01

    In this work we present the capabilities of the extreme ultraviolet laboratory exposure tool (EUVLET) designed and realized at RWTH Aachen, Chair for the Technology of Optical Systems (TOS), in cooperation with the Fraunhofer Institute for Laser Technology (ILT) and Bruker ASC GmbH. The main purpose of this laboratory setup is direct application in research facilities and companies with small-batch production, where the fabrication of high-resolution periodic arrays over large areas is required. The setup can also be utilized for resist characterization and for evaluation of pre- and post-exposure resist processing. The tool uses a partially coherent discharge-produced plasma (DPP) source, follows the Talbot lithography approach, and minimizes the number of other critical components to a transmission grating, the photoresist-coated wafer, and the positioning system for wafer and grating. To identify the limits of this approach, each component is first analyzed and optimized separately, and the relations between the components are identified. The EUV source has been optimized to achieve the best values for spatial and temporal coherence. Phase-shifting and amplitude transmission gratings have been fabricated and exposed. Several commercially available electron-beam resists and one EUV resist have been characterized by open-frame exposures to determine their contrast under EUV radiation. A cold development procedure has been performed to further increase the resist contrast. Analysis of the exposure results demonstrates that only a 1:1 copy of the mask structure can be fully resolved with amplitude masks, whereas the phase-shift masks offer higher first-order diffraction efficiency and allow a demagnification of the mask structure in the achromatic Talbot plane.
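
    For orientation, two standard Talbot-effect relations (textbook expressions, not taken from the paper) connect the grating period p, the wavelength λ, and the source bandwidth Δλ: self-images of the grating recur at integer multiples of the Talbot distance, and the stationary, wavelength-averaged pattern exploited by achromatic Talbot lithography forms beyond the achromatic Talbot distance,

$$ z_T = \frac{2p^2}{\lambda}, \qquad z_A \approx \frac{2p^2}{\Delta\lambda}. $$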

  15. Expediting the formulation development process with the aid of automated dissolution in analytical research and development.

    PubMed

    Sadowitz, J P

    2001-01-01

    The development of drugs in the generic pharmaceutical industry is a highly competitive arena, with companies vying for the few drug products that are coming off patent. Companies that have been successful in this arena are those that have met or surpassed the critical timelines associated with trial formulation development, analytical method development, and submission batch manufacturing and testing. Barr Laboratories, Inc. has been successful in the generic pharmaceutical industry for several reasons, one of which is automation. Analytical research and development at Barr has employed automated dissolution early in the lifecycle of a potential product. This approach has dramatically reduced the average time to market for a number of products. The key to this approach is the network infrastructure of the formulation and analytical research and development departments. At Barr, the cooperative ability to work and communicate together has driven the departments to streamline and matrix their work efforts and to optimize resources and time. The discussion references how Barr has been successful with automation and gives a case study of products that have moved at a rapid pace through the development cycle.

  16. IT Tools for Teachers and Scientists, Created by Undergraduate Researchers

    NASA Astrophysics Data System (ADS)

    Millar, A. Z.; Perry, S.

    2007-12-01

    Interns in the Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) program conduct computer science research for the benefit of earthquake scientists and have created products in growing use within the SCEC education and research communities. SCEC/UseIT comprises some twenty undergraduates who combine their varied talents and academic backgrounds to achieve a Grand Challenge that is formulated around the needs of SCEC scientists and educators and that reflects the value SCEC places on the integration of computer science and the geosciences. In meeting the challenge, students learn to work on multidisciplinary teams and to tackle complex problems with no guaranteed solutions. Meanwhile, their efforts bring fresh perspectives and insight to the professionals with whom they collaborate, and consistently produce innovative, useful tools for research and education. The 2007 Grand Challenge was to design and prototype serious games to communicate important earthquake science concepts. Interns organized themselves into four game teams, the Educational Game, the Training Game, the Mitigation Game and the Decision-Making Game, and created four diverse games with topics from elementary plate tectonics to earthquake risk mitigation, with intended players ranging from elementary students to city planners. The games were designed to be versatile, to accommodate variation in the knowledge base of the player; and extensible, to accommodate future additions. The games are played on a web browser or from within SCEC-VDO (Virtual Display of Objects). SCEC-VDO, also engineered by UseIT interns, is a 4D, interactive, visualization software that enables integration and exploration of datasets and models such as faults, earthquake hypocenters and ruptures, digital elevation models, satellite imagery, global isochrons, and earthquake prediction schemes. SCEC-VDO enables the user to create animated movies during a session, and is now part

  17. Stem diameter variations as a versatile research tool in ecophysiology.

    PubMed

    De Swaef, Tom; De Schepper, Veerle; Vandegehuchte, Maurits W; Steppe, Kathy

    2015-10-01

    High-resolution stem diameter variations (SDV) are widely recognized as a useful drought stress indicator and have therefore been used in many irrigation scheduling studies. More recently, SDV have been used in combination with other plant measurements and biophysical modelling to study fundamental mechanisms underlying whole-plant functioning and growth. The present review aims to scrutinize the important insights emerging from these more recent SDV applications to identify trends in ongoing fundamental research. The main mechanism underlying SDV is variation in water content in stem tissues, originating from reversible shrinkage and swelling of dead and living tissues, and irreversible growth. The contribution of different stem tissues to the overall SDV signal is currently under debate and shows variation with species and plant age, but can be investigated by combining SDV with state-of-the-art technology like magnetic resonance imaging. Various physiological mechanisms, such as water and carbon transport, and mechanical properties influence the SDV pattern, making it an extensive source of information on dynamic plant behaviour. To unravel these dynamics and to extract information on plant physiology or plant biophysics from SDV, mechanistic modelling has proved to be valuable. Biophysical models integrate different mechanisms underlying SDV, and help us to explain the resulting SDV signal. Using an elementary modelling approach, we demonstrate the application of SDV as a tool to examine plant water relations, plant hydraulics, plant carbon relations, plant nutrition, freezing effects, plant phenology and dendroclimatology. In the ever-expanding SDV knowledge base we identified two principal research tracks. First, in detailed short-term experiments, SDV measurements are combined with other plant measurements and modelling to discover patterns in phloem turgor, phloem osmotic concentrations, root pressure and plant endogenous control. Second, long-term SDV time
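
    To make the "elementary modelling approach" concrete, here is a deliberately simplified storage-flow sketch of reversible stem shrinkage and swelling. It is not the authors' published model, and every parameter value is invented for illustration.

```python
# Simplified sketch of reversible stem diameter variation: water exchange
# between the xylem and an elastic storage compartment (growth omitted).
import numpy as np

dt = 60.0                                  # time step [s]
t = np.arange(0, 2 * 86400, dt)            # two days of simulated time
# Hypothetical xylem water potential, more negative around midday:
psi_xylem = -0.3 - 0.7 * np.clip(np.sin(2 * np.pi * t / 86400), 0, None)  # [MPa]

R = 5e3     # xylem-to-storage flow resistance [MPa s g^-1] (assumed)
eps = 8.0   # elastic modulus of the storage tissue [MPa] (assumed)
W0 = 100.0  # reference water content of the storage tissue [g] (assumed)
k = 1e-2    # diameter change per gram of stored water [mm g^-1] (assumed)

W, diameter = W0, []
for psi_x in psi_xylem:
    psi_storage = eps * (W - W0) / W0      # water potential of storage tissue
    W += (psi_x - psi_storage) / R * dt    # flow into (or out of) storage
    diameter.append(k * (W - W0))          # deviation from reference diameter

# `diameter` now traces daytime shrinkage and nocturnal refilling, i.e. the
# reversible component of an SDV signal.
```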

  19. Conducting qualitative research in the British Armed Forces: theoretical, analytical and ethical implications.

    PubMed

    Finnegan, Alan

    2014-06-01

    The aim of qualitative research is to produce empirical evidence with data collected through means such as interviews and observation. Qualitative research encourages diversity in ways of thinking and in the methods used. Good studies produce a richness of data to provide new knowledge or address extant problems. However, qualitative research resulting in peer-reviewed publications within the Defence Medical Services (DMS) is a rarity. This article aims to help redress this balance by offering direction regarding qualitative research in the DMS, with a focus on choosing a theoretical framework, analysing the data, and obtaining ethical approval. Qualitative researchers need an understanding of the paradigms and theories that underpin methodological frameworks, and this article includes an overview of common theories in phenomenology, ethnography and grounded theory, and their application within the military. It explains qualitative coding: the process used to analyse data and shape the analytical framework. A popular four-phase approach, with examples from an operational nursing research study, is presented. Finally, it tackles the issue of ethical approval for qualitative studies and offers direction regarding the research proposal and participant consent. The few qualitative research studies undertaken in the DMS have offered innovative insights into defence healthcare, providing information to inform and change educational programmes and clinical practice. This article provides an extra resource for clinicians to encourage studies that will improve the operational capability of the British Armed Forces. It is anticipated that these guidelines are transferable to research in other Armed Forces and the military Veterans population.

  20. The capsicum transcriptome DB: a "hot" tool for genomic research.

    PubMed

    Góngora-Castillo, Elsa; Fajardo-Jaime, Rubén; Fernández-Cortes, Araceli; Jofre-Garfias, Alba E; Lozoya-Gloria, Edmundo; Martínez, Octavio; Ochoa-Alejo, Neftalí; Rivera-Bustamante, Rafael

    2012-01-01

    Chili pepper (Capsicum annuum) is an economically important crop with no publicly available genome sequence. We describe a genomic resource to facilitate Capsicum annuum research. A collection of Expressed Sequence Tags (ESTs) derived from five C. annuum organs (root, stem, leaf, flower and fruit) was sequenced using the Sanger method, and multiple leaf transcriptomes were deeply sampled using GS-pyrosequencing. A hybrid assembly of 1,324,516 raw reads yielded 32,314 high-quality contigs, as validated by coverage and identity analysis against existing pepper sequences. Overall, 75.5% of the contigs had significant sequence similarity to entries in nucleic acid and protein databases; 23% of the sequences have not been previously reported for C. annuum and expand the sequence resources for this species. A MySQL database and a user-friendly Web interface were constructed with search tools that permit queries of the ESTs, including sequence, functional annotation, Gene Ontology classification, metabolic pathways, and assembly information. The Capsicum Transcriptome DB is freely available at http://www.bioingenios.ira.cinvestav.mx:81/Joomla/
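
    To give a flavor of the kind of EST query such a Web interface supports, here is a self-contained sketch. The production system uses MySQL; sqlite3 stands in here so the example runs anywhere, and the schema, identifiers, and rows are entirely invented.

```python
# Hypothetical illustration of an organ- and GO-term-filtered EST query.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE contigs (
    contig_id TEXT PRIMARY KEY,
    organ TEXT,            -- root, stem, leaf, flower or fruit
    annotation TEXT,       -- best BLAST hit description
    go_term TEXT)""")
con.executemany("INSERT INTO contigs VALUES (?, ?, ?, ?)", [
    ("CA_000123", "fruit", "capsaicin synthase-like protein", "GO:0016747"),
    ("CA_000456", "leaf",  "chlorophyll a/b binding protein", "GO:0009765"),
])

# Find fruit-expressed contigs annotated with a given GO term.
for row in con.execute(
        "SELECT contig_id, annotation FROM contigs "
        "WHERE organ = ? AND go_term = ?", ("fruit", "GO:0016747")):
    print(row)
```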

  1. RTNS-II - a fusion materials research tool

    NASA Astrophysics Data System (ADS)

    Logan, C. M.; Heikkinen, D. W.

    1982-09-01

    Rotating Target Neutron Source-II (RTNS-II) is a national facility for fusion materials research. It contains two 14 MeV neutron sources. Deuterons are accelerated to ~400 keV and transported to a rotating titanium tritide target. Present source strength is greater than 1 × 10¹³ n/s and source diameter is 1 cm FWHM. An air-levitated vacuum seal permits rotation of the target at 5000 rpm with negligible impact on the accelerator vacuum system gas load. Targets are cooled by chilled water flowing through internal channels in a copper alloy substrate. Substrates are produced by solid-state diffusion bonding of two sheets, one containing etched cooling channels. An electroforming process is being developed which will reduce substrate cost and improve reliability. Titanium tritide coating thickness is ~10 μm, giving an initial tritium inventory for the present 23 cm diameter targets of 3.7 × 10⁷ MBq. The operating interval between target changes is typically about 80 h. Thirteen laboratories and universities have participated in the experimental program at RTNS-II. Most measurements have been directed at understanding defect production and low-dose damage microstructure. The principal diagnostic tools have been cryogenic resistivity measurements, mechanical properties assessment and transmission electron microscopy. Some engineering tests have been conducted in support of near-term magnetic confinement experiments and of reactor materials which will see small lifetime doses.

  2. Microgravity as a research tool to improve US agriculture

    NASA Astrophysics Data System (ADS)

    Bula, R. J.; Stankovic, Bratislav

    2000-01-01

    Crop production and utilization are undergoing significant modifications and improvements that emanate from adaptation of recently developed plant biotechnologies. Several innovative technologies will impact US agriculture in the next century. One of these is the transfer of desirable genes from organisms to economically important crop species in a way that cannot be accomplished with traditional plant breeding techniques. Such plant genetic engineering offers opportunities to improve crop species for a number of characteristics as well as use as source materials for specific medical and industrial applications. Although plant genetic engineering is having an impact on development of new crop cultivars, several major constraints limit the application of this technology to selected crop species and genotypes. Consequently, gene transfer systems that overcome these constraints would greatly enhance development of new crop materials. If results of a recent gene transfer experiment conducted in microgravity during a Space Shuttle mission are confirmed, and with the availability of the International Space Station as a permanent space facility, commercial plant transformation activity in microgravity could become a new research tool to improve US agriculture.

  3. Databases and registers: useful tools for research, no studies.

    PubMed

    Curbelo, Rafael J; Loza, Estíbaliz; de Yébenes, Maria Jesús García; Carmona, Loreto

    2014-04-01

    There are many misunderstandings about databases. Database is a commonly misused term in reference to any set of data entered into a computer. However, true databases serve a main purpose, organising data. They do so by establishing several layers of relationships; databases are hierarchical. Databases commonly organise data over different levels and over time, where time can be measured as the time between visits, or between treatments, or adverse events, etc. In this sense, medical databases are closely related to longitudinal observational studies, as databases allow the introduction of data on the same patient over time. Basically, we could establish four types of databases in medicine, depending on their purpose: (1) administrative databases, (2) clinical databases, (3) registers, and (4) study-oriented databases. But a database is a useful tool for a large variety of studies, not a type of study itself. Different types of databases serve very different purposes, and a clear understanding of the different research designs mentioned in this paper would prevent many of the databases we launch from being just a lot of work and very little science.

  4. Concept Mapping as a Research Tool to Evaluate Conceptual Change Related to Instructional Methods

    ERIC Educational Resources Information Center

    Miller, Kevin J.; Koury, Kevin A.; Fitzgerald, Gail E.; Hollingsead, Candice; Mitchem, Katherine J.; Tsai, Hui-Hsien; Park, Meeaeng Ko

    2009-01-01

    Concept maps are commonly used in a variety of educational settings as a learning aid or instructional tool. Additionally, their potential as a research tool has been recognized. This article defines features of concept maps, describes the use of pre- and postconcept maps as a research tool, and offers a protocol for employing concept maps as an…

  5. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    PubMed

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents the development of important analytical tools, namely sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA), and their first application to evaluate the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanols content were investigated. The juices and wines produced using different protocols were examined. Moreover, wines aged in tanks for 1, 2 and 3 months were analysed. The high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need for exogenous antioxidants, was particularly interesting. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by winemaking and pressing conditions, which required fine-tuning of the pressing. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and changed according to the winemaking procedure applied. Interestingly, the evolution during the first three months also varied depending on the procedure adopted. Organic synthesis of cysteine and glutathione conjugates was carried out and juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure, but did not show any significant change during the first 3 months of post-bottling ageing. This supports the theory

  6. Structural and compositional changes of dissolved organic matter upon solid-phase extraction tracked by multiple analytical tools.

    PubMed

    Chen, Meilian; Kim, Sunghwan; Park, Jae-Eun; Jung, Heon-Jae; Hur, Jin

    2016-09-01

    Although PPL-based solid-phase extraction (SPE) has been widely used before dissolved organic matter (DOM) analyses via advanced measurements such as ultrahigh-resolution Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR-MS), much is still unknown about the structural and compositional changes in the DOM pool caused by SPE. In this study, DOM samples from various sources were compared before and after SPE using multiple analytical tools, including fluorescence spectroscopy, FT-ICR-MS, and size exclusion chromatography with organic carbon detection (SEC-OCD). Changes in specific UV absorbance indicated a decrease in aromaticity after SPE, suggesting a preferential exclusion of aromatic DOM structures, which was also confirmed by the substantial reduction of fluorescent DOM (FDOM). Furthermore, SEC-OCD results exhibited very low recoveries (1-9%) for the biopolymer fraction, implying that PPL sorbents need to be used cautiously when treating high molecular weight compounds (i.e., polysaccharides, proteins, and amino sugars). A careful examination via FT-ICR-MS revealed that the formulas lost in SPE might all be DOM source-dependent. Nevertheless, the dominant missing compound groups were identified as tannins with high O/C ratios (>0.7), lignins/carboxyl-rich alicyclic molecules (CRAM), aliphatics with H/C >1.5, and heteroatomic formulas, all of which were dominated by pseudo-analogous molecular formula families differing in methylene (-CH2) units. Our findings shed new light on potential changes in the compound composition and molecular weight of DOM upon SPE, implying that precautions are needed in data interpretation. Graphical Abstract: Tracking the characteristics of DOM from various origins upon PPL-based SPE utilizing EEM-PARAFAC, SEC-OCD, and FT-ICR-MS.
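
    The compound groups named above are conventionally screened by elemental ratios in van Krevelen space; the sketch below shows that style of classification. The thresholds are rough literature conventions, not the authors' exact operational definitions, and the formulas are invented.

```python
# Van Krevelen-style screening of assigned molecular formulas by O/C and H/C.
def classify(C: int, H: int, O: int) -> str:
    oc, hc = O / C, H / C
    if oc > 0.7:
        return "tannin-like (high O/C)"
    if hc > 1.5:
        return "aliphatic-like (high H/C)"
    if 0.1 <= oc <= 0.7 and 0.7 <= hc <= 1.5:
        return "lignin/CRAM-like"
    return "other"

# A CH2 (methylene) homologue differs by one carbon and two hydrogens:
base = (18, 18, 9)                      # C18 H18 O9, an invented formula
homologue = (base[0] + 1, base[1] + 2, base[2])
print(classify(*base), classify(*homologue))  # same family, same class
```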

  8. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    NASA Astrophysics Data System (ADS)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action - to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how the teaching of climate change can be done in such a way as to ascribe agency - a willingness to act - to students. Framing - as both a theory and an analytic method - has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it could be problematic if it were the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse - global scale, scientific statistics and facts, and impact on the Earth's systems - are not likely to inspire action-taking. This

  9. Research subjects for analytical estimation of core degradation at Fukushima-Daiichi nuclear power plant

    SciTech Connect

    Nagase, F.; Ishikawa, J.; Kurata, M.; Yoshida, H.; Kaji, Y.; Shibamoto, Y.; Amaya, M; Okumura, K.; Katsuyama, J.

    2013-07-01

    Estimation of the accident progression and of the status inside the reactor pressure vessels (RPV) and primary containment vessels (PCV) is required for the appropriate conduct of decommissioning at the Fukushima-Daiichi NPP. This requires additional experimental data and revised models so that estimates made with computer codes have increased accuracy. The Japan Atomic Energy Agency (JAEA) has selected phenomena to be reviewed and developed, considering previously obtained information, conditions specific to the Fukushima-Daiichi NPP accident, and recent progress in experimental and analytical technologies. As a result, research and development items have been identified concerning thermal-hydraulic behavior in the RPV and PCV, progression of fuel bundle degradation, failure of the lower head of the RPV, and analysis of the accident. This paper introduces the selected phenomena to be reviewed and developed, the research plans, and recent results from JAEA's corresponding research programs. (authors)

  10. Role of nuclear analytical probe techniques in biological trace element research

    SciTech Connect

    Jones, K.W.; Pounds, J.G.

    1985-01-01

    Many biomedical experiments require the qualitative and quantitative localization of trace elements with high sensitivity and good spatial resolution. The feasibility of measuring the chemical form of the elements and the time course of trace element metabolism, and of conducting experiments in living biological systems, is also an important requirement for biological trace element research. Nuclear analytical techniques that employ ion or photon beams have grown in importance in the past decade and have led to several new experimental approaches. Some of the important features of these methods are reviewed here along with their role in trace element research, and examples of their use are given to illustrate the potential for new research directions. It is emphasized that the effective application of these methods necessitates a closely integrated multidisciplinary scientific team. 21 refs., 4 figs., 1 tab.

  11. Concept Maps as a Research and Evaluation Tool To Assess Conceptual Change in Quantum Physics.

    ERIC Educational Resources Information Center

    Sen, Ahmet Ilhan

    2002-01-01

    Informs teachers about using concept maps as learning and alternative assessment tools in education. Presents research results on how students might use concept maps to communicate their cognitive structure. (Author/KHR)

  12. The MOOC and Learning Analytics Innovation Cycle (MOLAC): A Reflective Summary of Ongoing Research and Its Challenges

    ERIC Educational Resources Information Center

    Drachsler, H.; Kalz, M.

    2016-01-01

    The article deals with the interplay between learning analytics and massive open online courses (MOOCs) and provides a conceptual framework to situate ongoing research in the MOOC and learning analytics innovation cycle (MOLAC framework). The MOLAC framework is organized on three levels: On the micro-level, the data collection and analytics…

  13. Giving Raw Data a Chance to Talk: A Demonstration of Exploratory Visual Analytics with a Pediatric Research Database Using Microsoft Live Labs Pivot to Promote Cohort Discovery, Research, and Quality Assessment

    PubMed Central

    Viangteeravat, Teeradache; Nagisetty, Naga Satya V. Rao

    2014-01-01

    Secondary use of large and open data sets provides researchers with an opportunity to address high-impact questions that would otherwise be prohibitively expensive and time consuming to study. Despite the availability of data, generating hypotheses from huge data sets is often challenging, and the lack of complex analysis of data might lead to weak hypotheses. To overcome these issues and to assist researchers in building hypotheses from raw data, we are working on a visual and analytical platform called PRD Pivot. PRD Pivot is a de-identified pediatric research database designed to make secondary use of rich data sources, such as the electronic health record (EHR). The development of visual analytics using Microsoft Live Labs Pivot makes the process of data elaboration, information gathering, knowledge generation, and complex information exploration transparent to tool users and provides researchers with the ability to sort and filter by various criteria, which can lead to strong, novel hypotheses.

  15. Research progress of pharmacological activities and analytical methods for plant origin proteins.

    PubMed

    Li, Chun-hong; Chen, Cen; Xia, Zhi-ning; Yang, Feng-qing

    2015-07-01

    As one of the important active components of traditional Chinese medicine (TCM), plant origin active proteins have many significant pharmacological functions. Drawing on research into plant origin active proteins reported in recent years, their pharmacological effects, including anti-tumor, immune-regulating, antioxidant, anti-pathogenic microorganism, anti-thrombotic, hypolipidemic, and hypoglycemic activities, are reviewed. In addition, the analytical methods used for plant origin proteins, including chromatography, spectroscopy, electrophoresis, and mass spectrometry, are summarized. The main purpose of this paper is to provide a reference for the future development and application of plant active proteins.

  16. Dynamic Visual Acuity: a Functionally Relevant Research Tool

    NASA Technical Reports Server (NTRS)

    Peters, Brian T.; Brady, Rachel A.; Miller, Chris A.; Mulavara, Ajitkumar P.; Wood, Scott J.; Cohen, Helen S.; Bloomberg, Jacob J.

    2010-01-01

    Coordinated movements between the eyes and head are required to maintain a stable retinal image during head and body motion. The vestibulo-ocular reflex (VOR) plays a significant role in this gaze control system, which functions well for most daily activities. However, certain environmental conditions or interruptions in normal VOR function can lead to inadequate ocular compensation, resulting in oscillopsia, or blurred vision. It is therefore possible to use acuity to determine when the environmental conditions, VOR function, or the combination of the two is not conducive to maintaining clear vision. Over several years we have designed and tested a number of tests of dynamic visual acuity (DVA). Early tests used the difference between standing and walking acuity to assess decrements in the gaze stabilization system after spaceflight. Supporting ground-based studies measured the responses of patients with bilateral vestibular dysfunction and explored the effects of visual target viewing distance and gait cycle events on walking acuity. Results from these studies show that DVA is affected by spaceflight, is degraded in patients with vestibular dysfunction, changes with target distance, and is not consistent across the gait cycle. We have recently expanded our research to include studies in which seated subjects are translated or rotated passively. Preliminary results from this work indicate that gaze stabilization ability may differ between similar active and passive conditions, may change with age, and can be affected by the location of the visual target with respect to the axis of motion. Use of DVA as a diagnostic tool is becoming more popular, but the functional nature of the acuity outcome measure also makes it ideal for identifying conditions that could lead to degraded vision. By doing so, steps can be taken to alter problematic environments to improve the man-machine interface and optimize performance.

  17. Advanced Methods in Meta-Analytic Research: Applications and Implications for Rehabilitation Counseling Research

    ERIC Educational Resources Information Center

    Rosenthal, David A.; Hoyt, William T.; Ferrin, James M.; Miller, Susan; Cohen, Nicholas D.

    2006-01-01

    Over the past 25 years, meta-analysis has assumed a significant role in the synthesis of counseling and psychotherapy research through the evaluation and interpretation of the results of multiple studies. An examination of four widely recognized rehabilitation counseling journals, however, reveals that only one meta-analysis (Bolton & Akridge,…

  18. Researcher Effects on Mortality Salience Research: A Meta-Analytic Moderator Analysis

    ERIC Educational Resources Information Center

    Yen, Chih-Long; Cheng, Chung-Ping

    2013-01-01

    A recent meta-analysis of 164 terror management theory (TMT) papers indicated that mortality salience (MS) yields substantial effects (r = 0.35) on worldview and self-esteem-related dependent variables (B. L. Burke, A. Martens, & E. H. Faucher, 2010). This study reanalyzed the data to explore the researcher effects of TMT. By cluster-analyzing…

  19. Applied analytical combustion/emissions research at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Deur, J. M.; Kundu, K. P.; Nguyen, H. L.

    1992-01-01

    Emissions of pollutants from future commercial transports are a significant concern. As a result, the Lewis Research Center (LeRC) is investigating various low emission combustor technologies. As part of this effort, a combustor analysis code development program was pursued to guide the combustor design process, to identify concepts having the greatest promise, and to optimize them at the lowest cost in the minimum time.

  20. Applied Analytical Combustion/emissions Research at the NASA Lewis Research Center - a Progress Report

    NASA Technical Reports Server (NTRS)

    Deur, J. M.; Kundu, K. P.; Nguyen, H. L.

    1992-01-01

    Emissions of pollutants from future commercial transports are a significant concern. As a result, the Lewis Research Center (LeRC) is investigating various low emission combustor technologies. As part of this effort, a combustor analysis code development program was pursued to guide the combustor design process, to identify concepts having the greatest promise, and to optimize them at the lowest cost in the minimum time.

  1. Research Tools, Tips, and Resources for Financial Aid Administrators. Monograph, A NASFAA Series.

    ERIC Educational Resources Information Center

    Mohning, David D.; Redd, Kenneth E.; Simmons, Barry W., Sr.

    This monograph provides research tools, tips, and resources to financial aid administrators who need to undertake research tasks. It answers: What is research? How can financial aid administrators get started on research projects? What resources are available to help answer research questions quickly and accurately? How can research efforts assist…

  2. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.
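
    As a flavor of what a simple departure traffic-flow model can capture, the sketch below simulates a single-runway departure queue and shows taxi-out delay growing as pushback demand approaches runway capacity. This is a generic queueing illustration with invented rates, not the authors' calibrated model.

```python
# Toy single-runway departure queue: delay versus pushback demand.
import random

def simulate_taxi_out(pushbacks_per_min: float, runway_rate_per_min: float,
                      minutes: int = 180, seed: int = 1) -> float:
    """Return mean taxi-out delay (min) over one simulated traffic period."""
    random.seed(seed)
    delays, t, t_free = [], 0.0, 0.0
    while t < minutes:
        t += random.expovariate(pushbacks_per_min)     # next pushback time
        start = max(t, t_free)                         # wait for the runway
        t_free = start + random.expovariate(runway_rate_per_min)
        delays.append(start - t)
    return sum(delays) / len(delays)

# Delay grows sharply as demand approaches capacity; "windowing" congestion
# control aims to keep the demand rate below this knee.
for demand in (0.4, 0.6, 0.8):
    print(demand, round(simulate_taxi_out(demand, 1.0), 2))
```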

  3. Computer as Research Tools 4. Use Your PC More Effectively

    NASA Astrophysics Data System (ADS)

    Baba, Hajime

    This article surveys useful tools on personal computers: electronic dictionaries, full-text search systems, simple usage of the preprint server, and a numerical computation language for applications in engineering and science.

  4. Analytical and physical modeling program for the NASA Lewis Research Center's Altitude Wind Tunnel (AWT)

    NASA Technical Reports Server (NTRS)

    Abbott, J. M.; Deidrich, J. H.; Groeneweg, J. F.; Povinelli, L. A.; Reid, L.; Reinmann, J. J.; Szuch, J. R.

    1985-01-01

    An effort is currently underway at the NASA Lewis Research Center to rehabilitate and extend the capabilities of the Altitude Wind Tunnel (AWT). This extended capability will include a maximum test section Mach number of about 0.9 at an altitude of 55,000 ft and a -20 F stagnation temperature (octagonal test section, 20 ft across the flats). In addition, the AWT will include an icing and acoustic research capability. In order to ensure a technically sound design, an AWT modeling program (both analytical and physical) was initiated to provide essential input to the AWT final design process. This paper describes the modeling program, including the rationale and criteria used in program definition, and presents some early program results.

  5. Big data, advanced analytics and the future of comparative effectiveness research.

    PubMed

    Berger, Marc L; Doban, Vitalii

    2014-03-01

    The intense competition that accompanied the growth of internet-based companies ushered in the era of 'big data' characterized by major innovations in processing of very large amounts of data and the application of advanced analytics including data mining and machine learning. Healthcare is on the cusp of its own era of big data, catalyzed by the changing regulatory and competitive environments, fueled by growing adoption of electronic health records, as well as efforts to integrate medical claims, electronic health records and other novel data sources. Applying the lessons from big data pioneers will require healthcare and life science organizations to make investments in new hardware and software, as well as in individuals with different skills. For life science companies, this will impact the entire pharmaceutical value chain from early research to postcommercialization support. More generally, this will revolutionize comparative effectiveness research.

  6. A collaborative approach to develop a multi-omics data analytics platform for translational research.

    PubMed

    Schumacher, Axel; Rujan, Tamas; Hoefkens, Jens

    2014-12-01

    The integration and analysis of large datasets in translational research has become an increasingly challenging problem. We propose a collaborative approach that integrates established data management platforms with existing analytical systems to fill the gap in the value chain between data collection and data exploitation. Our proposal in particular ensures data security and provides support for widely distributed teams of researchers. As a successful example of such an approach, we describe the implementation of a unified single platform that combines capabilities of the knowledge management platform tranSMART and the data analysis system Genedata Analyst™. The combined end-to-end platform helps to quickly find, enter, integrate, analyze, extract, and share patient- and drug-related data in the context of translational R&D projects.

  7. MEETING TODAY'S EMERGING CONTAMINANTS WITH TOMORROW'S RESEARCH TOOL

    EPA Science Inventory

    This presentation will explore the many facets of research and development for emerging contaminants within the USEPA's National Exposure Research Laboratories (Athens, Cincinnati, Las Vegas, and Research Triangle Park).

  8. Variance decomposition: a tool enabling strategic improvement of the precision of analytical recovery and concentration estimates associated with microorganism enumeration methods.

    PubMed

    Schmidt, P J; Emelko, M B; Thompson, M E

    2014-05-15

    Concentrations of particular types of microorganisms are commonly measured in various waters, yet the accuracy and precision of reported microorganism concentration values are often questioned due to the imperfect analytical recovery of quantitative microbiological methods and the considerable variation among fully replicated measurements. The random error in analytical recovery estimates and unbiased concentration estimates may be attributable to several sources, and knowing the relative contribution from each source can facilitate strategic design of experiments to yield more precise data or provide an acceptable level of information with fewer data. Herein, variance decomposition using the law of total variance is applied to previously published probabilistic models to explore the relative contributions of various sources of random error and to develop tools to aid experimental design. This work focuses upon enumeration-based methods with imperfect analytical recovery (such as enumeration of Cryptosporidium oocysts), but the results also yield insights about plating methods and microbial methods in general. Using two hypothetical analytical recovery profiles, the variance decomposition method is used to explore 1) the design of an experiment to quantify variation in analytical recovery (including the size and precision of seeding suspensions and the number of samples), and 2) the design of an experiment to estimate a single microorganism concentration (including sample volume, effects of improving analytical recovery, and replication). In one illustrative example, a strategically designed analytical recovery experiment with 6 seeded samples would provide as much information as an alternative experiment with 15 seeded samples. Several examples of diminishing returns are illustrated to show that efforts to reduce error in analytical recovery and concentration estimates can have negligible effect if they are directed at trivial error sources.
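
    The decomposition the abstract relies on is the law of total variance, Var(X) = E[Var(X|Y)] + Var(E[X|Y]). The sketch below checks it by Monte Carlo for a seeded enumeration experiment with imperfect recovery; the hierarchical model and all parameter values are illustrative stand-ins, not the authors' published recovery profiles.

```python
# For a seeded sample, model the observed count as X = Binomial(N, P), with
# N ~ Poisson(c*V) organisms present and P ~ Beta(a, b) the analytical
# recovery. Law of total variance: Var(X) = E[Var(X|N,P)] + Var(E[X|N,P]).
import numpy as np

rng = np.random.default_rng(42)
c, V = 10.0, 10.0                 # concentration [1/L], sample volume [L]
a, b = 8.0, 4.0                   # recovery ~ Beta(8, 4), mean 2/3 (assumed)

n = rng.poisson(c * V, size=200_000)
p = rng.beta(a, b, size=200_000)
x = rng.binomial(n, p)

total = x.var()
within = (n * p * (1 - p)).mean()  # E[Var(X | N, P)]: counting error
between = (n * p).var()            # Var(E[X | N, P]): seeding + recovery error
print(total, within + between)     # the two agree up to Monte Carlo error
```

    Comparing the `within` and `between` terms shows which error source dominates, which is exactly the information needed to decide whether, say, a larger seeding suspension or more replicates buys more precision.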

  9. Typology of Analytical Errors in Qualitative Educational Research: An Analysis of the 2003-2007 Education Science Dissertations in Turkey

    ERIC Educational Resources Information Center

    Karadag, Engin

    2010-01-01

    This research attempts to identify the quality of the qualitative research designs used, and the analytic mistakes made, in doctoral dissertations carried out in the field of education science in Turkey. A case study design employing qualitative research techniques was applied. The universe…

  10. Accelerator mass spectrometry as a bioanalytical tool for nutritional research

    SciTech Connect

    Vogel, J.S.; Turteltaub, K.W.

    1997-09-01

    Accelerator Mass Spectrometry is a mass spectrometric method of detecting long-lived radioisotopes without regard to their decay products or half-life. The technique is normally applied to geochronology, but has recently been developed for bioanalytical tracing. AMS detects isotope concentrations to parts per quadrillion, quantifying labeled biochemicals to attomole levels in milligram-sized samples. Its advantages over non-isotopic and stable isotope labeling methods are reviewed, and examples of analytical integrity, sensitivity, specificity, and applicability are provided.

  11. Analytical combustion/emissions research related to the NASA high-speed research program

    NASA Technical Reports Server (NTRS)

    Nguyen, H. Lee

    1991-01-01

    Increasing the pressure and temperature of the engines of new generation supersonic airliners increases the emissions of nitrogen oxides to a level that would have an adverse impact on the Earth's protective ozone layer. In the process of implementing low emissions combustor technologies, NASA Lewis Research Center has pursued a combustion analysis program to guide combustor design processes, to identify potential concepts of greatest promise, and to optimize them at low cost, with short turn-around time. The approach is to upgrade and apply advanced computer programs for gas turbine applications. Efforts have been made to improve the code capabilities of modeling the physics. Test cases and experiments are used for code validation. To provide insight into the combustion process and combustor design, two-dimensional and three-dimensional codes such as KIVA-II and LeRC 3D have been used. These codes are operational and calculations have been performed to guide low emissions combustion experiments.

  12. Is research working for you? validating a tool to examine the capacity of health organizations to use research

    PubMed Central

    Kothari, Anita; Edwards, Nancy; Hamel, Nadia; Judd, Maria

    2009-01-01

    Background 'Is research working for you? A self-assessment tool and discussion guide for health services management and policy organizations', developed by the Canadian Health Services Research Foundation, is a tool that can help organizations understand their capacity to acquire, assess, adapt, and apply research. Objectives were to: determine whether the tool demonstrated response variability; describe how the tool differentiated between organizations that were known to be lower-end or higher-end research users; and describe the potential usability of the tool. Methods Thirty-two focus groups were conducted among four sectors of Canadian health organizations. In the first hour of the focus group, participants individually completed the tool and then derived a group consensus ranking on items. In the second hour, the facilitator asked about overall impressions of the tool, to identify insights that emerged during the review of items on the tool and to elicit comments on research utilization. Discussion data were analyzed qualitatively, and individual and consensus item scores were analyzed using descriptive and non-parametric statistics. Results The tool demonstrated good usability and strong response variability. Differences between higher-end and lower-end research use organizations on scores suggested that this tool has adequate discriminant validity. The group discussion based on the tool was the more useful aspect of the exercise, rather than the actual score assigned. Conclusion The tool can serve as a catalyst for an important discussion about research use at the organizational level; such a discussion, in and of itself, demonstrates potential as an intervention to encourage processes and supports for research translation.

  13. "This Ain't the Projects": A Researcher's Reflections on the Local Appropriateness of Our Research Tools

    ERIC Educational Resources Information Center

    Martinez, Danny C.

    2016-01-01

    In this article I examine the ways in which Black and Latina/o urban high school youth pressed me to reflexively examine my positionality and that of my research tools during a year-long ethnographic study documenting their communicative repertoires. I reflect on youth comments on my researcher tools, as well as myself, in order to wrestle with…

  14. Isotope pattern deconvolution as rising tool for isotope tracer studies in environmental research

    NASA Astrophysics Data System (ADS)

    Irrgeher, Johanna; Zitek, Andreas; Prohaska, Thomas

    2014-05-01

    During the last decade, stable isotope tracers have emerged as a versatile tool in ecological research. Besides 'intrinsic' isotope tracers arising from the natural variation of isotopes, the intentional introduction of 'extrinsic' enriched stable isotope tracers into biological systems has gained significant interest. The induced change in the natural isotopic composition of an element allows, among other things, the fate and fluxes of metals, trace elements, and species in organisms to be studied, or provides an intrinsic marker or tag for particular biological samples. Given the vast potential of this methodology, the number of publications dealing with applications of isotope (double) spikes as tracers to address research questions in 'real world systems' is constantly increasing. However, some isotope systems, like the natural Sr isotopic system, although potentially very powerful for this type of application, are still rarely used, mainly because their adequate measurement poses major analytical challenges, e.g. because Sr is present in significant amounts in natural samples. In addition, biological systems are subject to complex processes such as metabolism, adsorption/desorption, or oxidation/reduction. As a consequence, classic evaluation approaches such as the isotope dilution mass spectrometry equation are often not applicable because the amount of tracer finally present in the sample is unknown. Isotope pattern deconvolution (IPD), based on multiple linear regression, serves as a simplified alternative data processing strategy to double-spike isotope dilution calculations. The outstanding advantage of this mathematical tool lies in the possibility of deconvolving the isotope pattern in a spiked sample without knowing the quantity of enriched isotope tracer incorporated into the natural sample matrix, or the degree of impurities and species interconversion (e.g. from sample preparation). Here, the potential of IPD for environmental tracer
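
    In essence, IPD writes the measured isotope pattern as a linear mixture of known source patterns and solves for the molar fractions by least squares. A minimal sketch for the Sr system follows; the natural abundances are rounded textbook values, while the tracer pattern and the measured sample are invented.

```python
# Isotope pattern deconvolution by multiple linear regression (illustrative).
import numpy as np

isotopes = ["84Sr", "86Sr", "87Sr", "88Sr"]
natural = np.array([0.0056, 0.0986, 0.0700, 0.8258])  # natural Sr abundances
tracer  = np.array([0.9500, 0.0300, 0.0050, 0.0150])  # 84Sr-enriched spike (assumed)

A = np.column_stack([natural, tracer])   # source patterns as columns

# Hypothetical measured pattern from a sample whose Sr is 10% tracer-derived:
measured = 0.9 * natural + 0.1 * tracer

fractions, residuals, rank, _ = np.linalg.lstsq(A, measured, rcond=None)
print(dict(zip(["natural", "tracer"], fractions.round(4))))  # -> 0.9 / 0.1
```

    Note that no absolute tracer quantity enters the calculation: only the patterns are needed, which is precisely why IPD sidesteps the unknown-tracer-amount problem described above.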

  15. Improving Students' Understanding of Quantum Measurement. II. Development of Research-Based Learning Tools

    ERIC Educational Resources Information Center

    Zhu, Guangtian; Singh, Chandralekha

    2012-01-01

    We describe the development and implementation of research-based learning tools such as the Quantum Interactive Learning Tutorials and peer-instruction tools to reduce students' common difficulties with issues related to measurement in quantum mechanics. A preliminary evaluation shows that these learning tools are effective in improving students'…

  16. PARAMO: A Parallel Predictive Modeling Platform for Healthcare Analytic Research using Electronic Health Records

    PubMed Central

    Ng, Kenney; Ghoting, Amol; Steinhubl, Steven R.; Stewart, Walter F.; Malin, Bradley; Sun, Jimeng

    2014-01-01

    Objective Healthcare analytics research increasingly involves the construction of predictive models for disease targets across varying patient cohorts using electronic health records (EHRs). To facilitate this process, it is critical to support a pipeline of tasks: 1) cohort construction, 2) feature construction, 3) cross-validation, 4) feature selection, and 5) classification. To develop an appropriate model, it is necessary to compare and refine models derived from a diversity of cohorts, patient-specific features, and statistical frameworks. The goal of this work is to develop and evaluate a predictive modeling platform that can be used to simplify and expedite this process for health data. Methods To support this goal, we developed a PARAllel predictive MOdeling (PARAMO) platform which 1) constructs a dependency graph of tasks from specifications of predictive modeling pipelines, 2) schedules the tasks in a topological ordering of the graph, and 3) executes those tasks in parallel. We implemented this platform using Map-Reduce to enable independent tasks to run in parallel in a cluster computing environment. Different task scheduling preferences are also supported. Results We assess the performance of PARAMO on various workloads using three datasets derived from the EHR systems in place at Geisinger Health System and Vanderbilt University Medical Center and an anonymous longitudinal claims database. We demonstrate significant gains in computational efficiency against a standard approach. In particular, PARAMO can build 800 different models on a 300,000 patient data set in 3 hours in parallel compared to 9 days if running sequentially. Conclusion This work demonstrates that an efficient parallel predictive modeling platform can be developed for EHR data. This platform can facilitate large-scale modeling endeavors and speed-up the research workflow and reuse of health information. This platform is only a first step and provides the foundation for our ultimate
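
    The scheduling idea, building a dependency graph of pipeline tasks, ordering it topologically, and executing ready tasks in parallel, can be sketched in a few lines. This is a toy stand-in (Python's graphlib plus a thread pool) for illustration, not PARAMO's Map-Reduce implementation; the task names are simplified from the pipeline stages listed above.

```python
# Dependency-graph scheduling of a modeling pipeline (illustrative only).
from concurrent.futures import ThreadPoolExecutor
from graphlib import TopologicalSorter

def run(task: str) -> None:
    print(f"running {task}")           # placeholder for real pipeline work

# task -> set of prerequisite tasks (one simplified modeling pipeline)
graph = {
    "cohort": set(),
    "features": {"cohort"},
    "cv_split": {"features"},
    "feature_selection": {"cv_split"},
    "classify": {"feature_selection"},
}

ts = TopologicalSorter(graph)
ts.prepare()
with ThreadPoolExecutor(max_workers=4) as pool:
    while ts.is_active():
        ready = list(ts.get_ready())   # tasks whose dependencies are all met
        list(pool.map(run, ready))     # execute this wave in parallel
        ts.done(*ready)                # unlock dependent tasks
```

    With many independent pipelines (different cohorts, features, and models), whole branches of the graph run concurrently, which is where the reported speedup over sequential execution comes from.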

  17. Examination of the capability of the laser-induced breakdown spectroscopy (LIBS) technique as the emerging laser-based analytical tool for analyzing trace elements in coal

    NASA Astrophysics Data System (ADS)

    Idris, N.; Ramli, M.; Mahidin; Hedwig, R.; Lie, Z. S.; Kurniawan, K. H.

    2014-09-01

    Owing to its advantages over conventional analytical tools, laser-induced breakdown spectroscopy (LIBS) is becoming an important emerging analytical technique. The technique is based on the use of optical emission from a laser-induced plasma for spectrochemical analysis of the constituents and content of the sampled object. Its capability is examined here for the analysis of trace elements in coal. Coal is a difficult sample to analyze due to its complex chemical composition and physical properties; it inherently contains trace elements, including heavy metals, so its mining, beneficiation, and utilization pose hazards to the environment and to human beings. The LIBS apparatus was composed of a laser system (Nd-YAG: Quanta Ray; LAB SERIES; 1,064 nm; 500 mJ; 8 ns) and an optical detector (McPherson model 2061; 1,000 mm focal length; f/8.6 Czerny-Turner) equipped with an Andor I*Star intensified CCD of 1024×256 pixels. The laser beam was focused onto the coal sample with a +250 mm focusing lens. The plasma emission was collected by a fiber optic and sent to the spectrograph. The coal samples were taken from the Province of Aceh. Several trace elements, including heavy metals (As, Mn, Pb), were clearly observed, demonstrating the potential of the LIBS technique for analyzing trace elements in coal.

  19. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    PubMed

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus-a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software-an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. PMID:27098025
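
    To give a flavor of the notebook-based workflow the authors describe, the sketch below applies a common robust-threshold spike-detection step to a synthetic voltage trace. The data are simulated; this is not the published vagus recording or the authors' exact code.

    ```python
    # Threshold-based spike detection on a synthetic trace, of the kind one
    # might run interactively in a Jupyter notebook.
    import numpy as np

    rng = np.random.default_rng(0)
    fs = 20_000                              # sampling rate in Hz (assumed)
    trace = rng.normal(0.0, 1.0, fs)         # one second of baseline noise
    for t in (2_000, 7_500, 15_000):         # inject three crude spikes
        trace[t:t + 20] += 8.0

    # Robust noise estimate from the median absolute deviation.
    sigma = np.median(np.abs(trace)) / 0.6745
    threshold = 5 * sigma

    # Rising threshold crossings mark candidate spike onsets.
    onsets = np.flatnonzero((trace[1:] >= threshold) & (trace[:-1] < threshold))
    print("candidate spike onsets (sample index):", onsets)
    ```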

  1. Analytical Validation of AmpliChip p53 Research Test for Archival Human Ovarian FFPE Sections.

    PubMed

    Marton, Matthew J; McNamara, Andrew R; Nikoloff, D Michele; Nakao, Aki; Cheng, Jonathan

    2015-01-01

    The p53 tumor suppressor gene (TP53) is reported to be mutated in nearly half of all tumors and plays a central role in genome integrity. Detection of mutations in p53 can be accomplished by many assays, including the AmpliChip p53 Research Test. The AmpliChip p53 Research Test has been successfully used to determine p53 status in hematologic malignancies and fresh frozen solid tissues, but there are few reports of using the assay with formalin-fixed, paraffin-embedded (FFPE) tissue. The objective of this study was to describe analytical performance characterization of the AmpliChip p53 Research Test to detect p53 mutations in genomic DNA isolated from archival FFPE human ovarian tumor tissues. Method correlation with sequencing showed 96% mutation-wise agreement and 99% chip-wise agreement. We furthermore observed 100% agreement (113/113) of the most prevalent TP53 mutations. Workflow reproducibility was 96.8% across 8 samples, with 2 operators, 2 reagent lots and 2 instruments. Section-to-section reproducibility was 100% for each sample across a 60 μm region of the FFPE block from ovarian tumors. These data indicate that the AmpliChip p53 Research Test is an accurate and reproducible method for detecting mutations in TP53 from archival FFPE human ovarian specimens. PMID:26125596

  2. Spatial and Temporal Oxygen Dynamics in Macrofaunal Burrows in Sediments: A Review of Analytical Tools and Observational Evidence

    PubMed Central

    Satoh, Hisashi; Okabe, Satoshi

    2013-01-01

    The availability of benthic O2 plays a crucial role in benthic microbial communities and regulates many important biogeochemical processes. Burrowing activities of macrobenthos in the sediment significantly affect O2 distribution and its spatial and temporal dynamics in burrows, followed by alterations of sediment microbiology. Consequently, numerous research groups have investigated O2 dynamics in macrofaunal burrows. The introduction of powerful tools, such as microsensors and planar optodes, to sediment analysis has greatly enhanced our ability to measure O2 dynamics in burrows at high spatial and temporal resolution with minimal disturbance of the physical structure of the sediment. In this review, we summarize recent studies of O2-concentration measurements in burrows with O2 microsensors and O2 planar optodes. This manuscript mainly focuses on the fundamentals of O2 microsensors and O2 planar optodes, and their application in the direct measurement of the spatial and temporal dynamics of O2 concentrations in burrows, which have not previously been reviewed, and will be a useful supplement to recent literature reviews on O2 dynamics in macrofaunal burrows. PMID:23594972

  3. Chaos Modeling: Increasing Educational Researchers' Awareness of a New Tool.

    ERIC Educational Resources Information Center

    Bobner, Ronald F.; And Others

    Chaos theory is being used as a tool to study a wide variety of phenomena. It is a philosophical and empirical approach that attempts to explain relationships previously thought to be totally random. Although some relationships are truly random, many data appear to be random but reveal repeatable patterns of behavior under further investigation.…

  4. Specially Made for Science: Researchers Develop Online Tools For Collaborations

    ERIC Educational Resources Information Center

    Guterman, Lila

    2008-01-01

    Blogs, wikis, and social-networking sites such as Facebook may get media buzz these days, but for scientists, engineers, and doctors, they are not even on the radar. The most effective tools of the Internet for such people tend to be efforts more narrowly aimed at their needs, such as software that helps geneticists replicate one another's…

  5. [Small compounds libraries: a research tool for chemical biology].

    PubMed

    Florent, Jean-Claude

    2013-01-01

    Obtaining and screening collections of small molecules remain a challenge for biologists. Recent advances in analytical techniques and instrumentation now make screening possible in academia. The history of the creation of such public or commercial collections, and of their accessibility, is recounted. It shows that it is in the interest of an academic laboratory involved in medicinal chemistry, chemogenomics or "chemical biology" to organize its own collection and make it available through existing networks such as the French National chimiothèque or the European partner network "European Infrastructure of open screening platforms for Chemical Biology" (EU-OpenScreen), currently under construction.

  6. "Mythbusters": A Tool for Teaching Research Methods in Psychology

    ERIC Educational Resources Information Center

    Burkley, Edward; Burkley, Melissa

    2009-01-01

    "Mythbusters" uses multiple research methods to test interesting topics, offering research methods students an entertaining review of course material. To test the effectiveness of "Mythbusters" clips in a psychology research methods course, we systematically selected and showed 4 clips. Students answered questions about the clips, offered their…

  7. Tools for Monitoring Social Media: A Marketing Research Project

    ERIC Educational Resources Information Center

    Veeck, Ann; Hoger, Beth

    2014-01-01

    Knowledge of how to effectively monitor social media is an increasingly valued marketing research skill. This study tests an approach for adding social media content to an undergraduate marketing research class team project. The revised project maintains the expected objectives and parameters of a traditional research project, while integrating…

  8. Big Data analytics and cognitive computing - future opportunities for astronomical research

    NASA Astrophysics Data System (ADS)

    Garrett, M. A.

    2014-10-01

    The days of the lone astronomer with his optical telescope and photographic plates are long gone: Astronomy in 2025 will not only be multi-wavelength, but multi-messenger, and dominated by huge data sets and matching data rates. Catalogues listing detailed properties of billions of objects will in themselves require a new industrial-scale approach to scientific discovery, requiring the latest techniques of advanced data analytics and an early engagement with the first generation of cognitive computing systems. Astronomers have the opportunity to be early adopters of these new technologies and methodologies - the impact can be profound and highly beneficial to effecting rapid progress in the field. Areas such as SETI research might favourably benefit from cognitive intelligence that does not rely on human bias and preconceptions.

  9. Automating the Analytical Laboratories Section, Lewis Research Center, National Aeronautics and Space Administration: A feasibility study

    NASA Technical Reports Server (NTRS)

    Boyle, W. G.; Barton, G. W.

    1979-01-01

    The feasibility of computerized automation of the Analytical Laboratories Section at NASA's Lewis Research Center was considered. Since that laboratory's duties are not routine, the automation goals were set with that in mind. Four instruments were selected as the most likely automation candidates: an atomic absorption spectrophotometer, an emission spectrometer, an X-ray fluorescence spectrometer, and an X-ray diffraction unit. Two options for computer automation were described: a time-shared central computer and a system with microcomputers for each instrument connected to a central computer. A third option, presented for future planning, expands the microcomputer version. Costs and benefits for each option were considered. It was concluded that the microcomputer version best fits the goals and duties of the laboratory and that such an automated system is needed to meet the laboratory's future requirements.

  10. Research Tool Patents--Rumours of their Death are Greatly Exaggerated

    ERIC Educational Resources Information Center

    Carroll, Peter G.; Roberts, John S.

    2006-01-01

    Using a patented drug during clinical trials is not infringement [35 U.S.C. 271(e)(1)]. Merck v Integra enlarged this "safe harbour" to accommodate preclinical use of drugs and patented "research tools" if "reasonably related" to FDA approval. The decision allowed lower courts, should they wish, to find any use of a research tool, except for…

  11. EXAMPLES OF THE ROLE OF ANALYTICAL CHEMISTRY IN ENVIRONMENTAL RISK MANAGEMENT RESEARCH

    EPA Science Inventory

    Analytical chemistry is an important tier of environmental protection and has been traditionally linked to compliance and/or exposure monitoring activities for environmental contaminants. The adoption of the risk management paradigm has led to special challenges for analytical ch...

  12. Practical library research: a tool for effective library management.

    PubMed Central

    Schneider, E; Mankin, C J; Bastille, J D

    1995-01-01

    Librarians are being urged to conduct research as one of their professional responsibilities. Many librarians, however, avoid research, because they believe it is beyond their capabilities or resources. This paper discusses the importance of conducting applied research, that is, research directed toward solving practical problems. The paper describes how one library conducted practical research projects, including use studies and surveys, over an eighteen-year period. These projects produced objective data that were used by the library to make management decisions that benefited both the library and its parent institution. This paper encourages other librarians to conduct practical research projects and to share the results with their colleagues through publication in the professional literature. PMID:7703934

  13. Use of an enterprise wiki as a research collaboration tool.

    PubMed

    Desai, Bimal R; O'Hara, Ryan T; White, Peter S

    2007-01-01

    Biomedical research projects are highly collaborative endeavors with unique information management and communication needs. We describe the pilot use of an enterprise wiki solution to facilitate group communication, secure file sharing, and collaborative writing within a pediatric hospital and research center. We discuss the choice of software, examples of use, and initial user feedback. We conclude that a wiki is a low-cost and high-yield approach to enhance research collaboration. PMID:18694032

  14. SMART II: the spot market agent research tool version 2.0.

    SciTech Connect

    North, M. J. N.

    2000-12-14

    Argonne National Laboratory (ANL) has worked closely with Western Area Power Administration (Western) over many years to develop a variety of electric power marketing and transmission system models that are being used for ongoing system planning and operation as well as analytic studies. Western markets and delivers reliable, cost-based electric power from 56 power plants to millions of consumers in 15 states. The Spot Market Agent Research Tool Version 2.0 (SMART II) is an investigative system that partially implements some important components of several existing ANL linear programming models, including some used by Western. SMART II does not implement a complete model of the Western utility system, but it does include several salient features of this network for exploratory purposes. SMART II uses a Swarm agent-based framework. SMART II agents model bulk electric power transaction dynamics with recognition for marginal costs as well as transmission and generation constraints. SMART II uses a sparse graph of nodes and links to model the electric power spot market. The nodes represent power generators and consumers with distinct marginal decision curves and varying investment capital as well as individual learning parameters. The links represent transmission lines with individual capacities taken from a range of central distribution, outlying distribution and feeder line types. The application of SMART II to electric power systems studies has produced useful results different from those often found using more traditional techniques. Use of the advanced features offered by the Swarm modeling environment simplified the creation of the SMART II model.
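
    As a loose, single-transaction reduction of the market structure described above (generator and consumer nodes with marginal curves, capacity-limited links), the following sketch uses invented numbers; the real SMART II model is Swarm-based, networked, and adaptive.

    ```python
    # Toy spot-market clearing between one generator node and one consumer
    # node across a capacity-limited transmission link.
    from dataclasses import dataclass

    @dataclass
    class Generator:
        marginal_cost: float   # $/MWh of the next unit produced
        capacity: float        # MWh available

    @dataclass
    class Consumer:
        marginal_value: float  # $/MWh the next unit is worth
        demand: float          # MWh wanted

    def trade(gen: Generator, con: Consumer, line_capacity: float) -> float:
        """Trade while power is worth more than it costs, limited by
        generation capacity, demand, and the transmission line."""
        if con.marginal_value <= gen.marginal_cost:
            return 0.0
        return min(gen.capacity, con.demand, line_capacity)

    amount = trade(Generator(20.0, 500.0), Consumer(35.0, 300.0), 250.0)
    print(f"energy traded: {amount} MWh")  # constrained by the line: 250.0
    ```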

  15. Empirical-Analytical Methodological Research in Environmental Education: Response to a Negative Trend in Methodological and Ideological Discussions

    ERIC Educational Resources Information Center

    Connell, Sharon

    2006-01-01

    The purpose of this paper is to contribute to methodological discourse about research approaches to environmental education. More specifically, the paper explores the current status of the "empirical-analytical methodology" and its "positivist" (traditional- and post-positivist) ideologies, in environmental education research through the critical…

  16. Recent and Potential Application of Engineering Tools to Educational Research.

    ERIC Educational Resources Information Center

    Taft, Martin I.

    This paper presents a summary of some recent engineering research in education and identifies some research areas with high payoff potential. The underlying assumption is that a school is a system with a set of subsystems which is potentially susceptible to analysis, design, and eventually some sort of optimization. This assumption leads to the…

  17. Teacher Research as a Practical Tool for Learning to Teach

    ERIC Educational Resources Information Center

    Lysaker, Judith; Thompson, Becky

    2013-01-01

    Teacher research has a long, rich history. However, teacher research is primarily limited to practicing teachers and those pursuing graduate education. It is only beginning to be used as means of understanding the instructional needs of English learners. In this article, a preservice teacher and her university instructor describe the role of…

  18. Somatic Sensitivity and Reflexivity as Validity Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Green, Jill

    2015-01-01

    Validity is a key concept in qualitative educational research. Yet, it is often not addressed in methodological writing about dance. This essay explores validity in a postmodern world of diverse approaches to scholarship, by looking at the changing face of validity in educational qualitative research and at how new understandings of the concept…

  19. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    41 CFR 102-80.120 (Public Contracts and Property Management; Federal Management Regulation; Real Property; Part 102-80, Safety and Environmental Management; Accident and Fire Prevention), 2013 edition: What analytical and empirical tools should be used to support the life safety equivalency evaluation?

  20. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    41 CFR 102-80.120 (Public Contracts and Property Management; Federal Management Regulation; Real Property; Part 102-80, Safety and Environmental Management; Accident and Fire Prevention), 2014 edition: What analytical and empirical tools should be used to support the life safety equivalency evaluation?

  1. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    41 CFR 102-80.120 (Public Contracts and Property Management; Federal Management Regulation; Real Property; Part 102-80, Safety and Environmental Management; Accident and Fire Prevention), 2012 edition: What analytical and empirical tools should be used to support the life safety equivalency evaluation?

  2. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder.

  3. ANALYTICAL TOOL INTERFACE FOR LANDSCAPE ASSESSMENTS (ATIILA): AN ARCVIEW EXTENSION FOR THE ANALYSIS OF LANDSCAPE PATTERNS, COMPOSITION, AND STRUCTURE

    EPA Science Inventory

    Environmental management practices are trending away from simple, local- scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to impleme...

  4. International research through networking: an old idea with new tools.

    PubMed

    Henson, J B; Rodewald, E

    1995-03-01

    The growth and refinement of electronic media capabilities, Internet, other electronic highways, fiber optics, microwaves, and satellites will have major impact on researchers and scholars, facilitating the timely sharing of information. The balance of time saved and money available may be the crucial issues in the rapidity of development. The dissemination of research results to a large audience through electronic journals, bulletin boards and data bases will become a dominant force in the formal publication of such results, with instant feedback from colleagues. The productivity of scientists and the quality of their research will be higher through better communications. Networking, however, is more than communications. It is shared interests and interaction, building on information received and provided and creating a relationship and a knowledge base to enhance international research.

  5. Applying Web-Based Tools for Research, Engineering, and Operations

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    2011-01-01

    Personnel in the NASA Glenn Research Center Network and Architectures branch have performed a variety of research related to space-based sensor webs, network centric operations, security and delay tolerant networking (DTN). Quality documentation and communications, real-time monitoring and information dissemination are critical in order to perform quality research while maintaining low cost and utilizing multiple remote systems. This has been accomplished using a variety of Internet technologies often operating simultaneously. This paper describes important features of various technologies and provides a number of real-world examples of how combining Internet technologies can enable a virtual team to act efficiently as one unit to perform advanced research in operational systems. Finally, real and potential abuses of power and manipulation of information and information access are addressed.

  6. On the Use of Factor Analysis as a Research Tool.

    ERIC Educational Resources Information Center

    Benson, Jeri; Nasser, Fadia

    1998-01-01

    Discusses the conceptual/theoretical design, statistical, and reporting issues in choosing factor analysis for research. Provides questions to consider when planning, analyzing, or reporting an exploratory factor analysis study. (SK)

  7. Education of research ethics for clinical investigators with Moodle tool

    PubMed Central

    2013-01-01

    Background In clinical research, scientific and legal as well as ethical aspects are important. It is well known that clinical investigators at university hospitals have to undertake their PhD studies alongside their daily work, and reconciling work and study can be challenging. The aim of this project was to create a web-based course in clinical research bioethics (5 credits) and to examine whether the method is suitable for teaching bioethics. The course comprised six modules: an initial examination (to assess knowledge in bioethics), information on research legislation, obtaining permissions from authorities, writing an essay on research ethics, preparing one's own study protocol, and a final exam. All assignments were designed to support students in reflecting on their learning through their own research. Methods 57 PhD students (medical, nursing and dental sciences) enrolled and 46 completed the course. Course evaluation was done using a questionnaire. The response rate was 78%. Data were analyzed using quantitative methods and qualitative content analysis. Results The course was viewed as useful and technically easy to perform. Students were pleased with the guidance offered. Personal feedback from teachers about students' own performance was seen as advantageous and helped them to appreciate how these aspects could be applied to their own studies. The course was also considered valuable for future research projects. Conclusions Ethical issues and the legislation of clinical research can be understood more easily when students can reflect on the principles in relation to their own research project. A web-based teaching environment is a feasible learning method for clinical investigators. PMID:24330709

  8. [Eating, nourishment and nutrition: instrumental analytic categories in the scientific research field].

    PubMed

    da Veiga Soares Carvalho, Maria Cláudia; Luz, Madel Therezinha; Prado, Shirley Donizete

    2011-01-01

    Eating, nourishment, and nutrition circulate in our culture as synonyms and thus do not account for the changes that occur in nourishment, which, intended or unintended, follow a hybridization pattern that represents a change of rules and food preferences. This paper takes these common-sense conceptions as analytic categories for analyzing and interpreting research in the humanities and health sciences from a theoretical perspective, through conceptualization. Eating is associated with a natural (biological) function, a concept in which nature is opposed to culture, whereas nourishment takes on cultural (symbolic) meanings, expressing the division of labor and of wealth, a historical and cultural creation through which one can study a society. Nutrition is attributed a sense of rational action, derived from the constitution of this science in modernity and inserted in a historical process of scientific rationalization of eating and nourishing. We believe that through the practice of conceptualization in interdisciplinary research, which involves a shared space of knowledge, we can be less constrained by a unified theoretical model of learning and freer to think about life issues.

  9. The Stuttering Treatment Research Evaluation and Assessment Tool (STREAT): Evaluating Treatment Research as Part of Evidence-Based Practice

    ERIC Educational Resources Information Center

    Davidow, Jason H.; Bothe, Anne K.; Bramlett, Robin E.

    2006-01-01

    Purpose: This article presents, and explains the issues behind, the Stuttering Treatment Research Evaluation and Assessment Tool (STREAT), an instrument created to assist clinicians, researchers, students, and other readers in the process of critically appraising reports of stuttering treatment research. Method: The STREAT was developed by…

  10. The airborne infrared scanner as a geophysical research tool

    USGS Publications Warehouse

    Friedman, Jules D.

    1970-01-01

    The infrared scanner is proving to be an effective anomaly-mapping tool, albeit one which depicts surface emission directly and heat mass transfer from depths only indirectly and at a threshold level 50 to 100 times the normal conductive heat flow of the earth. Moreover, successive terrain observations are affected by time-dependent variables such as the diurnal and seasonal warming and cooling cycle of a point on the earth's surface. In planning precise airborne surveys of radiant flux from the earth's surface, account must be taken of background noise created by variations in micrometeorological factors and emissivity of surface materials, as well as the diurnal temperature cycle. The effect of the diurnal cycle may be minimized by planning predawn aerial surveys. In fact, the diurnal change is very small for most water bodies, and the emissivity factor for water is ε ≈ 1, so minimal background noise is characteristic of scanner records of calm water surfaces.

  11. Intellectual Property: a powerful tool to develop biotech research

    PubMed Central

    Giugni, Diego; Giugni, Valter

    2010-01-01

    Summary Today biotechnology is perhaps the most important technology field because of its strong health and food implications. However, due to the nature of the technology, huge investments are needed to sustain experimentation costs, and investors consequently aim to safeguard their investments as much as possible. Intellectual Property, and in particular patents, has been demonstrated to constitute a powerful tool to help them. Moreover, patents represent an extremely important means of disclosing biotechnology inventions. Patentable biotechnology inventions involve products such as nucleotide and amino acid sequences and microorganisms, processes or methods for modifying said products, uses for the manufacture of medicaments, etc. There are several ways to protect inventions, but all follow the three main patentability requirements: novelty, inventive step, and industrial application. PMID:21255349

  12. A Web Tool for Research in Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Prikhod'ko, Nikolay V.; Abramovsky, Viktor A.; Abramovskaya, Natalia V.; Demichev, Andrey P.; Kryukov, Alexandr P.; Polyakov, Stanislav P.

    2016-02-01

    This paper presents a project to develop a web platform, WebNLO, for computer modeling of nonlinear optics phenomena. We discuss the general scheme of the platform and a model for the interaction between the platform modules. The platform is built as a set of interacting RESTful web services (a SaaS approach). Users can interact with the platform through a web browser or a command-line interface. Such a resource has no analogue in the field of nonlinear optics and is being created for the first time; it will give researchers access to high-performance computing resources and thereby significantly reduce the cost of the research and development process.
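
    The abstract does not give the platform's actual API, so the following Flask sketch is only a loose illustration of the RESTful, job-oriented service pattern it describes; the endpoint names and fields are invented.

    ```python
    # Minimal REST-style job service: submit a simulation job, query its status.
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    jobs = {}  # in-memory store; a real service would persist and dispatch jobs

    @app.post("/jobs")
    def submit_job():
        params = request.get_json(force=True)   # e.g. pulse/medium parameters
        job_id = len(jobs) + 1
        jobs[job_id] = {"params": params, "status": "queued"}
        return jsonify(id=job_id, status="queued"), 201

    @app.get("/jobs/<int:job_id>")
    def job_status(job_id):
        job = jobs.get(job_id)
        return (jsonify(job), 200) if job else (jsonify(error="not found"), 404)

    if __name__ == "__main__":
        app.run(port=8080)
    ```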

  13. ``Tools for Astrometry": A Windows-based Research Tool for Asteroid Discovery and Measurement

    NASA Astrophysics Data System (ADS)

    Snyder, G. A.; Marschall, L. A.; Good, R. F.; Hayden, M. B.; Cooper, P. R.

    1998-12-01

    We have developed a Windows-based interactive digital astrometry package with a simple, ergonomic interface, designed for the discovery, measurement, and recording of asteroid positions by individual observers. The software, "Tools For Astrometry", will handle FITS and SBIG format images up to 2048 x 2048 (or larger, depending on RAM), and provides features for blinking images or subframes of images, and measurement of positions and magnitudes against both the HST Guide Star Catalog and the USNO SA-1 catalog. In addition, the program can calculate ephemerides from element tables, including the Lowell Asteroid Database available online, can generate charts of star-fields showing the motion of asteroids from the ephemeris superimposed against the background star field, can project motions of measured asteroids ahead several days using linear interpolation for purposes of reacquisition, and can calculate projected baselines for asteroid parallax measurements. Images, charts, and tables of ephemerides can be printed as well as displayed, and reports can be generated in the standard format of the IAU Minor Planet Center. The software is designed ergonomically, and one can go from raw images to completed astrometric report in a matter of minutes. The software is an extension of software developed for introductory astronomy laboratories by Project CLEA, which is supported by grants from Gettysburg College and the National Science Foundation.
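
    The reacquisition feature mentioned above, projecting an asteroid's motion ahead by linear interpolation, reduces to a two-point extrapolation of (RA, Dec) over time. A sketch with invented coordinates and dates:

    ```python
    # Linearly extrapolate an asteroid's position to a later Julian date from
    # two measured positions (degrees); ignores curvature of the real motion.
    def project(t1, ra1, dec1, t2, ra2, dec2, t):
        f = (t - t1) / (t2 - t1)
        return ra1 + f * (ra2 - ra1), dec1 + f * (dec2 - dec1)

    # Two measurements one day apart, projected three days past the second.
    ra, dec = project(2451000.5, 150.000, 10.000,
                      2451001.5, 150.050, 10.020,
                      2451004.5)
    print(f"predicted RA = {ra:.3f} deg, Dec = {dec:.3f} deg")  # 150.200, 10.080
    ```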

  14. Administrative Data Linkage as a Tool for Child Maltreatment Research

    ERIC Educational Resources Information Center

    Brownell, Marni D.; Jutte, Douglas P.

    2013-01-01

    Linking administrative data records for the same individuals across services and over time offers a powerful, population-wide resource for child maltreatment research that can be used to identify risk and protective factors and to examine outcomes. Multistage de-identification processes have been developed to protect privacy and maintain…

  15. Ready Reference Tools: EBSCO Topic Search and SIRS Researcher.

    ERIC Educational Resources Information Center

    Goins, Sharon; Dayment, Lu

    1998-01-01

    Discussion of ready reference and current events collections in high school libraries focuses on a comparison of two CD-ROM services, EBSCO Topic Search and the SIRS Researcher. Considers licensing; access; search strategies; viewing articles; currency; printing; added value features; and advantages of CD-ROMs. (LRW)

  16. The Portable Usability Testing Lab: A Flexible Research Tool.

    ERIC Educational Resources Information Center

    Hale, Michael E.; And Others

    A group of faculty at the University of Georgia obtained funding for a research and development facility called the Learning and Performance Support Laboratory (LPSL). One of the LPSL's primary needs was obtaining a portable usability lab for software testing, so the facility obtained the "Luggage Lab 2000." The lab is transportable to any site…

  17. New research and tools lead to improved earthquake alerting protocols

    USGS Publications Warehouse

    Wald, David J.

    2009-01-01

    What's the best way to get alerted about the occurrence and potential impact of an earthquake? The answer to that question has changed dramatically of late, in part due to improvements in earthquake science and in part due to the implementation of new research in the delivery of earthquake information.

  18. TPACK: An Emerging Research and Development Tool for Teacher Educators

    ERIC Educational Resources Information Center

    Baran, Evrim; Chuang, Hsueh-Hua; Thompson, Ann

    2011-01-01

    TPACK (technological pedagogical content knowledge) has emerged as a clear and useful construct for researchers working to understand technology integration in learning and teaching. Whereas first generation TPACK work focused upon explaining and interpreting the construct, TPACK has now entered a second generation where the focus is upon using…

  19. Online Tools Allow Distant Students to Collaborate on Research Projects

    ERIC Educational Resources Information Center

    T.H.E. Journal, 2005

    2005-01-01

    The Wesleyan Academy and Moravian School in St. Thomas, Virgin Islands, recently joined forces with Evergreen Elementary in Fort Lewis, Wash., to collaborate on a research project using My eCoach Online (http://myecoach.com) as the primary medium to share information, post ideas and findings, and develop inquiry projects on 10 topics about water.…

  20. The National ALS Registry: A Recruitment Tool for Research

    PubMed Central

    Malek, Angela M.; Stickler, David E.; Antao, Vinicius C.; Horton, D. Kevin

    2014-01-01

    Introduction Subject recruitment is critical for understanding fatal diseases like ALS; however, linking patients with researchers can be challenging. The US population-based National ALS Registry allows recruitment of persons with ALS (PALS) for research opportunities. Methods The Registry's Research Notification Mechanism was used to recruit PALS aged ≥21 years; participants completed a web-based epidemiologic survey. PALS (n=2,232) were sent an email describing the study, and 268 surveys were completed. Results The mean age (± SD) of eligible participants was 57.7 ± 9.3 years for men and 61.5 ± 8.9 for women. Most were men (63%) and Caucasian (92%). Of 256 potentially eligible participants, 37.5% (n=96) returned an authorization to disclose protected health information. ALS was confirmed for 94% (83/88) from physician responses. Discussion This analysis demonstrates the National ALS Registry's usefulness in recruiting PALS for research. This recruitment source can potentially foster the discovery of better treatment options and therapies, and of prevention strategies. PMID:25111654

  1. Reimagining Science Education and Pedagogical Tools: Blending Research with Teaching

    ERIC Educational Resources Information Center

    McLaughlin, Jacqueline S.

    2010-01-01

    The future of higher education in the sciences will be marked by programs that link skilled educators and research scientists from around the world with teachers for professional development and with students for high-impact learning--either virtually or physically in the field. These programs will use technology where possible to build new and…

  2. ACNP and NILDE: Essential Tools for Access to Scientific Research

    NASA Astrophysics Data System (ADS)

    Brunetti, F.; Bonora, O.; Filippucci, G.

    2015-04-01

    This paper describes ACNP and NILDE, the two main Italian cooperative systems for access to scientific information. Used by the Italian Astronomical Libraries (IAL), they are two essential channels for accessing information resources that are otherwise unreachable. At the same time, they allow the IAL to share their very rich and unique holdings with other research and university libraries.

  3. Friending Adolescents on Social Networking Websites: A Feasible Research Tool

    PubMed Central

    Brockman, Libby N.; Christakis, Dimitri A.; Moreno, Megan A.

    2014-01-01

    Objective Social networking sites (SNSs) are increasingly used for research. This paper reports on two studies examining the feasibility of friending adolescents on SNSs for research purposes. Methods Study 1 took place on www.MySpace.com where public profiles belonging to 18-year-old adolescents received a friend request from an unknown physician. Study 2 took place on www.Facebook.com where college freshmen from two US universities, enrolled in an ongoing research study, received a friend request from a known researcher’s profile. Acceptance and retention rates of friend requests were calculated for both studies. Results Study 1: 127 participants received a friend request; participants were 18 years-old, 62.2% male and 51.8% Caucasian. 49.6% accepted the friend request. After 9 months, 76% maintained the online friendship, 12.7% defriended the study profile and 11% deactivated their profile. Study 2: 338 participants received a friend request; participants were 18 years-old, 56.5% female and 75.1% Caucasian. 99.7% accepted the friend request. Over 12 months, 3.3% defriended the study profile and 4.1% deactivated their profile. These actions were often temporary; the overall 12-month friendship retention rate was 96.1%. Conclusion Friending adolescents on SNSs is feasible and friending adolescents from a familiar profile may be more effective for maintaining online friendship with research participants over time. PMID:25485226

  4. Analytical Microscopy

    SciTech Connect

    Not Available

    2006-06-01

    In the Analytical Microscopy group, within the National Center for Photovoltaic's Measurements and Characterization Division, we combine two complementary areas of analytical microscopy--electron microscopy and proximal-probe techniques--and use a variety of state-of-the-art imaging and analytical tools. We also design and build custom instrumentation and develop novel techniques that provide unique capabilities for studying materials and devices. In our work, we collaborate with you to solve materials- and device-related R&D problems. This sheet summarizes the uses and features of four major tools: transmission electron microscopy, scanning electron microscopy, the dual-beam focused-ion-beam workstation, and scanning probe microscopy.

  5. Incorporating ethical principles into clinical research protocols: a tool for protocol writers and ethics committees.

    PubMed

    Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E

    2016-04-01

    A novel Protocol Ethics Tool Kit ('Ethics Tool Kit') has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. PMID:26811365

  8. Developing a Research Tool to Gauge Student Metacognition

    NASA Astrophysics Data System (ADS)

    McInerny, Alistair; Boudreaux, Andrew; Rishal, Sepideh; Clare, Kelci

    2012-10-01

    Metacognition refers to the family of thought processes and skills used to evaluate and manage learning. A research and curriculum development project underway at Western Washington University uses introductory physics labs as a context to promote students' abilities to learn and apply metacognitive skills. A required "narrative reflection" has been incorporated as a weekly end-of-lab assignment. The goal of the narrative reflection is to encourage and support student metacognition while generating written artifacts that can be used by researchers to study metacognition in action. We have developed a Reflective Thinking Rubric (RTR) to analyze scanned narrative reflections. The RTR codes student writing for Metacognitive Elements, identifiable steps or aspects of metacognitive thinking at a variety of levels of sophistication. We hope to use the RTR to monitor the effect of weekly reflection on metacognitive ability and to search for correlations between metacognitive ability and conceptual understanding.

  9. CAMS as a tool for human factors research in spaceflight

    NASA Astrophysics Data System (ADS)

    Sauer, Juergen

    2004-01-01

    The paper reviews a number of research studies that were carried out with a PC-based task environment called Cabin Air Management System (CAMS) simulating the operation of a spacecraft's life support system. As CAMS was a multiple task environment, it allowed the measurement of performance at different levels. Four task components of different priority were embedded in the task environment: diagnosis and repair of system faults, maintaining atmospheric parameters in a safe state, acknowledgement of system alarms (reaction time), and keeping a record of critical system resources (prospective memory). Furthermore, the task environment permitted the examination of different task management strategies and changes in crew member state (fatigue, anxiety, mental effort). A major goal of the research programme was to examine how crew members adapted to various forms of sub-optimal working conditions, such as isolation and confinement, sleep deprivation and noise. None of the studies provided evidence for decrements in primary task performance. However, the results showed a number of adaptive responses of crew members to adjust to the different sub-optimal working conditions. There was evidence for adjustments in information sampling strategies (usually reductions in sampling frequency) as a result of unfavourable working conditions. The results also showed selected decrements in secondary task performance. Prospective memory seemed to be somewhat more vulnerable to sub-optimal working conditions than performance on the reaction time task. Finally, suggestions are made for future research with the CAMS environment.

  10. Modelling as an indispensable research tool in the information society.

    NASA Astrophysics Data System (ADS)

    Bouma, Johan

    2016-04-01

    Science and society would be well advised to develop a different relationship as the information revolution penetrates all aspects of modern life. Rather than produce clear answers to clear questions in a top-down manner, land-use issues related to the UN Sustainable Development Goals (SDGs) present "wicked" problems involving different, strongly opinionated stakeholders with conflicting ideas and interests, and risk-averse politicians. The Dutch government has invited its citizens to develop a "science agenda", defining future research needs, implicitly suggesting that the research community is unable to do so. It is time, therefore, for a pro-active approach to more convincingly define our "societal license to research". For soil science this could imply a focus on the SDGs, considering soils as living, characteristically different, dynamic bodies in a landscape, to be mapped in ways that allow generation of suitable modelling data. Models allow a dynamic characterization of water and nutrient regimes and plant growth in soils for both actual and future conditions, reflecting e.g. effects of climate or land-use change or alternative management practices. Engaging modern stakeholders in a bottom-up manner implies continuous involvement and "joint learning" from project initiation to completion, where modelling results act as building blocks to explore alternative scenarios. Modern techniques allow very rapid calculations and innovative visualization. Everything is possible, but only modelling can articulate the economic, social and environmental consequences of each scenario, demonstrating in a pro-active manner the crucial and indispensable role of research. But choices are to be made by stakeholders and reluctant policy makers, and certainly not by scientists, who should carefully guard their independence. Only clear results in the end are convincing proof of the impact of science, requiring therefore continued involvement of scientists up to the very end of projects. To…

  11. Experimental and Analytical Research on Resonance Phenomena of Vibrating Head with MRE Regulating Element

    NASA Astrophysics Data System (ADS)

    Miedzińska, D.; Gieleta, R.; Osiński, J.

    2015-02-01

    A vibratory pile hammer (VPH) is a mechanical device used to drive steel piles as well as tube piles into soil to provide foundation support for buildings or other structures. In order to increase the stability and efficiency of VPH operation at over-resonance frequencies, a new VPH construction was developed at the Military University of Technology. The new VPH contains a system of counter-rotating eccentric weights, powered by hydraulic motors, and designed in such a way that horizontal vibrations cancel out while vertical vibrations are transmitted into the pile. This system is suspended from the static parts by adaptive variable-stiffness pillows based on a smart material, magnetorheological elastomer (MRE), whose rheological and mechanical properties can be reversibly and rapidly controlled by an external magnetic field. The work presented in the paper is a part of the modified VPH construction design process. It concerns the experimental research on the vibrations during the piling process and the analytical analysis of the acquired signal. The results will be applied in the VPH control system.

  12. [Research Progress in Analytical Technology for Heavy Metals in Atmospheric Particles].

    PubMed

    Wang, Yu-jie; Tu, Zhen-quan; Zhou, Li; Chi, Yong-jie; Luo, Qin

    2015-04-01

    Atmospheric particles have become the primary atmospheric pollutants, and the heavy metals they carry, owing to their non-degradability and hysteresis, pose a serious threat to human life and the natural environment and have become a hot research issue. The analytical methods for heavy metals in atmospheric particles are summarized in the present review, including atomic absorption spectrometry, inductively coupled plasma atomic emission spectrometry, inductively coupled plasma mass spectrometry, neutron activation analysis, fluorescence spectrometry, glow discharge atomic emission spectrometry, microwave plasma atomic emission spectrometry, and laser-induced breakdown spectroscopy. Some proposals are made for addressing the shortcomings of these technologies: continuum-source atomic absorption spectrometry for simultaneously measuring multiple elements, atomic emission spectrometry for direct determination of particulates, high-resolution laser ablation inductively coupled plasma mass spectrometry for determination of solid samples, low-scattering synchrotron fluorescence spectrometry for determination of atmospheric particulate matter, and k0 neutron activation analysis for determination of radioactive elements in the troposphere. Analytical techniques for heavy metals in atmospheric particulate matter are developing toward real-time, fast, low-detection-limit, direct, and simple operation, driven by the spatial and temporal variability of heavy metals in atmospheric particles, the demand for improved ambient air quality, and the rapid development of modern instrument science and technology. PMID:26197596

  14. Fuzzy Analytic Hierarchy Process-based Chinese Resident Best Fitness Behavior Method Research

    PubMed Central

    Wang, Dapeng; Zhang, Lan

    2015-01-01

    With the rapid development of the Chinese economy, science, and technology, people's pursuit of health has intensified and sports fitness activities among Chinese residents have grown quickly. Because fitness activities differ in popularity and in their effects on body energy consumption, this paper studies fitness behaviors and derives an exercise guide for Chinese residents, providing guidance for implementing the national fitness plan and placing resident fitness on a more scientific footing. Starting from the perspective of energy consumption, the study mainly adopts an empirical method, determining the energy consumption of residents' favorite fitness activities by observation and applying the fuzzy analytic hierarchy process to evaluate seven of them: bicycle riding, shadowboxing, swimming, rope skipping, jogging, running, and aerobics. By calculating and comparing the membership values of a fuzzy rating model, it identifies the fitness behaviors that are most beneficial to health, most effective, and most popular. Swimming emerges as the best exercise mode, with the highest membership value; the memberships of running, rope skipping, and shadowboxing are also relatively high. Residents should combine several of these activities according to their physical and living conditions to best achieve the goals of fitness exercise. PMID:26981163
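
    As a rough illustration of the fuzzy evaluation step described above, the Python sketch below assigns each of the seven activities a triangular fuzzy score, defuzzifies by centroid, and ranks the resulting membership-like weights. All of the numbers are placeholders; the paper's actual pairwise judgments and fuzzy rate model are not reproduced.

      # Hypothetical fuzzy-AHP-style ranking step (Python sketch).
      # The triangular fuzzy scores below are invented for illustration;
      # the paper's real comparison data are not given in the abstract.
      import numpy as np

      events = ["swimming", "running", "rope skipping", "shadowboxing",
                "jogging", "bicycle riding", "aerobics"]

      # One triangular fuzzy number (l, m, u) per event expressing how
      # strongly it is preferred overall (placeholder values).
      tfn = np.array([
          [6, 7, 8],   # swimming
          [5, 6, 7],   # running
          [5, 6, 7],   # rope skipping
          [4, 5, 6],   # shadowboxing
          [3, 4, 5],   # jogging
          [2, 3, 4],   # bicycle riding
          [2, 3, 4],   # aerobics
      ], dtype=float)

      # Centroid defuzzification, then normalisation to membership-like weights.
      crisp = tfn.mean(axis=1)
      weights = crisp / crisp.sum()

      for name, w in sorted(zip(events, weights), key=lambda p: -p[1]):
          print(f"{name:14s} {w:.3f}")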

  15. Extensions of the Johnson-Neyman Technique to Linear Models with Curvilinear Effects: Derivations and Analytical Tools

    ERIC Educational Resources Information Center

    Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.

    2013-01-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…
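
    For readers who want to see the computation behind the J-N technique in its simplest (linear, two-way interaction) form, the sketch below solves for the moderator values at which the simple slope's t statistic equals its critical value. All coefficient estimates, variances, and degrees of freedom are invented; the article itself extends the technique to curvilinear models, which this sketch does not cover.

      # Minimal Johnson-Neyman sketch for y = b0 + b1*x + b2*z + b3*x*z,
      # where the simple slope of x at moderator value z is b1 + b3*z.
      import numpy as np
      from scipy import stats

      b1, b3 = 0.40, -0.25                       # hypothetical estimates
      var_b1, var_b3, cov13 = 0.010, 0.004, -0.002
      df = 120                                   # residual degrees of freedom
      t2 = stats.t.ppf(0.975, df) ** 2

      # Solve (b1 + b3*z)^2 = t^2 * (var_b1 + 2*z*cov13 + z^2*var_b3) for z.
      a = b3 ** 2 - t2 * var_b3
      b = 2 * (b1 * b3 - t2 * cov13)
      c = b1 ** 2 - t2 * var_b1
      disc = b ** 2 - 4 * a * c
      if disc >= 0 and a != 0:
          roots = np.sort((-b + np.array([-1.0, 1.0]) * np.sqrt(disc)) / (2 * a))
          print("J-N boundaries on the moderator:", roots)
      else:
          print("The simple slope's significance does not change over z.")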

  16. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Safety, NFPA 101A) should be used to support the life safety equivalency evaluation. If fire modeling is... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and...

  17. Guiding Independence: Developing a Research Tool to Support Student Decision Making in Selecting Online Information Sources

    ERIC Educational Resources Information Center

    Baildon, Rindi; Baildon, Mark

    2008-01-01

    The development and use of a research tool to guide fourth-grade students' use of information sources during a research project is described in this article. Over a period of five weeks, 21 fourth-grade students in an international school in Singapore participated in a study investigating the extent to which the use of a "research resource guide"…

  18. Conceptualising the Use of Facebook in Ethnographic Research: As Tool, as Data and as Context

    ERIC Educational Resources Information Center

    Baker, Sally

    2013-01-01

    This article proposes a three-part conceptualisation of the use of Facebook in ethnographic research: as a tool, as data and as context. Longitudinal research with young adults at a time of significant change provides many challenges for the ethnographic researcher, such as maintaining channels of communication and high rates of participant…

  19. Searching for New Directions: Developing MA Action Research Project as a Tool for Teaching

    ERIC Educational Resources Information Center

    Lee, Young Ah; Wang, Ye

    2012-01-01

    Action research has been recognized as a useful professional development tool for teaching, but for inservice teachers, conducting action research can be challenging. Their learning about action research can be influenced by social situations--whether in an MA (Master of Arts) program or other professional development. The purpose of this…

  20. The "Metaphorical Collage" as a Research Tool in the Field of Education

    ERIC Educational Resources Information Center

    Russo-Zimet, Gila

    2016-01-01

    The aim of this paper is to propose a research tool in the field of education--the "metaphorical collage." This tool facilitates the understanding of concepts and processes in education through the analysis of metaphors in collage works that include pictorial images and verbal images. We believe the "metaphorical collage" to be…

  1. Positioning Mentoring as a Coach Development Tool: Recommendations for Future Practice and Research

    ERIC Educational Resources Information Center

    McQuade, Sarah; Davis, Louise; Nash, Christine

    2015-01-01

    Current thinking in coach education advocates mentoring as a development tool to connect theory and practice. However, little empirical evidence exists to evaluate the effectiveness of mentoring as a coach development tool. Business, education, and nursing precede the coaching industry in their mentoring practice, and research findings offered in…

  2. [Scientific information systems: tools for measures of biomedical research impact].

    PubMed

    Navarrete Cortés, José; Banqueri Ozáez, Jesús

    2008-12-01

    The present article analyzes and describes the use of scientific information systems as instruments to measure and monitor research activity and results in biomedicine. Based on the current situation concerning the use and implementation of these systems, we offer a detailed description of their actors and propose a functional architecture for this class of software. In addition, the instruments that these systems offer for measuring the impact of research results, which can support decision making, are described in depth. Finally, a selection of national and international scientific information systems is listed and reviewed. PMID:19631827

  3. Digital storytelling: an innovative tool for practice, education, and research.

    PubMed

    Lal, Shalini; Donnelly, Catherine; Shin, Jennifer

    2015-01-01

    Digital storytelling is a method of using storytelling, group work, and modern technology to facilitate the creation of 2-3 minute multi-media video clips to convey personal or community stories. Digital storytelling is being used within the health care field; however, there has been limited documentation of its application within occupational therapy. This paper introduces digital storytelling and proposes how it can be applied in occupational therapy clinical practice, education, and research. The ethical and methodological challenges in relation to using the method are also discussed. PMID:25338054

  4. NASA Global Hawk: A New Tool for Earth Science Research

    NASA Technical Reports Server (NTRS)

    Hall, Phill

    2009-01-01

    This slide presentation reviews the Global Hawk, an unmanned aerial vehicle (UAV) that NASA plans to use for Earth science research. The Global Hawk is the world's first fully autonomous high-altitude, long-endurance aircraft and is capable of conducting long-duration missions. Plans are being made for the use of the aircraft on missions in the Arctic, Pacific, and Western Atlantic Oceans. Slides show the Global Hawk Operations Center (GHOC), the Flight Control and Air Traffic Control Communications Architecture, and payload integration and accommodations on the Global Hawk. The first science campaign, planned for a study of the Pacific Ocean, is reviewed.

  5. Electromagnetic Levitation: A Useful Tool in Microgravity Research

    NASA Technical Reports Server (NTRS)

    Szekely, Julian; Schwartz, Elliot; Hyers, Robert

    1995-01-01

    Electromagnetic levitation is one area of the electromagnetic processing of materials that has uses for both fundamental research and practical applications. This technique was successfully used on the Space Shuttle Columbia during the Spacelab IML-2 mission in July 1994 as a platform for accurately measuring the surface tensions of liquid metals and alloys. In this article, we discuss the key transport phenomena associated with electromagnetic levitation, the fundamental relationships associated with thermophysical property measurement that can be made using this technique, reasons for working in microgravity, and some of the results obtained from the microgravity experiments.

  6. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Navidi, Fatemeh; Beard, Daniel A.; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  7. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  8. The NASA Human Research Wiki - An Online Collaboration Tool

    NASA Technical Reports Server (NTRS)

    Barr, Yael; Rasbury, Jack; Johnson, Jordan; Barstend, Kristina; Saile, Lynn; Watkins, Sharmi

    2012-01-01

    The Exploration Medical Capability (ExMC) element is one of six elements of the Human Research Program (HRP). ExMC is charged with decreasing the risk of the "inability to adequately recognize or treat an ill or injured crew member" during exploration-class missions. In preparation for such missions, ExMC has compiled a large evidence base, previously available only to persons within the NASA community, and has developed the "NASA Human Research Wiki" in an effort to make the ExMC information available to the general public and to increase collaboration within and outside of NASA. The ExMC evidence base comprises several types of data, including: (1) information on more than 80 medical conditions that could occur during space flight, derived from several sources and including data on incidence and potential outcomes, as captured in the Integrated Medical Model's (IMM) Clinical Finding Forms (CliFFs); and (2) approximately 25 gap reports that identify "gaps" in knowledge and/or technology that would need to be addressed in order to provide adequate medical support for these novel missions.

  9. CECILIA, a versatile research tool for cellular responses to gravity.

    PubMed

    Braucker, Richard; Machemer, Hans

    2002-01-01

    We describe a centrifuge designed and constructed according to current demands for a versatile instrument in cellular gravitational research, in particular on protists (ciliates, flagellates). The instrument (called CECILIA, centrifuge for ciliates) is suited for videomonitoring, videorecording, and quantitative evaluation of data from large numbers of swimming cells in a ground-based laboratory or in a drop tower/drop shaft under microgravity conditions. The horizontal rotating platform holds up to six 8 mm camcorders and six chambers containing the experimental cells. Under hypergravity conditions (up to 15 g), chambers can be rotated about two axes to orient the swimming space at right angles or parallel to the resulting gravity vector. Evaluations of cellular responses to central acceleration, in the presence of the gravitational 1 g, are used to extrapolate cellular behaviour under hypogravity conditions. CECILIA is operated and monitored by computer using custom-made software. The times and slopes of rising and decreasing acceleration, and the value and quality of steady acceleration, are supervised online. CECILIA can serve as an on-ground research instrument for precursor investigations of the behaviour of ciliates and flagellates under microgravity conditions, such as long-term experiments on the International Space Station.

  10. Nucleic Acid Aptamers: Research Tools in Disease Diagnostics and Therapeutics

    PubMed Central

    Yadava, Pramod K.

    2014-01-01

    Aptamers are short sequences of nucleic acid (DNA or RNA) or peptide molecules which adopt a conformation and bind cognate ligands with high affinity and specificity in a manner akin to antibody-antigen interactions. It has been globally acknowledged that aptamers promise a plethora of diagnostic and therapeutic applications. Although use of nucleic acid aptamers as targeted therapeutics or mediators of targeted drug delivery is a relatively new avenue of research, one aptamer-based drug “Macugen” is FDA approved and a series of aptamer-based drugs are in clinical pipelines. The present review discusses the aspects of design, unique properties, applications, and development of different aptamers to aid in cancer diagnosis, prevention, and/or treatment under defined conditions. PMID:25050359

  11. Electrostatic Levitation: A Tool to Support Materials Research in Microgravity

    NASA Technical Reports Server (NTRS)

    Rogers, Jan; SanSoucie, Mike

    2012-01-01

    Containerless processing represents an important topic for materials research in microgravity. Levitated specimens are free from contact with a container, which permits studies of deeply undercooled melts, and high-temperature, highly reactive materials. Containerless processing provides data for studies of thermophysical properties, phase equilibria, metastable state formation, microstructure formation, undercooling, and nucleation. The European Space Agency (ESA) and the German Aerospace Center (DLR) jointly developed an electromagnetic levitator facility (MSL-EML) for containerless materials processing in space. The electrostatic levitator (ESL) facility at the Marshall Space Flight Center provides support for the development of containerless processing studies for the ISS. Apparatus and techniques have been developed to use the ESL to provide data for phase diagram determination, creep resistance, emissivity, specific heat, density/thermal expansion, viscosity, surface tension and triggered nucleation of melts. The capabilities and results from selected ESL-based characterization studies performed at NASA's Marshall Space Flight Center will be presented.

  12. Cell stretching devices as research tools: engineering and biological considerations.

    PubMed

    Kamble, Harshad; Barton, Matthew J; Jun, Myeongjun; Park, Sungsu; Nguyen, Nam-Trung

    2016-08-16

    Cells within the human body are subjected to continuous, cyclic mechanical strain caused by various organ functions, movement, and growth. Cells are well known to have the ability to sense and respond to mechanical stimuli, a process referred to as mechanotransduction. A better understanding of mechanotransduction is of great interest to clinicians and scientists alike to improve clinical diagnosis and understanding of medical pathology. However, the complexity of in vivo biological systems creates a need for better in vitro technologies that can closely mimic the cells' microenvironment using induced mechanical strain. This technology gap motivates the development of cell stretching devices for better understanding of the cell response to mechanical stimuli. This review focuses on the engineering and biological considerations in the development of such cell stretching devices. The paper discusses different stretching concepts, major design considerations, and biological aspects of cell stretching, and provides a perspective on future development in this research area. PMID:27440436

  13. The Evaluation of the ESEA Title VII Spanish/English Bilingual Education Program: Research Design and Analytic Approaches.

    ERIC Educational Resources Information Center

    Coles, Gary J.

    The research design and analytic approaches used in the national evaluation of the Elementary Secondary Education Act (ESEA) Title VII Spanish/English Bilingual Education Program are described. This study evaluated the entire program, as opposed to individual projects, and had four goals: to (1) determine the cognitive and affective impact of…

  14. Translating the Theoretical into Practical: A Logical Framework of Functional Analytic Psychotherapy Interactions for Research, Training, and Clinical Purposes

    ERIC Educational Resources Information Center

    Weeks, Cristal E.; Kanter, Jonathan W.; Bonow, Jordan T.; Landes, Sara J.; Busch, Andrew M.

    2012-01-01

    Functional analytic psychotherapy (FAP) provides a behavioral analysis of the psychotherapy relationship that directly applies basic research findings to outpatient psychotherapy settings. Specifically, FAP suggests that a therapist's in vivo (i.e., in-session) contingent responding to targeted client behaviors, particularly positive reinforcement…

  15. Peak-bridges due to in-column analyte transformations as a new tool for establishing molecular connectivities by comprehensive two-dimensional gas chromatography-mass spectrometry.

    PubMed

    Filippi, Jean-Jacques; Cocolo, Nicolas; Meierhenrich, Uwe J

    2015-02-27

    Comprehensive two-dimensional gas chromatography-mass spectrometry (GC×GC-MS) has been shown to provide unprecedented chromatographic resolution of volatile analytes across various families of organic compounds. However, peak identification based only on retention time, two-dimensional mapping, and mass spectrometric fragmentation is not yet a straightforward task. The ability to establish molecular links between constituents is of crucial importance for understanding the overall chemistry of any sample, especially in natural extracts, where biogenetically related isomeric structures are often abundant. We present a new way of using GC×GC that allows such molecular connectivities to be sought. Analytical investigation of essential oil constituents by GC×GC-MS allowed the thermally induced transformations of various sesquiterpenic derivatives to be observed in real time. These transformations generated a series of well-defined two-dimensional peak bridges within the 2D chromatograms connecting parent and daughter molecules, permitting a clear scheme of structural relationships between the constituents to be built. GC×GC-MS thus appears as a tool for investigating chromatographic phenomena and analyte transformations that could not be understood with conventional GC-MS alone. PMID:25622519

  16. miRQuest: integration of tools on a Web server for microRNA research.

    PubMed

    Aguiar, R R; Ambrosio, L A; Sepúlveda-Hermosilla, G; Maracaja-Coutinho, V; Paschoal, A R

    2016-01-01

    This report describes miRQuest, a novel middleware, available on a Web server, that allows end users to carry out miRNA research in a user-friendly way. Many prediction tools for microRNA (miRNA) identification exist, using different programming languages and methods, and it is difficult to master each tool and apply it to the diverse datasets and organisms available for miRNA analysis. miRQuest can easily be used by biologists and researchers with limited bioinformatics experience. We built it using a middleware architecture on a Web platform for miRNA research that performs two main functions: i) integration of different miRNA prediction tools for miRNA identification in a user-friendly environment; and ii) comparison of these prediction tools. In both cases, the user provides sequences (in FASTA format) as the input set for the analyses and comparisons. All tools were selected on the basis of a survey of the literature on available miRNA prediction tools. Three use cases for the tools are also described, one of them a miRNA identification analysis in 30 different species. miRQuest appears to be a novel and useful tool; it is freely available for both benchmarking and miRNA identification at http://mirquest.integrativebioinformatics.me/. PMID:27050998

  17. Advanced imaging microscope tools applied to microgravity research investigations

    NASA Astrophysics Data System (ADS)

    Peterson, L.; Samson, J.; Conrad, D.; Clark, K.

    1998-01-01

    The inability to observe and interact with experiments on orbit has been an impediment for both basic research and commercial ventures using the shuttle. In order to open the frontiers of space, the Center for Microgravity Automation Technology has developed a unique and innovative system for conducting experiments at a distance, the "Remote Scientist." The Remote Scientist extends laboratory automation capability to the microgravity environment. While the Remote Scientist conceptually encompasses a broad spectrum of elements and functionalities, the development approach taken is to establish a baseline capability that is both flexible and versatile, and then to incrementally augment the baseline with additional functions over time. Since last year, the application of the Remote Scientist has changed from protein crystal growth to tissue culture, specifically the development of skeletal muscle under varying levels of tension. This system includes a series of bioreactor chambers that allow for three-dimensional growth of muscle tissue on a membrane suspended between the two ends of a programmable force transducer that can provide automated or investigator-initiated tension on the developing tissue. A microscope objective mounted on a translation carriage allows for high-resolution microscopy along a large area of the tissue. These images will be mosaiced on orbit to detect features and structures that span multiple images. The use of fluorescence and pseudo-confocal microscopy will maximize the observational capabilities of this system. A series of ground-based experiments have been performed to validate the bioreactor, the force transducer, the translation carriage, and the image acquisition capabilities of the Remote Scientist. The bioreactor is capable of sustaining three-dimensional tissue culture growth over time. The force transducer can be programmed to provide static tension on cells or to simulate either slow or fast growth of underlying tissues in

  18. Single-cell MALDI-MS as an analytical tool for studying intrapopulation metabolic heterogeneity of unicellular organisms.

    PubMed

    Amantonico, Andrea; Urban, Pawel L; Fagerer, Stephan R; Balabin, Roman M; Zenobi, Renato

    2010-09-01

    Heterogeneity is a characteristic feature of all populations of living organisms. Here we make an attempt to validate a single-cell mass spectrometric method for detection of changes in metabolite levels occurring in populations of unicellular organisms. Selected metabolites involved in central metabolism (ADP, ATP, GTP, and UDP-Glucose) could readily be detected in single cells of Closterium acerosum by means of negative-mode matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS). The analytical capabilities of this approach were characterized using standard compounds. The method was then used to study populations of individual cells with different levels of the chosen metabolites. With principal component analysis and support vector machine algorithms, it was possible to achieve a clear separation of individual C. acerosum cells in different metabolic states. This study demonstrates the suitability of mass spectrometric analysis of metabolites in single cells to measure cell-population heterogeneity.
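
    The paper's separation step, principal component analysis followed by a support vector machine, can be illustrated with scikit-learn. In the sketch below, the single-cell peak intensities for ADP, ATP, GTP, and UDP-glucose are simulated for two hypothetical metabolic states; only the analysis pattern, not the data, follows the paper.

      # Illustrative re-creation of the classification step: PCA followed
      # by an SVM separating cells by metabolic state (simulated data).
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(1)
      # Mock peak intensities for ADP, ATP, GTP, UDP-glucose in two states.
      state_a = rng.normal([1.0, 3.0, 0.8, 1.2], 0.3, (50, 4))
      state_b = rng.normal([1.8, 1.5, 0.9, 0.7], 0.3, (50, 4))
      X = np.vstack([state_a, state_b])
      y = np.array([0] * 50 + [1] * 50)

      clf = make_pipeline(PCA(n_components=2), SVC(kernel="linear"))
      print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())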

  19. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Tawakkol, Shereen M.; Farouk, M.; Elaziz, Omar Abd; Hemdan, A.; Shehata, Mostafa A.

    2014-12-01

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in a pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to the ratio subtraction method (RSM) for determination of both drugs in a commercial dosage form. The second and third methods are multivariate calibrations: Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines, and the standard curves were found to be linear in the ranges of 10-60 and 2-30 for MOX and HCTZ in the EXRSM method, respectively, with well-accepted mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.

  20. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide.

    PubMed

    Tawakkol, Shereen M; Farouk, M; Elaziz, Omar Abd; Hemdan, A; Shehata, Mostafa A

    2014-12-10

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in a pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to the ratio subtraction method (RSM) for determination of both drugs in a commercial dosage form. The second and third methods are multivariate calibrations: Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines, and the standard curves were found to be linear in the ranges of 10-60 and 2-30 for MOX and HCTZ in the EXRSM method, respectively, with well-accepted mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.
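
    To make the multivariate side of this comparison concrete, the Python sketch below fits both calibrations named in the abstract, PCR and PLS, to simulated two-analyte mixture spectra. The band shapes, noise level, and component counts are invented for illustration and do not reproduce the study's data.

      # PCR vs PLS calibration on synthetic two-analyte mixture spectra.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import make_pipeline

      rng = np.random.default_rng(0)
      wl = np.linspace(220, 320, 200)                    # wavelengths, nm
      pure_mox = np.exp(-((wl - 255) / 12) ** 2)         # mock MOX band
      pure_hctz = np.exp(-((wl - 280) / 10) ** 2)        # mock HCTZ band

      # 40 synthetic mixtures spanning the abstract's quoted ranges.
      C = np.column_stack([rng.uniform(10, 60, 40), rng.uniform(2, 30, 40)])
      A = C @ np.vstack([pure_mox, pure_hctz]) / 60.0 \
          + rng.normal(0, 0.005, (40, wl.size))

      pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(A, C)
      pls = PLSRegression(n_components=2).fit(A, C)
      print("PCR R^2:", round(pcr.score(A, C), 4))
      print("PLS R^2:", round(pls.score(A, C), 4))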

  1. Citizen Science as a New Tool in Dog Cognition Research

    PubMed Central

    Stewart, Laughlin; MacLean, Evan L.; Ivy, David; Woods, Vanessa; Cohen, Eliot; Rodriguez, Kerri; McIntyre, Matthew; Mukherjee, Sayan; Call, Josep; Kaminski, Juliane; Miklósi, Ádám; Wrangham, Richard W.; Hare, Brian

    2015-01-01

    Family dogs and dog owners offer a potentially powerful way to conduct citizen science to answer questions about animal behavior that are difficult to answer with more conventional approaches. Here we evaluate the quality of the first data on dog cognition collected by citizen scientists using the Dognition.com website. We conducted analyses to understand if data generated by over 500 citizen scientists replicates internally and in comparison to previously published findings. Half of participants participated for free while the other half paid for access. The website provided each participant a temperament questionnaire and instructions on how to conduct a series of ten cognitive tests. Participation required internet access, a dog and some common household items. Participants could record their responses on any PC, tablet or smartphone from anywhere in the world and data were retained on servers. Results from citizen scientists and their dogs replicated a number of previously described phenomena from conventional lab-based research. There was little evidence that citizen scientists manipulated their results. To illustrate the potential uses of relatively large samples of citizen science data, we then used factor analysis to examine individual differences across the cognitive tasks. The data were best explained by multiple factors in support of the hypothesis that nonhumans, including dogs, can evolve multiple cognitive domains that vary independently. This analysis suggests that in the future, citizen scientists will generate useful datasets that test hypotheses and answer questions as a complement to conventional laboratory techniques used to study dog psychology. PMID:26376443

  2. Citizen Science as a New Tool in Dog Cognition Research.

    PubMed

    Stewart, Laughlin; MacLean, Evan L; Ivy, David; Woods, Vanessa; Cohen, Eliot; Rodriguez, Kerri; McIntyre, Matthew; Mukherjee, Sayan; Call, Josep; Kaminski, Juliane; Miklósi, Ádám; Wrangham, Richard W; Hare, Brian

    2015-01-01

    Family dogs and dog owners offer a potentially powerful way to conduct citizen science to answer questions about animal behavior that are difficult to answer with more conventional approaches. Here we evaluate the quality of the first data on dog cognition collected by citizen scientists using the Dognition.com website. We conducted analyses to understand if data generated by over 500 citizen scientists replicates internally and in comparison to previously published findings. Half of participants participated for free while the other half paid for access. The website provided each participant a temperament questionnaire and instructions on how to conduct a series of ten cognitive tests. Participation required internet access, a dog and some common household items. Participants could record their responses on any PC, tablet or smartphone from anywhere in the world and data were retained on servers. Results from citizen scientists and their dogs replicated a number of previously described phenomena from conventional lab-based research. There was little evidence that citizen scientists manipulated their results. To illustrate the potential uses of relatively large samples of citizen science data, we then used factor analysis to examine individual differences across the cognitive tasks. The data were best explained by multiple factors in support of the hypothesis that nonhumans, including dogs, can evolve multiple cognitive domains that vary independently. This analysis suggests that in the future, citizen scientists will generate useful datasets that test hypotheses and answer questions as a complement to conventional laboratory techniques used to study dog psychology.
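
    The individual-differences analysis described above can be approximated with an off-the-shelf factor model. The Python sketch below fits a two-factor model to simulated per-dog scores on ten tasks; the factor count, loadings, and scores are all fabricated, and only the analysis pattern follows the paper.

      # Hedged sketch of the factor-analytic step: fit a multi-factor
      # model to per-dog scores on ten cognitive tasks (simulated data).
      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(3)
      n_dogs, n_tasks = 500, 10
      latent = rng.normal(size=(n_dogs, 2))            # two latent domains
      loadings = rng.normal(size=(2, n_tasks))
      scores = latent @ loadings + rng.normal(0, 0.5, (n_dogs, n_tasks))

      fa = FactorAnalysis(n_components=2).fit(scores)
      print("estimated loadings (tasks x factors):")
      print(np.round(fa.components_.T, 2))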

  3. Emerging Imaging Tools for Use with Traumatic Brain Injury Research

    PubMed Central

    Wilde, Elisabeth A.; Tong, Karen A.; Holshouser, Barbara A.

    2012-01-01

    This article identifies emerging neuroimaging measures considered by the inter-agency Pediatric Traumatic Brain Injury (TBI) Neuroimaging Workgroup. It attempts to address some of the potential uses of more advanced forms of imaging in TBI and highlights some of the current considerations and unresolved challenges of using them. We summarize emerging elements likely to gain more widespread use in the coming years because of: 1) their utility in diagnosis, prognosis, and understanding the natural course of degeneration or recovery following TBI, and their potential for evaluating treatment strategies; 2) the ability of many centers to acquire these data with scanners and equipment readily available in existing clinical and research settings; and 3) advances in software that provide more automated, readily available, and cost-effective methods for large-scale image analysis. These include multi-slice CT, volumetric MRI analysis, susceptibility-weighted imaging (SWI), diffusion tensor imaging (DTI), magnetization transfer imaging (MTI), arterial spin tag labeling (ASL), functional MRI (fMRI), including resting-state and connectivity MRI, MR spectroscopy (MRS), and hyperpolarization scanning. We also include brief introductions to other specialized forms of advanced imaging that currently do require specialized equipment, for example, single photon emission computed tomography (SPECT), positron emission tomography (PET), electroencephalography (EEG), and magnetoencephalography (MEG)/magnetic source imaging (MSI). Finally, we identify some of the challenges that users of the emerging imaging CDEs may wish to consider, including quality control, performing multi-site and longitudinal imaging studies, and MR scanning in infants and children. PMID:21787167

  4. The ABCs of incentive-based treatment in health care: a behavior analytic framework to inform research and practice.

    PubMed

    Meredith, Steven E; Jarvis, Brantley P; Raiff, Bethany R; Rojewski, Alana M; Kurti, Allison; Cassidy, Rachel N; Erb, Philip; Sy, Jolene R; Dallery, Jesse

    2014-01-01

    Behavior plays an important role in health promotion. Exercise, smoking cessation, medication adherence, and other healthy behavior can help prevent, or even treat, some diseases. Consequently, interventions that promote healthy behavior have become increasingly common in health care settings. Many of these interventions award incentives contingent upon preventive health-related behavior. Incentive-based interventions vary considerably along several dimensions, including who is targeted in the intervention, which behavior is targeted, and what type of incentive is used. More research on the quantitative and qualitative features of many of these variables is still needed to inform treatment. However, extensive literature on basic and applied behavior analytic research is currently available to help guide the study and practice of incentive-based treatment in health care. In this integrated review, we discuss how behavior analytic research and theory can help treatment providers design and implement incentive-based interventions that promote healthy behavior. PMID:24672264

  5. The ABCs of incentive-based treatment in health care: a behavior analytic framework to inform research and practice

    PubMed Central

    Meredith, Steven E; Jarvis, Brantley P; Raiff, Bethany R; Rojewski, Alana M; Kurti, Allison; Cassidy, Rachel N; Erb, Philip; Sy, Jolene R; Dallery, Jesse

    2014-01-01

    Behavior plays an important role in health promotion. Exercise, smoking cessation, medication adherence, and other healthy behavior can help prevent, or even treat, some diseases. Consequently, interventions that promote healthy behavior have become increasingly common in health care settings. Many of these interventions award incentives contingent upon preventive health-related behavior. Incentive-based interventions vary considerably along several dimensions, including who is targeted in the intervention, which behavior is targeted, and what type of incentive is used. More research on the quantitative and qualitative features of many of these variables is still needed to inform treatment. However, extensive literature on basic and applied behavior analytic research is currently available to help guide the study and practice of incentive-based treatment in health care. In this integrated review, we discuss how behavior analytic research and theory can help treatment providers design and implement incentive-based interventions that promote healthy behavior. PMID:24672264

  6. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    ERIC Educational Resources Information Center

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  7. In situ protein secondary structure determination in ice: Raman spectroscopy-based process analytical tool for frozen storage of biopharmaceuticals.

    PubMed

    Roessl, Ulrich; Leitgeb, Stefan; Pieters, Sigrid; De Beer, Thomas; Nidetzky, Bernd

    2014-08-01

    A Raman spectroscopy-based method for in situ monitoring of secondary structural composition of proteins during frozen and thawed storage was developed. A set of reference proteins with different α-helix and β-sheet compositions was used for calibration and validation in a chemometric approach. Reference secondary structures were quantified with circular dichroism spectroscopy in the liquid state. Partial least squares regression models were established that enable estimation of secondary structure content from Raman spectra. Quantitative secondary structure determination in ice was accomplished for the first time and correlation with existing (qualitative) protein structural data from the frozen state was achieved. The method can be used in the presence of common stabilizing agents and is applicable in an industrial freezer setup. Raman spectroscopy represents a powerful, noninvasive, and flexibly applicable tool for protein stability monitoring during frozen storage.

  8. Dynamic 3D visual analytic tools: a method for maintaining situational awareness during high tempo warfare or mass casualty operations

    NASA Astrophysics Data System (ADS)

    Lizotte, Todd E.

    2010-04-01

    Maintaining Situational Awareness (SA) is crucial to the success of high-tempo operations such as war fighting and mass casualty events (bioterrorism, natural disasters). Modern computer and software applications attempt to provide command and control managers with situational awareness via the collection, integration, interrogation, and display of vast amounts of analytic data in real time from a multitude of data sources and formats [1]. At what point do the data volume and displays begin to erode the hierarchical, distributive intelligence, command, and control structure of the operation taking place? In many cases, people tasked with making decisions have insufficient experience in SA of high-tempo operations and become easily overwhelmed as vast amounts of data are displayed in real time as an operation unfolds. In these situations, where data are plentiful and the relevance of the data changes rapidly, individuals may fixate on the data sources with which they are most familiar. If they fall into this pitfall, they will exclude other data that might be just as important to the success of the operation. To counter these issues, it is important that computer and software applications provide a means of prompting users to take notice of adverse conditions or trends that are critical to the operation. This paper discusses a new method of displaying data, called a Crisis View™, that monitors critical, dynamically changing variables and allows preset thresholds to be created to prompt the user when decisions need to be made and when adverse or positive trends are detected. The new method is explained in basic terms, with examples of its attributes and how it can be implemented.
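
    As a toy illustration of the alerting idea behind such a display, the Python sketch below watches a single streaming variable against a preset threshold and flags sustained upward trends. The variable, threshold, and readings are invented; the paper's actual Crisis View implementation is not described in code form.

      # Toy threshold-and-trend monitor in the spirit of the alerting idea.
      from collections import deque

      def monitor(stream, threshold=100.0, window=5):
          """Yield alerts when a value crosses the threshold or when the
          last `window` readings trend steadily upward."""
          recent = deque(maxlen=window)
          for t, value in enumerate(stream):
              recent.append(value)
              if value > threshold:
                  yield (t, f"threshold exceeded: {value:.1f}")
              elif len(recent) == window and all(
                      a < b for a, b in zip(recent, list(recent)[1:])):
                  yield (t, "sustained upward trend")

      readings = [80, 82, 85, 88, 91, 95, 104]   # fabricated sensor values
      for t, msg in monitor(readings):
          print(t, msg)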

  9. Analytical electron microscopy and focused ion beam: complementary tool for the imaging of copper sorption onto iron oxide aggregates.

    PubMed

    Mavrocordatos, D; Steiner, M; Boller, M

    2003-04-01

    Nanometre-scale electron spectroscopic imaging has been applied to characterize the operation of a copper filtration plant in environmental science. Copper washed off from roofs and roads is considered to be a major contributor to diffuse copper pollution of urban environments. A special adsorber system has been suggested to control diffuse copper fluxes by retaining Cu with a granulated iron hydroxide. The adsorber was tested over an 18-month period on facade runoff. The concentration range of Cu in the runoff water was measured between 10 and 1000 p.p.m. and could be reduced by between 96% and 99% in the adsorption ditch. Before the analysis of the adsorber, the suspended material from the inflow was ultracentrifuged onto TEM grids and analysed by energy-filtered transmission electron microscopy (EFTEM). Copper was found either as small precipitates 5-20 nm in size or adsorbed onto organic and inorganic particles. This Cu represents approximately 30% of the total dissolved Cu measured by atomic emission spectrometry. To locate where the copper sorption takes place within the adsorber, the granulated iron oxide was analysed by analytical electron microscopy after exposure to the roof runoff water. A section of the granulated iron hydroxide was prepared by focused ion beam milling. The thickness of the lamina was reduced to 100 nm and analysed by EFTEM. The combination of these two techniques allowed us to observe the diffusion of Cu into the Fe aggregate. Elemental maps of Fe and Cu revealed that copper was not only present at the surface of the granules but was also sorbed onto the fine particles inside the adsorber.

  10. Production Workers' Literacy and Numeracy Practices: Using Cultural-Historical Activity Theory (CHAT) as an Analytical Tool

    ERIC Educational Resources Information Center

    Yasukawa, Keiko; Brown, Tony; Black, Stephen

    2013-01-01

    Public policy discourses claim that there is a "crisis" in the literacy and numeracy levels of the Australian workforce. In this paper, we propose a methodology for examining this "crisis" from a critical perspective. We draw on findings from an ongoing research project by the authors which investigates production workers'…

  11. Research on efficient and stable milling using CNC small size tool

    NASA Astrophysics Data System (ADS)

    Luo, Yongxin; Zhao, Beichen; Long, Hua; Yu, Nanlin

    2011-05-01

    To achieve efficient and stable milling with small-diameter tools on computer numerical control (CNC) machines, this paper establishes a dual-objective function based on minimum tool wear and maximum cutting efficiency, while accounting for the influence of tool diameter and overhang length on stability under the guidance of chatter stability analysis. The results show that a Pareto solution set balancing the two conflicting objectives can be obtained with a genetic algorithm; combining the Pareto set with the frequency response function (FRF) chatter stability diagram narrows the range of candidate solutions, so that milling parameters meeting the requirements of efficient, stable milling can be selected conveniently and accurately. As tool overhang length increases, system stiffness decreases, the chatter stability diagram shifts downward, and the stable region narrows.
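
    The abstract's key selection step, retaining only parameter sets that are non-dominated with respect to the two conflicting objectives, can be sketched in a few lines of Python. The candidate parameter strings and objective values below are invented placeholders, not the paper's data or objective functions.

      # Pareto-front filter over (tool wear, cutting efficiency):
      # minimise wear, maximise efficiency.
      def pareto_front(candidates):
          """candidates: list of (params, wear, efficiency) tuples."""
          front = []
          for c in candidates:
              _, w, e = c
              dominated = any(w2 <= w and e2 >= e and (w2 < w or e2 > e)
                              for _, w2, e2 in candidates)
              if not dominated:
                  front.append(c)
          return front

      cands = [("s=8000,f=0.02", 0.12, 45.0),
               ("s=9000,f=0.03", 0.15, 60.0),
               ("s=9500,f=0.04", 0.22, 58.0),   # dominated by the line above
               ("s=10000,f=0.05", 0.30, 72.0)]
      for p, w, e in pareto_front(cands):
          print(p, "wear:", w, "efficiency:", e)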

  12. Near infra red spectroscopy as a multivariate process analytical tool for predicting pharmaceutical co-crystal concentration.

    PubMed

    Wood, Clive; Alwati, Abdolati; Halsey, Sheelagh; Gough, Tim; Brown, Elaine; Kelly, Adrian; Paradkar, Anant

    2016-09-10

    The use of near infra red spectroscopy to predict the concentration of two pharmaceutical co-crystals, 1:1 ibuprofen-nicotinamide (IBU-NIC) and 1:1 carbamazepine-nicotinamide (CBZ-NIC), has been evaluated. A partial least squares (PLS) regression model was developed for both co-crystal pairs using sets of standard samples to create calibration and validation data sets with which to build and validate the models. Parameters such as the root mean square error of calibration (RMSEC), root mean square error of prediction (RMSEP), and correlation coefficient were used to assess the accuracy and linearity of the models. Accurate PLS regression models were created for both co-crystal pairs, which can be used to predict the co-crystal concentration in a powder mixture of the co-crystal and the active pharmaceutical ingredient (API). The IBU-NIC model had smaller errors than the CBZ-NIC model, possibly due to the complex CBZ-NIC spectra, which could reflect the different arrangement of hydrogen bonding associated with that co-crystal compared to the IBU-NIC co-crystal. These results suggest that NIR spectroscopy can be used as a PAT tool during a variety of pharmaceutical co-crystal manufacturing methods, and the presented data will facilitate future offline and in-line NIR studies involving pharmaceutical co-crystals. PMID:27429366

  13. Near infra red spectroscopy as a multivariate process analytical tool for predicting pharmaceutical co-crystal concentration.

    PubMed

    Wood, Clive; Alwati, Abdolati; Halsey, Sheelagh; Gough, Tim; Brown, Elaine; Kelly, Adrian; Paradkar, Anant

    2016-09-10

    The use of near infra red spectroscopy to predict the concentration of two pharmaceutical co-crystals, 1:1 ibuprofen-nicotinamide (IBU-NIC) and 1:1 carbamazepine-nicotinamide (CBZ-NIC), has been evaluated. A partial least squares (PLS) regression model was developed for both co-crystal pairs using sets of standard samples to create calibration and validation data sets with which to build and validate the models. Parameters such as the root mean square error of calibration (RMSEC), root mean square error of prediction (RMSEP), and correlation coefficient were used to assess the accuracy and linearity of the models. Accurate PLS regression models were created for both co-crystal pairs, which can be used to predict the co-crystal concentration in a powder mixture of the co-crystal and the active pharmaceutical ingredient (API). The IBU-NIC model had smaller errors than the CBZ-NIC model, possibly due to the complex CBZ-NIC spectra, which could reflect the different arrangement of hydrogen bonding associated with that co-crystal compared to the IBU-NIC co-crystal. These results suggest that NIR spectroscopy can be used as a PAT tool during a variety of pharmaceutical co-crystal manufacturing methods, and the presented data will facilitate future offline and in-line NIR studies involving pharmaceutical co-crystals.
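
    The figures of merit named above (RMSEC, RMSEP, correlation coefficient) are straightforward to compute once a PLS model is fitted. The sketch below does so in Python with scikit-learn on simulated stand-in spectra; the component count, sample sizes, and signal shape are all assumptions, not the study's settings.

      # RMSEC/RMSEP/correlation for a PLS calibration on mock spectra.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(2)
      signal = np.linspace(0.0, 1.0, 100)              # mock spectral response

      def make_set(n):
          y = rng.uniform(0.0, 1.0, n)                 # co-crystal fraction
          X = np.outer(y, signal) + rng.normal(0, 0.05, (n, signal.size))
          return X, y

      X_cal, y_cal = make_set(30)
      X_val, y_val = make_set(10)

      pls = PLSRegression(n_components=2).fit(X_cal, y_cal)
      rmsec = np.sqrt(np.mean((pls.predict(X_cal).ravel() - y_cal) ** 2))
      rmsep = np.sqrt(np.mean((pls.predict(X_val).ravel() - y_val) ** 2))
      r = np.corrcoef(pls.predict(X_val).ravel(), y_val)[0, 1]
      print(f"RMSEC={rmsec:.4f}  RMSEP={rmsep:.4f}  r={r:.4f}")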

  14. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long-standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas are discussed: 1) modeling structural and flow field nonlinearities; 2) integrated and modular approaches to nonlinear multidisciplinary analysis; 3) simulating flight dynamics of flexible vehicles; and 4) applications that support both aeronautics and space exploration.

  15. Communication research between working capacity of hard- alloy cutting tools and fractal dimension of their wear

    NASA Astrophysics Data System (ADS)

    Arefiev, K.; Nesterenko, V.; Daneykina, N.

    2016-06-01

    This paper presents the results of research into the relationship between the wear resistance of K-application-group hard-alloy cutting tools and the fractal dimension of the wear surface that forms on the flank (back) side of the cutting edge when machining materials of high adhesive activity. It is established that the wear resistance of the tested cutting tool samples increases as the fractal dimension of their wear surface increases.

  16. A Tool for Measuring NASA's Aeronautics Research Progress Toward Planned Strategic Community Outcomes

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    A tool for portfolio analysis of NASA's aeronautics research progress toward planned strategic community Outcomes is described. For efficiency and speed, the tool takes advantage of a function developed in Excel's Visual Basic for Applications. The strategic planning process for determining the community Outcomes is also briefly discussed. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples of using the tool are also presented.

  17. Interactive Data Visualization for HIV Cohorts: Leveraging Data Exchange Standards to Share and Reuse Research Tools

    PubMed Central

    Blevins, Meridith; Wehbe, Firas H.; Rebeiro, Peter F.; Caro-Vega, Yanink; McGowan, Catherine C.; Shepherd, Bryan E.

    2016-01-01

    Objective: To develop and disseminate tools for interactive visualization of HIV cohort data. Design and Methods: If a picture is worth a thousand words, then an interactive video, composed of a long string of pictures, can produce an even richer presentation of HIV population dynamics. We developed an HIV cohort data visualization tool using open-source software (the R statistical language). The tool requires that the data structure conform to the HIV Cohort Data Exchange Protocol (HICDEP), and our implementation utilized Caribbean, Central and South America network (CCASAnet) data. Results: This tool currently presents patient-level data in three classes of plots: (1) longitudinal plots showing changes in measurements viewed alongside event probability curves, allowing for simultaneous inspection of outcomes by relevant patient classes; (2) bubble plots showing changes in indicators over time, allowing for observation of group-level dynamics; and (3) heat maps of levels of indicators changing over time, allowing for observation of spatial-temporal dynamics. Examples of each class of plot are given using CCASAnet data investigating trends in CD4 count and AIDS at antiretroviral therapy (ART) initiation, CD4 trajectories after ART initiation, and mortality. Conclusions: We invite researchers interested in this data visualization effort to use these tools and to suggest new classes of data visualization. We aim to contribute additional shareable tools in the spirit of open scientific collaboration and hope that these tools further the participation in open data standards like HICDEP by the HIV research community. PMID:26963255
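
    Although the original tool is implemented in R against HICDEP-formatted data, the bubble-plot idea translates readily to other environments. The Python/matplotlib sketch below plots a hypothetical cohort-level indicator over time, with bubble area standing in for cohort size; all values are fabricated.

      # Fabricated cohort summaries rendered as a bubble plot.
      import matplotlib.pyplot as plt

      years = [2005, 2008, 2011, 2014]
      median_cd4 = [160, 195, 240, 310]      # hypothetical medians at ART start
      n_patients = [400, 900, 1600, 2100]    # bubble size ~ cohort size

      plt.scatter(years, median_cd4, s=[n / 4 for n in n_patients], alpha=0.5)
      plt.xlabel("Year of ART initiation")
      plt.ylabel("Median CD4 at initiation (cells/uL)")
      plt.title("Hypothetical cohort-level dynamics")
      plt.show()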

  18. The Effectiveness of Bereavement Interventions with Children: A Meta-Analytic Review of Controlled Outcome Research

    ERIC Educational Resources Information Center

    Currier, Joseph M.; Holland, Jason M.; Neimeyer, Robert A.

    2007-01-01

    Grief therapies with children are becoming increasingly popular in the mental health community. Nonetheless, questions persist about how well these treatments actually help with children's adjustment to the death of a loved one. This study used meta-analytic techniques to evaluate the general effectiveness of bereavement interventions with…

  19. Taxometric and Factor Analytic Models of Anxiety Sensitivity: Integrating Approaches to Latent Structural Research

    ERIC Educational Resources Information Center

    Bernstein, Amit; Zvolensky, Michael J.; Norton, Peter J.; Schmidt, Norman B.; Taylor, Steven; Forsyth, John P.; Lewis, Sarah F.; Feldner, Matthew T.; Leen-Feldner, Ellen W.; Stewart, Sherry H.; Cox, Brian

    2007-01-01

    This study represents an effort to better understand the latent structure of anxiety sensitivity (AS), as indexed by the 16-item Anxiety Sensitivity Index (ASI; S. Reiss, R. A. Peterson, M. Gursky, & R. J. McNally, 1986), by using taxometric and factor-analytic approaches in an integrative manner. Taxometric analyses indicated that AS has a…

  20. Smooth Pursuit in Schizophrenia: A Meta-Analytic Review of Research since 1993

    ERIC Educational Resources Information Center

    O'Driscoll, Gillian A.; Callahan, Brandy L.

    2008-01-01

    Abnormal smooth pursuit eye-tracking is one of the most replicated deficits in the psychophysiological literature in schizophrenia [Levy, D. L., Holzman, P. S., Matthysse, S., & Mendell, N. R. (1993). "Eye tracking dysfunction and schizophrenia: A critical perspective." "Schizophrenia Bulletin, 19", 461-505]. We used meta-analytic procedures to…

  1. Incorporating Students' Self-Designed, Research-Based Analytical Chemistry Projects into the Instrumentation Curriculum

    ERIC Educational Resources Information Center

    Gao, Ruomei

    2015-01-01

    In a typical chemistry instrumentation laboratory, students learn analytical techniques through a well-developed procedure. Such an approach, however, does not engage students in a creative endeavor. To foster the intrinsic motivation of students' desire to learn, improve their confidence in self-directed learning activities and enhance their…

  2. ANALYTIC ELEMENT GROUND WATER MODELING AS A RESEARCH PROGRAM (1980-2006)

    EPA Science Inventory

    Scientists and engineers who use the analytic element method (AEM) for solving problems of regional ground water flow may be considered a community, and this community can be studied from the perspective of history and philosophy of science. Applying the methods of the Hungarian...

  3. About Skinner and Time: Behavior-Analytic Contributions to Research on Animal Timing

    ERIC Educational Resources Information Center

    Lejeune, Helga; Richelle, Marc; Wearden, J. H.

    2006-01-01

    The article discusses two important influences of B. F. Skinner, and later workers in the behavior-analytic tradition, on the study of animal timing. The first influence is methodological, and is traced from the invention of schedules imposing temporal constraints or periodicities on animals in "The Behavior of Organisms," through the rate…

  4. ENVIRONMENTAL RESEARCH BRIEF : ANALYTIC ELEMENT MODELING OF GROUND-WATER FLOW AND HIGH PERFORMANCE COMPUTING

    EPA Science Inventory

    Several advances in the analytic element method have been made to enhance its performance and facilitate three-dimensional ground-water flow modeling in a regional aquifer setting. First, a new public domain modular code (ModAEM) has been developed for modeling ground-water flow ...

  5. Contribution of analytical microscopies to human neurodegenerative diseases research (PSP and AD).

    PubMed

    Quintana, Carmen

    2007-09-01

    Using analytical microscopies, we have observed an increase in Fe(2+)-induced oxidative stress inside pathological ferritin (Ft). This finding, together with the presence of Ft in myelinated axons associated with oligodendrocyte processes and myelin sheath fraying, suggests that a dysfunction of ferritin (a ferritinopathy) may be the non-specific, aging-dependent pathogenic event responsible for neurodegenerative disease.

  6. Challenges for Visual Analytics

    SciTech Connect

    Thomas, James J.; Kielman, Joseph

    2009-09-23

    Visual analytics has seen unprecedented growth in its first five years of mainstream existence. Great progress has been made in a short time, yet great challenges must be met in the next decade to provide new technologies that will be widely accepted by societies throughout the world. This paper sets the stage for some of those challenges in an effort to provide the stimulus for the research, both basic and applied, to address and exceed the envisioned potential for visual analytics technologies. We start with a brief summary of the initial challenges, followed by a discussion of the initial driving domains and applications, as well as additional applications and domains that have been a part of recent rapid expansion of visual analytics usage. We look at the common characteristics of several tools illustrating emerging visual analytics technologies, and conclude with the top ten challenges for the field of study. We encourage feedback and collaborative participation by members of the research community, the wide array of user communities, and private industry.

  7. Basics, common errors and essentials of statistical tools and techniques in anesthesiology research

    PubMed Central

    Bajwa, Sukhminder Jit Singh

    2015-01-01

    The statistical portion is a vital component of any research study. Research methodology and the application of statistical tools and techniques have evolved over the years and have significantly aided research activities throughout the globe. Results and inferences cannot be reported accurately without proper validation using appropriate statistical tools and tests. Evidence-based anesthesia research and practice must incorporate statistical tools in the methodology from the planning stage of the study itself. Although the medical fraternity is well acquainted with the significance of statistics in research, the majority of researchers lack in-depth knowledge of the various statistical concepts and principles. The clinical impact and consequences can be serious, as incorrect analyses, conclusions, and false results may construct an artificial platform on which future research activities are replicated. The present tutorial is an attempt to make anesthesiologists aware of the various aspects of statistical methods used in evidence-based research and to highlight the areas where statistical errors are most commonly committed, so that better statistical practices may be adopted. PMID:26702217
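
    As a small, hedged illustration of the kind of practice the tutorial advocates, the sketch below (invented data, SciPy) checks a distributional assumption before choosing between a parametric and a non-parametric two-group test:

    ```python
    import numpy as np
    from scipy import stats

    # Simulated outcome data for two groups (illustrative values only).
    rng = np.random.default_rng(42)
    group_a = rng.normal(50.0, 10.0, size=30)
    group_b = rng.normal(55.0, 10.0, size=30)

    # Check normality first; skipping this step is one of the common errors
    # the tutorial above warns against.
    normal = (stats.shapiro(group_a).pvalue > 0.05
              and stats.shapiro(group_b).pvalue > 0.05)
    if normal:
        stat, p = stats.ttest_ind(group_a, group_b)      # two-sample t-test
    else:
        stat, p = stats.mannwhitneyu(group_a, group_b)   # non-parametric fallback
    print(f"statistic = {stat:.3f}, p = {p:.4f}")
    ```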

  8. Scientific Mobility and International Research Networks: Trends and Policy Tools for Promoting Research Excellence and Capacity Building

    ERIC Educational Resources Information Center

    Jacob, Merle; Meek, V. Lynn

    2013-01-01

    One of the ways in which globalization is manifesting itself in higher education and research is through the increasing importance and emphasis on scientific mobility. This article seeks to provide an overview and analysis of current trends and policy tools for promoting mobility. The article argues that the mobility of scientific labour is an…

  9. [Analysis of researchers' implication in a research-intervention in the Stork Network: a tool for institutional analysis].

    PubMed

    Fortuna, Cinira Magali; Mesquita, Luana Pinho de; Matumoto, Silvia; Monceau, Gilles

    2016-01-01

    This qualitative study is based on institutional analysis as the methodological and theoretical reference, with the objective of analyzing researchers' implication during a research-intervention and the interferences caused by this analysis. The study involved researchers from courses in medicine, nursing, and dentistry at two universities and workers from a Regional Health Department following up on the implementation of the Stork Network in São Paulo State, Brazil. The researchers worked together in the intervention and in analysis workshops, supported by an external institutional analyst. Two institutions stood out in the analysis: research, established mainly with characteristics of neutrality, and management, with Taylorist characteristics. Differences between researchers and difficulties in identifying actions proper to network management and research were some of the interferences identified. The study concludes that implication analysis is a powerful tool for such studies. PMID:27653198

  10. Validation of the 4D NCAT simulation tools for use in high-resolution x-ray CT research

    NASA Astrophysics Data System (ADS)

    Segars, W. P.; Mahesh, Mahadevappa; Beck, T.; Frey, E. C.; Tsui, B. M. W.

    2005-04-01

    We validate the computer-based simulation tools developed in our laboratory for use in high-resolution CT research. The 4D NURBS-based cardiac-torso (NCAT) phantom was developed to provide a realistic and flexible model of the human anatomy and physiology. Unlike current phantoms in CT, the 4D NCAT has the advantage, due to its design, that its organ shapes can be changed to realistically model anatomical variations and patient motion. To efficiently simulate high-resolution CT images, we developed a unique analytic projection algorithm (including scatter and quantum noise) to accurately calculate projections directly from the surface definition of the phantom given parameters defining the CT scanner and geometry. The projection data are reconstructed into CT images using algorithms developed in our laboratory. The 4D NCAT phantom contains a level of detail that is close to impossible to produce in a physical test object. We, therefore, validate our CT simulation tools and methods through a series of direct comparisons with data obtained experimentally using existing, simple physical phantoms at different doses and using different x-ray energy spectra. In each case, the first-order simulations were found to produce comparable results (<12%). We reason that since the simulations produced equivalent results using simple test objects, they should be able to do the same in more anatomically realistic conditions. We conclude that, with the ability to provide realistic simulated CT image data close to that from actual patients, the simulation tools developed in this work will have applications in a broad range of CT imaging research.
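
    To make "analytic projection" concrete: for a uniform object, the noiseless line integral along a ray reduces to the attenuation coefficient times the ray's chord length through the object. The sketch below does this for a sphere; it is a simplified stand-in for the NCAT approach, which evaluates projections from NURBS surface definitions rather than simple solids.

    ```python
    import numpy as np

    def sphere_projection(ray_origin, ray_dir, center, radius, mu):
        """Integral of attenuation mu along a ray through a uniform sphere."""
        d = np.asarray(ray_dir, dtype=float)
        d /= np.linalg.norm(d)
        oc = np.asarray(center, dtype=float) - np.asarray(ray_origin, dtype=float)
        t = np.dot(oc, d)                 # parameter of closest approach
        dist2 = np.dot(oc, oc) - t * t    # squared ray-to-center distance
        if dist2 >= radius**2:
            return 0.0                    # ray misses the sphere
        chord = 2.0 * np.sqrt(radius**2 - dist2)
        return mu * chord

    # One parallel-beam detector row (all numbers are illustrative).
    row = [sphere_projection((x, -100.0, 0.0), (0.0, 1.0, 0.0),
                             (0.0, 0.0, 0.0), 10.0, 0.02)
           for x in np.linspace(-15.0, 15.0, 7)]
    print(np.round(row, 4))
    ```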

  11. Operations Research.

    ERIC Educational Resources Information Center

    O'Neill, Edward T.

    1984-01-01

    Describes operations research as an important management tool that can aid library managers in effectively using available resources and as a set of analytical tools that can enable researchers to better understand library and information services. Early history, definition, models, applications to libraries, and impact are noted. Twenty-five…

  12. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    PubMed

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage(©) guides users through the collection of their family health history by relative, generates a pedigree, completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine. PMID:20424421

  13. Dietary MicroRNA Database (DMD): An Archive Database and Analytic Tool for Food-Borne microRNAs.

    PubMed

    Chiang, Kevin; Shu, Jiang; Zempleni, Janos; Cui, Juan

    2015-01-01

    With the advent of high-throughput technology, a huge amount of microRNA information has been added to the growing body of knowledge on non-coding RNAs. Here we present the Dietary MicroRNA Database (DMD), the first repository for archiving and analyzing the published and novel microRNAs discovered in dietary resources. Currently there are fifteen types of dietary species, such as apple, grape, cow milk, and cow fat, included in the database, originating from 9 plant and 5 animal species. Annotation for each entry, a mature microRNA indexed as DM0000*, covers the mature sequence, genome location, hairpin structure of the parental pre-microRNA, cross-species sequence comparison, disease relevance, and experimentally validated gene targets. Furthermore, several functional analyses, including target prediction, pathway enrichment, and gene network construction, have been integrated into the system, enabling users to generate functional insights by viewing the functional pathways and building protein-protein interaction networks associated with each microRNA. Another unique feature of DMD is its feature generator, which can calculate a total of 411 descriptive attributes for any given microRNA based on its sequence and structure. DMD should be particularly useful for research groups studying microRNA regulation from a nutrition point of view. The database can be accessed at http://sbbi.unl.edu/dmd/.
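
    To illustrate the feature-generator concept, the sketch below computes a handful of simple sequence-derived attributes for a microRNA; the chosen features and names are my own, not DMD's actual 411-attribute set.

    ```python
    from collections import Counter

    def mirna_features(seq: str) -> dict:
        """A few toy sequence descriptors (not DMD's real attributes)."""
        seq = seq.upper().replace("T", "U")   # normalize DNA-style input to RNA
        counts = Counter(seq)
        n = len(seq)
        return {
            "length": n,
            "gc_content": (counts["G"] + counts["C"]) / n,
            **{f"freq_{b}": counts[b] / n for b in "ACGU"},
            "starts_with_U": seq.startswith("U"),   # common in mature miRNAs
        }

    # Example call on a let-7-like mature sequence.
    print(mirna_features("UGAGGUAGUAGGUUGUAUAGUU"))
    ```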

  14. Research-tool patents: issues for health in the developing world.

    PubMed Central

    Barton, John H.

    2002-01-01

    The patent system is now reaching into the tools of medical research, including gene sequences themselves. Many of the new patents can potentially preempt large areas of medical research and lay down legal barriers to the development of a broad category of products. Researchers must therefore consider redesigning their research to avoid use of patented techniques, or expending the effort to obtain licences from those who hold the patents. Even if total licence fees can be kept low, there are enormous negotiation costs, and one "hold-out" may be enough to lead to project cancellation. This is making it more difficult to conduct research within the developed world, and poses important questions for the future of medical research for the benefit of the developing world. Probably the most important implication for health in the developing world is the possible general slowing down and complication of medical research. To the extent that these patents do slow down research, they weaken the contribution of the global research community to the creation and application of medical technology for the benefit of developing nations. The patents may also complicate the granting of concessional prices to developing nations, since pharmaceutical firms that seek to offer a concessional price may have to negotiate arrangements with research-tool firms, which may lose royalties as a result. Three kinds of response are plausible. One is to develop a broad or global licence to permit the patented technologies to be used for important applications in the developing world. The second is to change technical patent law doctrines. Such changes could be implemented in developed and developing nations and could be quite helpful while remaining consistent with TRIPS. The third is to negotiate specific licence arrangements, under which specific research tools are used on an agreed basis for specific applications. These negotiations are difficult and expensive, requiring both scientific and

  15. Quantitative Risk Reduction Estimation Tool for Control Systems, Suggested Approach and Research Needs

    SciTech Connect

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for the cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models, we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g., attack trees, attack graphs); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
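
    The arithmetic behind that definition is simple; here is a minimal sketch (invented numbers, not the prototype's attack-tree or attack-graph engines) of risk and of the risk reduction used in a cost-benefit comparison:

    ```python
    # risk = P(successful attack) x value of the resulting loss
    def risk(p_attack: float, loss: float) -> float:
        return p_attack * loss

    baseline  = risk(p_attack=0.10, loss=5_000_000)  # expected loss, dollars
    mitigated = risk(p_attack=0.02, loss=5_000_000)  # after a hypothetical mitigation
    reduction = baseline - mitigated

    mitigation_cost = 150_000
    print(f"risk reduction: ${reduction:,.0f}")
    print(f"benefit/cost ratio: {reduction / mitigation_cost:.1f}")
    ```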

  16. Knowledge Translation Tools are Emerging to Move Neck Pain Research into Practice

    PubMed Central

    MacDermid, Joy C.; Miller, Jordan; Gross, Anita R.

    2013-01-01

    Development or synthesis of the best clinical research is in itself insufficient to change practice. Knowledge translation (KT) is an emerging field focused on moving knowledge into practice, which is a non-linear, dynamic process that involves knowledge synthesis, transfer, adoption, implementation, and sustained use. Successful implementation requires using KT strategies based on theory, evidence, and best practice, including tools and processes that engage knowledge developers and knowledge users. Tools can provide instrumental help in implementing evidence. A variety of theoretical frameworks underlie KT and provide guidance on how tools should be developed or implemented. A taxonomy that outlines different purposes for engaging in KT and target audiences can also be useful in developing or implementing tools. Theoretical frameworks that underlie KT typically take different perspectives on KT with differential focus on the characteristics of the knowledge, knowledge users, context/environment, or the cognitive and social processes that are involved in change. Knowledge users include consumers, clinicians, and policymakers. A variety of KT tools have supporting evidence, including: clinical practice guidelines, patient decision aids, and evidence summaries or toolkits. Exemplars are provided of two KT tools to implement best practice in management of neck pain—a clinician implementation guide (toolkit) and a patient decision aid. KT frameworks, taxonomies, clinical expertise, and evidence must be integrated to develop clinical tools that implement best evidence in the management of neck pain. PMID:24155807

  17. Medical informatics: an essential tool for health sciences research in acute care.

    PubMed

    Li, Man; Pickering, Brian W; Smith, Vernon D; Hadzikadic, Mirsad; Gajic, Ognjen; Herasevich, Vitaly

    2009-10-01

    Medical informatics has become an important tool in modern health care practice and research. In the present article we outline the challenges and opportunities associated with the implementation of electronic medical records (EMR) in complex environments such as intensive care units (ICU). We share our initial experience in the design, maintenance, and application of a customized, Microsoft SQL-based critical care research warehouse, ICU DataMart. ICU DataMart integrates clinical and administrative data from heterogeneous sources within the EMR to support research and practice improvement in the ICUs. Examples of intelligent alarms ("sniffers"), administrative reports, decision support, and clinical research applications are presented.
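
    A minimal sketch of the "sniffer" idea follows, assuming a hypothetical lab-results table rather than ICU DataMart's actual schema: a periodic query flags patients whose values meet an alarm condition.

    ```python
    import sqlite3

    # Toy EMR-derived warehouse; schema and thresholds are invented.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE labs (patient_id, test, value, drawn_at)")
    con.executemany("INSERT INTO labs VALUES (?, ?, ?, ?)", [
        (1, "lactate", 4.2, "2009-10-01 08:00"),
        (2, "lactate", 1.1, "2009-10-01 08:05"),
    ])

    # Example alarm: serum lactate above 4 mmol/L.
    alerts = con.execute(
        "SELECT patient_id, value, drawn_at FROM labs "
        "WHERE test = 'lactate' AND value > 4.0"
    ).fetchall()
    for patient_id, value, drawn_at in alerts:
        print(f"ALERT patient {patient_id}: lactate {value} at {drawn_at}")
    ```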

  18. The Research-Teaching Nexus: Using a Construction Teaching Event as a Research Tool

    ERIC Educational Resources Information Center

    Casanovas-Rubio, Maria del Mar; Ahearn, Alison; Ramos, Gonzalo; Popo-Ola, Sunday

    2016-01-01

    In principle, the research-teaching nexus should be seen as a two-way link, showing not only ways in which research supports teaching but also ways in which teaching supports research. In reality, the discussion has been limited almost entirely to the first of these practices. This paper presents a case study in which some student field-trip…

  19. Research-IQ: Development and Evaluation of an Ontology-anchored Integrative Query Tool

    PubMed Central

    Borlawsky, Tara B.; Lele, Omkar; Payne, Philip R. O.

    2011-01-01

    Investigators in the translational research and systems medicine domains require highly usable, efficient and integrative tools and methods that allow for the navigation of and reasoning over emerging large-scale data sets. Such resources must cover a spectrum of granularity from bio-molecules to population phenotypes. Given such information needs, we report upon the initial design and evaluation of an ontology-anchored integrative query tool, Research-IQ, which employs a combination of conceptual knowledge engineering and information retrieval techniques to enable the intuitive and rapid construction of queries, in terms of semi-structured textual propositions, that can subsequently be applied to integrative data sets. Our initial results, based upon both quantitative and qualitative evaluations of the efficacy and usability of Research-IQ, demonstrate its potential to increase clinical and translational research throughput. PMID:21821150

  1. The Solid Earth Research and Teaching Environment, a new software framework to share research tools in the classroom and across disciplines

    NASA Astrophysics Data System (ADS)

    Milner, K.; Becker, T. W.; Boschi, L.; Sain, J.; Schorlemmer, D.; Waterhouse, H.

    2009-12-01

    The Solid Earth Teaching and Research Environment (SEATREE) is a modular and user-friendly software framework to facilitate the use of solid Earth research tools in the classroom and for interdisciplinary research collaboration. SEATREE is open source and community developed, distributed freely under the GNU General Public License. It is a fully contained package that lets users operate in a graphical mode, while giving more advanced users the opportunity to view and modify the source code. Top-level graphical user interfaces, which initiate the calculations and visualize results, are written in the Python programming language using a modern, object-oriented design. Results are plotted with either Matlab-like Python libraries or SEATREE's own Generic Mapping Tools wrapper. The underlying computational codes used to produce the results can be written in any programming language and are accessed through Python wrappers. There are currently four fully developed science modules in SEATREE: (1) HC is a global geodynamics tool, a semi-analytical mantle-circulation program based on work by B. Steinberger, Becker, and C. O'Neill. HC can compute velocities and tractions for global, spherical Stokes flow with radial viscosity variations. HC is fast enough to be used for classroom instruction, for example to let students interactively explore the role of radial viscosity variations in global geopotential (geoid) anomalies. (2) ConMan wraps Scott King's 2D finite element mantle convection code, allowing users to quickly observe how modifications to input parameters affect heat flow over time. As seismology modules, SEATREE includes (3) Larry, a global surface-wave phase-velocity inversion tool, and (4) Syn2D, a Cartesian tomography teaching tool for ray-theory wave propagation in synthetic, arbitrary velocity structures in the presence of noise. Both underlying programs were contributed by Boschi. Using Syn2D, students can explore, for example, how well a given
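
    The wrapper pattern described above can be sketched as a thin Python class that drives an external solver through files and a subprocess call, so the GUI never needs to know the underlying language. The binary name, flags, and file formats below are placeholders, not the actual HC or ConMan interfaces.

    ```python
    import subprocess

    class SolverWrapper:
        """Drive an external computational code from Python (sketch only)."""

        def __init__(self, binary: str):
            self.binary = binary

        def run(self, input_file: str, output_file: str) -> str:
            # Invoke the solver; check=True raises if it exits with an error.
            subprocess.run([self.binary, "-i", input_file, "-o", output_file],
                           check=True)
            with open(output_file) as f:
                return f.read()   # hand results back for plotting in the GUI

    # Hypothetical usage: SolverWrapper("hc_flow").run("visc.dat", "geoid.out")
    ```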

  2. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  3. Handbook of Research on Technology Tools for Real-World Skill Development (2 Volumes)

    ERIC Educational Resources Information Center

    Rosen, Yigel, Ed.; Ferrara, Steve, Ed.; Mosharraf, Maryam, Ed.

    2016-01-01

    Education is expanding to include a stronger focus on the practical application of classroom lessons in an effort to prepare the next generation of scholars for a changing world economy centered on collaborative and problem-solving skills for the digital age. "The Handbook of Research on Technology Tools for Real-World Skill Development"…

  4. Family Myths, Beliefs, and Customs as a Research/Educational Tool to Explore Identity Formation

    ERIC Educational Resources Information Center

    Herman, William E.

    2008-01-01

    This paper outlines a qualitative research tool designed to explore personal identity formation as described by Erik Erikson and offers self-reflective and anonymous evaluative comments made by college students after completing this task. Subjects compiled a list of 200 myths, customs, fables, rituals, and beliefs from their family of origin and…

  5. Basic Reference Tools for Nursing Research. A Workbook with Explanations and Examples.

    ERIC Educational Resources Information Center

    Smalley, Topsy N.

    This workbook is designed to introduce nursing students to basic concepts and skills needed for searching the literatures of medicine, nursing, and allied health areas for materials relevant to specific information needs. The workbook introduces the following research tools: (1) the National Library of Medicine's MEDLINE searches, including a…

  6. A Portfolio Analysis Tool for Measuring NASA's Aeronautics Research Progress toward Planned Strategic Outcomes

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.

  7. Qualitative and Quantitative Management Tools Used by Financial Officers in Public Research Universities

    ERIC Educational Resources Information Center

    Trexler, Grant Lewis

    2012-01-01

    This dissertation set out to identify effective qualitative and quantitative management tools used by financial officers (CFOs) in carrying out their management functions of planning, decision making, organizing, staffing, communicating, motivating, leading and controlling at a public research university. In addition, impediments to the use of…

  8. Creating the Tools for Multilingualism: A School-Based Action Research Project

    ERIC Educational Resources Information Center

    Martin, Cynthia

    2010-01-01

    This article reports on the small-scale evaluation of a school-based action research project, focusing on the creation of teaching materials aimed at developing tools for multilingualism for pupils aged 7-14. This three-year project was launched in September 2004 in 12 state primary schools and two secondary schools in two local authorities in the…

  9. Improving the Usefulness of Concept Maps as a Research Tool for Science Education

    ERIC Educational Resources Information Center

    Van Zele, Els; Lenaerts, Josephina; Wieme, Willem

    2004-01-01

    The search for authentic science research tools to evaluate student understanding in a hybrid learning environment with a large multimedia component has resulted in the use of concept maps as a representation of students' knowledge organization. One hundred and seventy third-semester introductory university-level engineering students represented…

  10. An experimental and analytical method for approximate determination of the tilt rotor research aircraft rotor/wing download

    NASA Technical Reports Server (NTRS)

    Jordon, D. E.; Patterson, W.; Sandlin, D. R.

    1985-01-01

    The XV-15 Tilt Rotor Research Aircraft download phenomenon was analyzed. This phenomenon is a direct result of the two rotor wakes impinging on the wing upper surface when the aircraft is in the hover configuration. For this study the analysis proceeded along two lines. The first was a method whereby results from actual hover tests of the XV-15 aircraft were combined with drag coefficient results from wind tunnel tests of a wing that was representative of the aircraft wing. The second was an analytical method that modeled the airflow caused by the two rotors. Formulas were developed so that a computer program could be used to calculate the axial velocities; these velocities were then used in conjunction with the aforementioned wind tunnel drag coefficient results to produce download values. An attempt was made to validate the analytical results by modeling a model rotor system for which direct download values were determined.
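
    A hedged sketch of the second, analytical line of attack: classical actuator-disk momentum theory gives the rotor-induced axial velocity, and a wind-tunnel drag coefficient then converts the resulting dynamic pressure into a download estimate. All values below are invented; the study's actual formulas and coefficients are not reproduced here.

    ```python
    import numpy as np

    rho = 1.225        # air density, kg/m^3
    T = 60_000.0       # hover thrust per rotor, N (illustrative)
    R = 3.8            # rotor radius, m
    A = np.pi * R**2   # actuator-disk area

    v_induced = np.sqrt(T / (2.0 * rho * A))  # momentum-theory induced velocity
    v_wake = 2.0 * v_induced                  # fully developed wake velocity
    q = 0.5 * rho * v_wake**2                 # dynamic pressure at the wing

    Cd = 1.2           # wing drag coefficient from wind tunnel tests (assumed)
    S_wetted = 12.0    # wing area immersed in the rotor wakes, m^2

    download = Cd * q * S_wetted
    print(f"download ~ {download / 1000:.1f} kN "
          f"({100.0 * download / (2.0 * T):.1f}% of total thrust)")
    ```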

  11. Policy-Making Theory as an Analytical Framework in Policy Analysis: Implications for Research Design and Professional Advocacy.

    PubMed

    Sheldon, Michael R

    2016-01-01

    Policy studies are a recent addition to the American Physical Therapy Association's Research Agenda and are critical to our understanding of various federal, state, local, and organizational policies on the provision of physical therapist services across the continuum of care. Policy analyses that help to advance the profession's various policy agendas will require relevant theoretical frameworks to be credible. The purpose of this perspective article is to: (1) demonstrate the use of a policy-making theory as an analytical framework in a policy analysis and (2) discuss how sound policy analysis can assist physical therapists in becoming more effective change agents, policy advocates, and partners with other relevant stakeholder groups. An exploratory study of state agency policy responses to address work-related musculoskeletal disorders is provided as a contemporary example to illustrate key points and to demonstrate the importance of selecting a relevant analytical framework based on the context of the policy issue under investigation. PMID:26450973

  12. DMPwerkzeug - A tool to support the planning, implementation, and organization of research data management.

    NASA Astrophysics Data System (ADS)

    Klar, Jochen; Engelhardt, Claudia; Neuroth, Heike; Enke, Harry

    2016-04-01

    Following the call to make the results of publicly funded research openly accessible, more and more funding agencies demand the submission of a data management plan (DMP) as part of the application process. These documents specify how the data management of the project is organized and which datasets will be published when. Of particular importance for European researchers is the Open Research Data Pilot of Horizon 2020, which requires data management plans for a set of 9 selected research fields ranging from the social sciences to nanotechnology. In order to assist researchers in creating these documents, several institutions have developed dedicated software tools. The best known are DMPonline by the Digital Curation Centre (DCC) and DMPtool by the California Digital Library (CDL), both extensive and well-received web applications. The core functionality of these tools is the assisted editing of the DMP templates provided by the particular funding agency. While this is certainly helpful, especially in an environment with a plethora of different funding agencies like the UK or the USA, these tools are somewhat limited to this particular task and do not utilise the full potential of DMPs. Beyond fulfilling funder requirements, DMPs can be useful for a number of additional tasks. In the initial conception phase of a project, they can be used as a planning tool to determine which data management activities and measures are necessary throughout the research process, to assess which resources are needed, and to decide which institutions (computing centers, libraries, data centers) should be involved. During the project, they can act as a constant reference or guideline for the handling of research data. They also determine where the data will be stored after the project has ended and whether it can be accessed by the public, helping to take into account the resulting requirements of the data center and the actions necessary to ensure re-usability by others from early on. Ideally, a DMP

  13. The Second Life Researcher Toolkit - An Exploration of Inworld Tools, Methods and Approaches for Researching Educational Projects in Second Life

    NASA Astrophysics Data System (ADS)

    Moschini, Elena

    Academics are beginning to explore the educational potential of Second Life™ (SL) by setting up inworld educational activities and projects. Given the relative novelty of the use of virtual world environments in higher education, many such projects are still at the pilot stage. However, the initial pilot and experimentation stage will have to be followed by a rigorous evaluation process, as for more traditional teaching projects. The chapter addresses issues concerning SL research tools and research methods. It introduces a "researcher toolkit" that includes: the various stages in the evaluation of SL educational projects and the theoretical framework that can inform such projects; an outline of the inworld tools that can be utilised or customised for academic research purposes; a review of methods for collecting feedback from participants and of the main ethical issues involved in researching virtual world environments; and a discussion of the technical skills required to operate a research project in SL. The chapter also offers an indication of the inworld opportunities for the dissemination of SL research findings.

  14. Stone tool analysis and human origins research: some advice from Uncle Screwtape.

    PubMed

    Shea, John J

    2011-01-01

    The production of purposefully fractured stone tools with functional, sharp cutting edges is a uniquely derived hominin adaptation. In the long history of life on earth, only hominins have adopted this remarkably expedient and broadly effective technological strategy. In the paleontological record, flaked stone tools are irrefutable proof that hominins were present at a particular place and time. Flaked stone tools are found in contexts ranging from the Arctic to equatorial rainforests and on every continent except Antarctica. Paleolithic stone tools show complex patterns of variability, suggesting that they have been subject to the variable selective pressures that have shaped so many other aspects of hominin behavior and morphology. There is every reason to expect that insights gained from studying stone tools should provide vital and important information about the course of human evolution. And yet, one senses that archeological analyses of Paleolithic stone tools are not making as much of a contribution as they could to the major issues in human origins research.

  15. The Biobanking Analysis Resource Catalogue (BARCdb): a new research tool for the analysis of biobank samples.

    PubMed

    Galli, Joakim; Oelrich, Johan; Taussig, Michael J; Andreasson, Ulrika; Ortega-Paino, Eva; Landegren, Ulf

    2015-01-01

    We report the development of a new database of technology services and products for analysis of biobank samples in biomedical research. BARCdb, the Biobanking Analysis Resource Catalogue (http://www.barcdb.org), is a freely available web resource, listing expertise and molecular resource capabilities of research centres and biotechnology companies. The database is designed for researchers who require information on how to make best use of valuable biospecimens from biobanks and other sample collections, focusing on the choice of analytical techniques and the demands they make on the type of samples, pre-analytical sample preparation and amounts needed. BARCdb has been developed as part of the Swedish biobanking infrastructure (BBMRI.se), but now welcomes submissions from service providers throughout Europe. BARCdb can help match resource providers with potential users, stimulating transnational collaborations and ensuring compatibility of results from different labs. It can promote a more optimal use of European resources in general, both with respect to standard and more experimental technologies, as well as for valuable biobank samples. This article describes how information on service and reagent providers of relevant technologies is made available on BARCdb, and how this resource may contribute to strengthening biomedical research in academia and in the biotechnology and pharmaceutical industries. PMID:25336620

  16. The Biobanking Analysis Resource Catalogue (BARCdb): a new research tool for the analysis of biobank samples

    PubMed Central

    Galli, Joakim; Oelrich, Johan; Taussig, Michael J.; Andreasson, Ulrika; Ortega-Paino, Eva; Landegren, Ulf

    2015-01-01

    We report the development of a new database of technology services and products for analysis of biobank samples in biomedical research. BARCdb, the Biobanking Analysis Resource Catalogue (http://www.barcdb.org), is a freely available web resource, listing expertise and molecular resource capabilities of research centres and biotechnology companies. The database is designed for researchers who require information on how to make best use of valuable biospecimens from biobanks and other sample collections, focusing on the choice of analytical techniques and the demands they make on the type of samples, pre-analytical sample preparation and amounts needed. BARCdb has been developed as part of the Swedish biobanking infrastructure (BBMRI.se), but now welcomes submissions from service providers throughout Europe. BARCdb can help match resource providers with potential users, stimulating transnational collaborations and ensuring compatibility of results from different labs. It can promote a more optimal use of European resources in general, both with respect to standard and more experimental technologies, as well as for valuable biobank samples. This article describes how information on service and reagent providers of relevant technologies is made available on BARCdb, and how this resource may contribute to strengthening biomedical research in academia and in the biotechnology and pharmaceutical industries. PMID:25336620

  17. Building genetic tools in Drosophila research: an interview with Gerald Rubin

    PubMed Central

    2016-01-01

    Gerald (Gerry) Rubin, pioneer in Drosophila genetics, is Founding Director of the HHMI-funded Janelia Research Campus. In this interview, Gerry recounts key events and collaborations that have shaped his unique approach to scientific exploration, decision-making, management and mentorship – an approach that forms the cornerstone of the model adopted at Janelia to tackle problems in interdisciplinary biomedical research. Gerry describes his remarkable journey from newcomer to internationally renowned leader in the fly field, highlighting his contributions to the tools and resources that have helped establish Drosophila as an important model in translational research. Describing himself as a ‘tool builder’, his current focus is on developing approaches for in-depth study of the fly nervous system, in order to understand key principles in neurobiology. Gerry was interviewed by Ross Cagan, Senior Editor of Disease Models & Mechanisms. PMID:27053132

  18. The Association of Religion Data Archives (ARDA): Online Research Data, Tools, and References

    PubMed Central

    Finke, Roger; Adamczyk, Amy

    2014-01-01

    The Association of Religion Data Archives (ARDA) currently archives over 400 local, national, and international data files, and offers a wide range of research tools to build surveys, preview data on-line, develop customized maps and reports of U.S. church membership, and examine religion differences across nations and regions of the world. The ARDA also supports reference and teaching tools that draw on the rich data archive. This research note offers a brief introduction to the quantitative data available for exploration or download, and a few of the website features most useful for research and teaching. Supported by the Lilly Endowment, the John Templeton Foundation, the Pennsylvania State University, and the Baylor Institute for Studies of Religion, all data downloads and online services are free of charge. PMID:25484914

  1. Information Technology Research Services: Powerful Tools to Keep Up with a Rapidly Moving Field

    NASA Technical Reports Server (NTRS)

    Hunter, Paul

    2010-01-01

    Many firms offer information technology research reports, analyst calls, conferences, seminars, tools, leadership development, etc. These entities include Gartner, Forrester Research, IDC, The Burton Group, the Society for Information Management, Info-Tech Research, The Corporate Executive Board, and so on. This talk will cover how a number of such services are being used at the Goddard Space Flight Center to improve our IT management practices, workforce skills, approach to innovation, and service delivery. These tools and services are used across the workforce, from the executive leadership to the IT worker. The presentation will cover the types of services each vendor provides and their primary engagement model. The use of these services at other NASA Centers and Headquarters will be included. In addition, I will explain how two of these services are available now to the entire NASA IT workforce through enterprise-wide subscriptions.

  2. The Scottish Government's Rural and Environmental Science and Analytical Services Strategic Research Programme

    NASA Astrophysics Data System (ADS)

    Dawson, Lorna; Bestwick, Charles

    2013-04-01

    The Strategic Research Programme focuses on the delivery of outputs and outcomes within the major policy agenda areas of climate change, land use and food security, and on impacting the 'Wealthier', 'Healthier' and 'Greener' strategic objectives of the Scottish Government. The research is delivered through two programmes, 'Environmental Change' and 'Food, Land and People', the core strength of which is the collaboration between the Scottish Government's Main Research Providers: The James Hutton Institute, the Moredun Research Institute, the Rowett Institute of Nutrition and Health (University of Aberdeen), Scotland's Rural College, Biomathematics and Statistics Scotland, and the Royal Botanic Garden Edinburgh. The research actively seeks to inform and be informed by stakeholders from policy, farming, land use, water and energy supply, food production and manufacturing, non-governmental organisations, voluntary organisations, community groups and the general public. This presentation will provide an overview of the programme's interdisciplinary research, through examples from across the programme's themes. The examples will illustrate impact within the Strategic Programme's priorities of supporting policy and practice, contributing to economic growth and innovation, enhancing collaborative and multidisciplinary research, growing scientific resilience and delivering scientific excellence. http://www.scotland.gov.uk/Topics/Research/About/EBAR/StrategicResearch/future-research-strategy/Themes/ http://www.knowledgescotland.org/news.php?article_id=295

  3. "Blogs" and "wikis" are valuable software tools for communication within research groups.

    PubMed

    Sauer, Igor M; Bialek, Dominik; Efimova, Ekaterina; Schwartlander, Ruth; Pless, Gesine; Neuhaus, Peter

    2005-01-01

    Appropriate software tools may improve communication and ease access to knowledge for research groups. A weblog is a website which contains periodic, chronologically ordered posts on a common webpage, whereas a wiki is hypertext-based collaborative software that enables documents to be authored collectively using a web browser. Although not primarily intended for use as an intranet-based collaborative knowledge warehouse, both blogs and wikis have the potential to offer all the features of complex and expensive IT solutions. These tools enable team members to share knowledge simply and quickly: the collective knowledge base of the group can be efficiently managed and navigated.

  4. A Query Tool Enabling Clinicians and Researchers to Explore Patient Cohorts.

    PubMed

    Lim Choi Keung, Sarah N; Khan, Omar; Asadipour, Ali; Dereli, Huseyin; Zhao, Lei; Robbins, Tim; Arvanitis, Theodoros N

    2015-01-01

    Due to the increasing amount of health information being gathered and the potential benefit of data reuse, it is now becoming a necessity for tools, which collect and analyse this data, to support integration of heterogeneous datasets, as well as provide intuitive user interfaces, which allow clinicians and researchers to query the data without needing to form complex SQL queries. The West Midlands Query Tool consists of an easy-to-use graph-based GUI, which interacts with a flexible middleware application. It has the main objective of querying heterogeneous data sources for exploring patient cohorts through a query builder and criteria set. PMID:26152952
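
    A minimal sketch of the query-builder idea: translate a criteria set into a parameterized SQL statement so users never write SQL themselves. The table, field names, and operator whitelist are invented for illustration.

    ```python
    # Build "SELECT ... WHERE field op ?" SQL from a criteria dictionary.
    def build_query(criteria: dict) -> tuple:
        clauses, params = [], []
        for field, (op, value) in criteria.items():
            if op not in {"=", "<", ">", "<=", ">="}:   # whitelist operators
                raise ValueError(f"unsupported operator: {op}")
            clauses.append(f"{field} {op} ?")           # field names would also
            params.append(value)                        # need whitelisting in practice
        sql = "SELECT patient_id FROM patients WHERE " + " AND ".join(clauses)
        return sql, params

    criteria = {"age": (">=", 65), "hba1c": (">", 7.5), "sex": ("=", "F")}
    sql, params = build_query(criteria)
    print(sql)     # SELECT patient_id FROM patients WHERE age >= ? AND ...
    print(params)  # [65, 7.5, 'F']
    ```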

  5. Analytical tools in accelerator physics

    SciTech Connect

    Litvinenko, V.N.

    2010-09-01

    This paper is a sub-set of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues at the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description, starting from an arbitrary reference orbit, through explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with the parameterization in action and angle variables. To a large degree they follow the logic developed in Theory of Cyclic Particle Accelerators by A. A. Kolomensky and A. N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature of these notes is the use of matrix functions and the Sylvester formula for calculating the matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of the notes into English. I also included some introductory material following Classical Theory of Fields by L. D. Landau and E. M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the appendices.
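
    For reference, the Sylvester formula mentioned above expresses an analytic function of a diagonalizable matrix A with distinct eigenvalues in closed form, which is what makes matrices of arbitrary elements computable from eigenvalues alone:

    ```latex
    \[
      f(A) \;=\; \sum_{i=1}^{n} f(\lambda_i)\, A_i,
      \qquad
      A_i \;=\; \prod_{j \neq i} \frac{A - \lambda_j I}{\lambda_i - \lambda_j},
    \]
    % The Frobenius covariants A_i are projectors onto the eigenspaces, so,
    % for example, powers A^k or exp(A) of a transfer matrix follow directly
    % from its eigenvalues.
    ```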

  6. Flow-through cross-polarized imaging as a new tool to overcome the analytical sensitivity challenges of a low-dose crystalline compound in a lipid matrix.

    PubMed

    Adler, Camille; Schönenberger, Monica; Teleki, Alexandra; Leuenberger, Bruno; Kuentz, Martin

    2015-11-10

    Assessing the physical state of a low-dose active compound in a solid lipid or polymer matrix is analytically challenging, especially if the matrix exhibits some crystallinity. The aim of this study was first to compare the ability of current methods to detect the presence of a crystalline model compound in lipid matrices. Subsequently, a new technique was introduced and evaluated because of sensitivity issues that were encountered with current methods. The new technique is a flow-through version of cross-polarized imaging in transmission mode. The tested lipid-based solid dispersions (SDs) consisted of β-carotene (BC) as a model compound, and of Gelucire 50/13 or Geleol mono- and diglycerides as lipid matrices. The solid dispersions were analyzed by (hyper) differential scanning calorimetry (DSC), X-ray powder diffraction (XRPD), and microscopic techniques including atomic force microscopy (AFM). DSC and XRPD could analyze crystalline BC at concentrations as low as 3% (w/w) in the formulations. However, with microscopic techniques crystalline particles were detected at significantly lower concentrations of even 0.5% (w/w) BC. A flow-through cross-polarized imaging technique was introduced that combines the advantage of analyzing a larger sample size with high sensitivity of microscopy. Crystals were detected easily in samples containing even less than 0.2% (w/w) BC. Moreover, the new tool enabled approximation of the kinetic BC solubility in the crystalline lipid matrices. As a conclusion, the flow-through cross-polarized imaging technique has the potential to become an indispensable tool for characterizing low-dose crystalline compounds in a lipid or polymer matrix of solid dispersions.

  7. DataUp 2.0: Improving On a Tool For Helping Researchers Archive, Manage, and Share Their Tabular Data

    NASA Astrophysics Data System (ADS)

    Strasser, C.; Borda, S.; Cruse, P.; Kunze, J.

    2013-12-01

    There are many barriers to data management and sharing among earth and environmental scientists; among the most significant are a lack of knowledge about best practices for data management, metadata standards, or appropriate data repositories for archiving and sharing data. Last year we developed an open source web application, DataUp, to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. With funding from the NSF via a supplemental grant to the DataONE project, we are working to improve upon DataUp. Our main goal for DataUp 2.0 is to ensure organizations and repositories are able to adopt and adapt DataUp to meet their unique needs, including connecting to analytical tools, adding new metadata schema, and expanding the list of connected data repositories. DataUp is a collaborative project between the California Digital Library, DataONE, the San Diego Supercomputing Center, and Microsoft Research Connections.
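
    A minimal sketch of the first two DataUp steps, checking that a file parses as a consistent (rectangular) CSV table and drafting a skeletal metadata record; the metadata field names are illustrative, not DataUp's actual schema.

    ```python
    import csv

    def check_csv(path: str) -> bool:
        """True if the file parses as CSV with a consistent column count."""
        with open(path, newline="") as f:
            rows = list(csv.reader(f))
        return bool(rows) and all(len(r) == len(rows[0]) for r in rows)

    def draft_metadata(path: str, creator: str, title: str) -> dict:
        return {"title": title, "creator": creator, "file": path,
                "format": "text/csv", "identifier": None}  # DOI assigned on deposit

    # Tiny demonstration file so the sketch runs end to end.
    with open("samples.csv", "w", newline="") as f:
        f.write("site,date,ph\nA,2013-06-01,7.2\nB,2013-06-01,6.9\n")

    if check_csv("samples.csv"):
        print(draft_metadata("samples.csv", "A. Researcher", "Stream chemistry 2013"))
    ```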

  8. Analytics for Metabolic Engineering.

    PubMed

    Petzold, Christopher J; Chan, Leanne Jade G; Nhan, Melissa; Adams, Paul D

    2015-01-01

    Realizing the promise of metabolic engineering has been slowed by challenges related to moving beyond proof-of-concept examples to robust and economically viable systems. Key to advancing metabolic engineering beyond trial-and-error research is access to parts with well-defined performance metrics that can be readily applied in vastly different contexts with predictable effects. As the field now stands, research depends greatly on analytical tools that assay target molecules, transcripts, proteins, and metabolites across different hosts and pathways. Screening technologies yield specific information for many thousands of strain variants, while deep omics analysis provides a systems-level view of the cell factory. Efforts focused on a combination of these analyses yield quantitative information of dynamic processes between parts and the host chassis that drive the next engineering steps. Overall, the data generated from these types of assays aid better decision-making at the design and strain construction stages to speed progress in metabolic engineering research.

  9. Integrated Decision-Making Tool to Develop Spent Fuel Strategies for Research Reactors

    SciTech Connect

    Beatty, Randy L; Harrison, Thomas J

    2016-01-01

    IAEA Member States operating or having previously operated a research reactor are responsible for the safe and sustainable management and disposal of the associated radioactive waste, including research reactor spent nuclear fuel (RRSNF). This includes the safe disposal of RRSNF or the corresponding equivalent waste returned after spent fuel reprocessing. One key challenge to developing general recommendations lies in the diversity of spent fuel types, locations, and national or regional circumstances, rather than in mass or volume alone. This is especially true given that RRSNF inventories are relatively small, and research reactors are rarely operated at a power level or duration typical of commercial power plants. Presently, many countries lack an effective long-term policy for managing RRSNF. This paper presents results of the International Atomic Energy Agency (IAEA) Coordinated Research Project (CRP) #T33001 on Options and Technologies for Managing the Back End of the Research Reactor Nuclear Fuel Cycle, which includes an integrated decision-making tool called BRIDE (Back-end Research reactor Integrated Decision Evaluation). BRIDE is a multi-attribute decision-making tool that combines the total estimated cost of each life-cycle scenario with non-economic factors such as public acceptance and technical maturity, and ranks candidate back-end scenarios specific to a Member State's situation in order to develop a strategic plan with a preferred or recommended option for managing spent fuel from research reactors.
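
    A hedged sketch of the multi-attribute ranking idea behind such a tool, combining a cost score with scored non-economic factors in a weighted sum; the scenarios, scores, and weights below are all invented, not BRIDE's data.

    ```python
    # Candidate back-end scenarios with cost and non-economic scores in [0, 1].
    scenarios = {
        "return_for_reprocessing": {"cost_MUSD": 45, "acceptance": 0.6, "maturity": 0.9},
        "domestic_interim_storage": {"cost_MUSD": 30, "acceptance": 0.4, "maturity": 0.7},
        "direct_geologic_disposal": {"cost_MUSD": 60, "acceptance": 0.7, "maturity": 0.5},
    }
    weights = {"cost": 0.5, "acceptance": 0.3, "maturity": 0.2}
    max_cost = max(s["cost_MUSD"] for s in scenarios.values())

    def score(s: dict) -> float:
        cost_score = 1.0 - s["cost_MUSD"] / max_cost   # cheaper scores higher
        return (weights["cost"] * cost_score
                + weights["acceptance"] * s["acceptance"]
                + weights["maturity"] * s["maturity"])

    for name in sorted(scenarios, key=lambda n: score(scenarios[n]), reverse=True):
        print(f"{name}: {score(scenarios[name]):.3f}")
    ```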

  10. About Skinner and time: behavior-analytic contributions to research on animal timing.

    PubMed

    Lejeune, Helga; Richelle, Marc; Wearden, J H

    2006-01-01

    The article discusses two important influences of B. F. Skinner, and later workers in the behavior-analytic tradition, on the study of animal timing. The first influence is methodological, and is traced from the invention of schedules imposing temporal constraints or periodicities on animals in The Behavior of Organisms, through the rate differentiation procedures of Schedules of Reinforcement, to modern temporal psychophysics in animals. The second influence has been the development of accounts of animal timing that have tried to avoid reference to internal processes of a cognitive sort, in particular internal clock mechanisms. Skinner's early discussion of temporal control is first reviewed, and then three recent theories are discussed and evaluated: Killeen & Fetterman's (1988) Behavioral Theory of Timing; Machado's (1997) Learning to Time; and Dragoi, Staddon, Palmer, & Buhusi's (2003) Adaptive Timer Model. PMID:16602380

  11. Conceptual framework for outcomes research studies of hepatitis C: an analytical review.

    PubMed

    Sbarigia, Urbano; Denee, Tom R; Turner, Norris G; Wan, George J; Morrison, Alan; Kaufman, Anna S; Rice, Gary; Dusheiko, Geoffrey M

    2016-01-01

    Hepatitis C virus infection is one of the main causes of chronic liver disease worldwide. Until recently, the standard antiviral regimen for hepatitis C was a combination of an interferon derivative and ribavirin, but a plethora of new antiviral drugs is becoming available. While these new drugs have shown great efficacy in clinical trials, observational studies are needed to determine their effectiveness in clinical practice. Previous observational studies have shown that multiple factors, besides the drug regimen, affect patient outcomes in clinical practice. Here, we provide an analytical review of published outcomes studies of the management of hepatitis C virus infection. A conceptual framework defines the relationships between four categories of variables: health care system structure, patient characteristics, process-of-care, and patient outcomes. This framework can provide a starting point for outcomes studies addressing the use and effectiveness of new antiviral drug treatments. PMID:27313473

  13. About Skinner and Time: Behavior-Analytic Contributions to Research on Animal Timing

    PubMed Central

    Lejeune, Helga; Richelle, Marc; Wearden, J.H

    2006-01-01

    The article discusses two important influences of B. F. Skinner, and later workers in the behavior-analytic tradition, on the study of animal timing. The first influence is methodological, and is traced from the invention of schedules imposing temporal constraints or periodicities on animals in The Behavior of Organisms, through the rate differentiation procedures of Schedules of Reinforcement, to modern temporal psychophysics in animals. The second influence has been the development of accounts of animal timing that have tried to avoid reference to internal processes of a cognitive sort, in particular internal clock mechanisms. Skinner's early discussion of temporal control is first reviewed, and then three recent theories—Killeen & Fetterman's (1988) Behavioral Theory of Timing; Machado's (1997) Learning to Time; and Dragoi, Staddon, Palmer, & Buhusi's (2003) Adaptive Timer Model—are discussed and evaluated. PMID:16602380

  14. Conceptual framework for outcomes research studies of hepatitis C: an analytical review

    PubMed Central

    Sbarigia, Urbano; Denee, Tom R; Turner, Norris G; Wan, George J; Morrison, Alan; Kaufman, Anna S; Rice, Gary; Dusheiko, Geoffrey M

    2016-01-01

    Hepatitis C virus infection is one of the main causes of chronic liver disease worldwide. Until recently, the standard antiviral regimen for hepatitis C was a combination of an interferon derivative and ribavirin, but a plethora of new antiviral drugs is becoming available. While these new drugs have shown great efficacy in clinical trials, observational studies are needed to determine their effectiveness in clinical practice. Previous observational studies have shown that multiple factors, besides the drug regimen, affect patient outcomes in clinical practice. Here, we provide an analytical review of published outcomes studies of the management of hepatitis C virus infection. A conceptual framework defines the relationships between four categories of variables: health care system structure, patient characteristics, process-of-care, and patient outcomes. This framework can provide a starting point for outcomes studies addressing the use and effectiveness of new antiviral drug treatments. PMID:27313473

  15. Comparing probability and non-probability sampling methods in Ecstasy research: implications for the internet as a research tool.

    PubMed

    Miller, Peter G; Johnston, Jennifer; Dunn, Matthew; Fry, Craig L; Degenhardt, Louisa

    2010-02-01

    The use of Ecstasy and related drugs (ERD) has increasingly been the focus of epidemiological and other public health-related research. One of the more promising methods is the use of the Internet as a recruitment and survey tool. However, methodological concerns and questions about representativeness remain. Three samples of ERD users in Melbourne, Australia, surveyed in 2004 are compared in terms of a number of key demographic and drug use variables. The Internet, face-to-face, and probability sampling methods appear to access similar but not identical groups of ERD users. Implications and limitations of the study are noted and future research is recommended.

  16. A Research Analytics Framework-Supported Recommendation Approach for Supervisor Selection

    ERIC Educational Resources Information Center

    Zhang, Mingyu; Ma, Jian; Liu, Zhiying; Sun, Jianshan; Silva, Thushari

    2016-01-01

    Identifying a suitable supervisor for a new research student is vitally important for his or her academic career. Current information overload and information disorientation have posed significant challenges for new students. Existing research for supervisor identification focuses on quality assessment of candidates, but ignores indirect relevance…
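
    As a generic illustration of content-based matching (not the authors' research-analytics framework), the sketch below scores each candidate supervisor by cosine similarity between keyword-count vectors built from a student's stated interests and supervisors' publication titles; all names and keywords are invented.

      # Generic content-based matching sketch; data and names are invented.
      from collections import Counter
      import math

      def cosine(a, b):
          """Cosine similarity between two keyword-count vectors."""
          dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
          na = math.sqrt(sum(v * v for v in a.values()))
          nb = math.sqrt(sum(v * v for v in b.values()))
          return dot / (na * nb) if na and nb else 0.0

      student = Counter("recommender systems decision support".split())
      supervisors = {
          "Prof. A": Counter("decision support systems group decision".split()),
          "Prof. B": Counter("protein folding molecular dynamics".split()),
      }
      for name in sorted(supervisors, key=lambda n: cosine(student, supervisors[n]), reverse=True):
          print(name, round(cosine(student, supervisors[name]), 2))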

  17. Effect of Percent Relative Humidity, Moisture Content, and Compression Force on Light-Induced Fluorescence (LIF) Response as a Process Analytical Tool.

    PubMed

    Shah, Ishan G; Stagner, William C

    2016-08-01

    The effect of percent relative humidity (16-84% RH), moisture content (4.2-6.5% w/w MC), and compression force (4.9-44.1 kN CF) on the light-induced fluorescence (LIF) response of 10% w/w active pharmaceutical ingredient (API) compacts is reported. The fluorescent response was evaluated using two separate central composite designs of experiments. The effect of % RH and CF on the LIF signal was highly significant, with an adjusted R² = 0.9436 and p < 0.0001. Percent relative humidity (p = 0.0022), CF (p < 0.0001), and % RH² (p = 0.0237) were statistically significant factors affecting the LIF response. The effects of MC and CF on LIF response were also statistically significant, with a p value < 0.0001 and an adjusted R² of 0.9874. The LIF response was highly impacted by MC (p < 0.0001), CF (p < 0.0001), and MC² (p = 0.0022). At 10% w/w API, increased % RH, MC, and CF led to a nonlinear decrease in LIF response. The derived quadratic model equations explained more than 94% of the variability in the data. Awareness of these effects on LIF response is critical when implementing LIF as a process analytical tool. PMID:27435199
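
    As an illustration of the kind of quadratic response-surface model reported above, the sketch below fits LIF = b0 + b1*MC + b2*CF + b3*MC² by least squares on synthetic data and computes the adjusted R²; the data and coefficients are invented, not the study's measurements.

      # Quadratic response-surface fit on synthetic data (all values invented).
      import numpy as np

      rng = np.random.default_rng(0)
      mc = rng.uniform(4.2, 6.5, 30)    # moisture content, % w/w
      cf = rng.uniform(4.9, 44.1, 30)   # compression force, kN
      lif = 100 - 6.0 * mc - 0.8 * cf - 0.5 * mc**2 + rng.normal(0, 1, 30)

      # Design matrix: intercept, linear MC and CF terms, quadratic MC term.
      X = np.column_stack([np.ones_like(mc), mc, cf, mc**2])
      beta, *_ = np.linalg.lstsq(X, lif, rcond=None)

      pred = X @ beta
      r2 = 1 - np.sum((lif - pred) ** 2) / np.sum((lif - lif.mean()) ** 2)
      n, p = X.shape
      adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p)  # adjust for model size
      print("coefficients:", np.round(beta, 3), "adjusted R^2:", round(adj_r2, 4))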

  18. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    SciTech Connect

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.; Gastelum, Zoe N.; Kreyling, Sean J.; West, Curtis L.

    2014-05-13

    of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will not ever be able to replace traditional arms control verification measures, they do supply unique signatures that can augment existing analysis.

  19. The need for novel informatics tools for integrating and planning research in molecular and cellular cognition

    PubMed Central

    Müller, Klaus-Robert

    2015-01-01

    The sheer volume and complexity of publications in the biological sciences are straining traditional approaches to research planning. Nowhere is this problem more serious than in molecular and cellular cognition, since in this neuroscience field, researchers routinely use approaches and information from a variety of areas in neuroscience and other biology fields. Additionally, the multilevel integration process characteristic of this field involves the establishment of experimental connections between molecular, electrophysiological, behavioral, and even cognitive data. This multidisciplinary integration process requires strategies and approaches that originate in several different fields, which greatly increases the complexity and demands of this process. Although causal assertions, where phenomenon A is thought to contribute or relate to B, are at the center of this integration process and key to research in biology, there are currently no tools to help scientists keep track of the increasingly complex network of causal connections they use when making research decisions. Here, we propose the development of semiautomated graphical and interactive tools to help neuroscientists and other biologists, including those working in molecular and cellular cognition, to track, map, and weight causal evidence in research papers. There is a great need for a concerted effort by biologists, computer scientists, and funding institutions to develop maps of causal information that would aid in integration of research findings and in experiment planning. PMID:26286658
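
    A minimal sketch of the kind of weighted causal-evidence map proposed here: directed edges of the form "A contributes to B", each accumulating per-paper evidence weights. The node names, paper identifiers, and weighting scheme below are hypothetical.

      # Illustrative causal-evidence map; nodes, papers, and weights are invented.
      from collections import defaultdict

      class CausalMap:
          def __init__(self):
              # (cause, effect) -> list of (paper_id, evidence weight)
              self.edges = defaultdict(list)

          def add_evidence(self, cause, effect, paper_id, weight):
              self.edges[(cause, effect)].append((paper_id, weight))

          def total_weight(self, cause, effect):
              return sum(w for _, w in self.edges[(cause, effect)])

      cmap = CausalMap()
      cmap.add_evidence("CREB activation", "LTP maintenance", "paper_001", 0.8)
      cmap.add_evidence("CREB activation", "LTP maintenance", "paper_014", 0.5)
      cmap.add_evidence("LTP maintenance", "memory consolidation", "paper_022", 0.9)
      print(cmap.total_weight("CREB activation", "LTP maintenance"))  # 1.3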

  20. A new research tool for hybrid Bayesian networks using script language

    NASA Astrophysics Data System (ADS)

    Sun, Wei; Park, Cheol Young; Carvalho, Rommel

    2011-06-01

    While continuous variables are increasingly unavoidable in Bayesian networks modeling real-life applications in complex systems, few software tools support them. Popular commercial Bayesian network tools such as Hugin and Netica are either expensive or must discretize continuous variables, and free programs such as BNT and GeNIe/SMILE each have their own advantages and disadvantages. In this paper, we introduce a newly developed Java tool for model construction and inference in hybrid Bayesian networks. Using the representational power of a script language, the tool builds a hybrid model automatically from a well-defined string that follows a specific grammar. Furthermore, it implements several inference algorithms for hybrid Bayesian networks, including the Junction Tree (JT) algorithm for conditional linear Gaussian (CLG) models, and Direct Message Passing (DMP) for general hybrid Bayesian networks with CLG structure. We believe this tool will be useful for researchers in the field.
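
    To make the conditional linear Gaussian (CLG) model class concrete, here is a minimal sketch (not the paper's Java tool or its script grammar) that samples from a tiny hybrid network in which a discrete parent selects the linear-Gaussian parameters of a continuous child; all parameters are invented.

      # Minimal CLG sketch: discrete parent D selects the linear-Gaussian
      # parameters of continuous child Y given continuous parent X.
      import random

      params = {  # hypothetical CPD: D -> (intercept, slope, stddev)
          "low":  (0.0, 1.0, 0.5),
          "high": (2.0, 0.5, 1.0),
      }

      def sample():
          d = random.choice(["low", "high"])  # discrete parent, uniform prior
          x = random.gauss(0.0, 1.0)          # continuous parent X ~ N(0, 1)
          b0, b1, sd = params[d]
          y = random.gauss(b0 + b1 * x, sd)   # Y | X=x, D=d ~ N(b0 + b1*x, sd^2)
          return d, x, y

      print(sample())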

  1. A Runtime Environment for Supporting Research in Resilient HPC System Software & Tools

    SciTech Connect

    Vallee, Geoffroy R; Naughton, III, Thomas J; Boehm, Swen; Engelmann, Christian

    2013-01-01

    The high-performance computing (HPC) community continues to increase the size and complexity of hardware platforms that support advanced scientific workloads. The runtime environment (RTE) is a crucial layer in the software stack for these large-scale systems. The RTE manages the interface between the operating system and the application running in parallel on the machine. The deployment of applications and tools on large-scale HPC systems requires the RTE to manage process creation in a scalable manner, support sparse connectivity, and provide fault tolerance. We have developed a new RTE that provides a basis for building distributed execution environments and developing tools for HPC to aid research in system software and resilience. This paper describes the software architecture of the Scalable runTime Component Infrastructure (STCI), which is intended to provide a complete infrastructure for scalable start-up and management of many processes in large-scale HPC systems. We highlight features of the current implementation, which is provided as a system library that allows developers to easily use and integrate STCI in their tools and/or applications. The motivation for this work has been to support ongoing research activities in fault-tolerance for large-scale systems. We discuss the advantages of the modular framework employed and describe two use cases that demonstrate its capabilities: (i) an alternate runtime for a Message Passing Interface (MPI) stack, and (ii) a distributed control and communication substrate for a fault-injection tool.

  2. Technology insight: tools for research, diagnosis and clinical assessment of treatment in idiopathic inflammatory myopathies.

    PubMed

    Lundberg, Ingrid E; Alexanderson, Helene

    2007-05-01

    Idiopathic inflammatory myopathies, known collectively as myositis, are chronic diseases that cause disability, mainly from muscle weakness, despite the use of immunosuppressive therapies. An improved outcome requires increased knowledge of the key molecular pathways that cause symptoms in muscles and other organs. Technological advances offer promise for improving our understanding of disease mechanisms, and some tools will be helpful in diagnosis and the assessment of therapeutic success. The application of new tools depends on their validation in longitudinal studies using clinical outcome measures combined with assessments of molecular events in affected organs. Clinical outcome measures and definitions of improvement have been developed and validated through the International Myositis Assessment and Clinical Studies collaboration. Some imaging techniques, such as MRI and magnetic resonance spectroscopy of muscles, and high-resolution CT of lungs, can assess changes in local inflammatory activity, among many other aspects of pathology. Changes in protein and gene expression patterns in repeated biopsies from affected organs (muscle, skin and lungs) provide molecular information and allow increasingly precise disease classifications and therapeutic evaluation, but are to date only research tools. This Review focuses on advances in diagnostic and outcome tools and their roles in clinical practice and clinical research in patients with polymyositis and dermatomyositis.

  3. Sport and exercise psychology research and Olympic success: an analytical and correlational investigation.

    PubMed

    Szabo, Attila

    2014-01-01

    The aim of the current inquiry was to identify the national origin of scholars who lead the work in the area of Sport and Exercise Psychology, and to examine whether their research output is connected to the Olympic success of their national athletes. Consequently, the two specialised journals with the highest impact factors in this field were examined for the national affiliations of authors publishing over an 11-year period. Subsequently, the link between national research output and Olympic medals was examined. The results revealed that over 50% of the publications originate from Canada, the U.K. and the U.S.A. National research output in Sport and Exercise Psychology was correlated with the number of Olympic medals; the proportion of shared variance was 42% and 57%, respectively, in the two journals. Nevertheless, it is posited that the observed link is primarily due to other factors that ought to be examined in future research.
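
    Since the proportion of shared variance is the squared correlation coefficient, the reported 42% and 57% correspond to correlations of roughly 0.65 and 0.75:

      # Shared variance = r^2, so r = sqrt(shared variance).
      import math
      print([round(math.sqrt(v), 2) for v in (0.42, 0.57)])  # [0.65, 0.75]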

  4. A New Tool for Identifying Research Standards and Evaluating Research Performance

    ERIC Educational Resources Information Center

    Bacon, Donald R.; Paul, Pallab; Stewart, Kim A.; Mukhopadhyay, Kausiki

    2012-01-01

    Much has been written about the evaluation of faculty research productivity in promotion and tenure decisions, including many articles that seek to determine the rank of various marketing journals. Yet how faculty evaluators combine journal quality, quantity, and author contribution to form judgments of a scholar's performance is unclear. A…

  5. Using Digital Video as a Research Tool: Ethical Issues for Researchers

    ERIC Educational Resources Information Center

    Schuck, Sandy; Kearney, Matthew

    2006-01-01

    Digital video and accompanying editing software are increasingly becoming more accessible for researchers in terms of ease of use and cost. The rich, visually appealing and seductive nature of video-based data can convey a strong sense of direct experience with the phenomena studied (Pea, 1999). However, the ease of selection and editing of…

  6. CRISPR: a versatile tool for both forward and reverse genetics research.

    PubMed

    Gurumurthy, Channabasavaiah B; Grati, M'hamed; Ohtsuka, Masato; Schilit, Samantha L P; Quadros, Rolen M; Liu, Xue Zhong

    2016-09-01

    Human genetics research employs the two opposing approaches of forward and reverse genetics. While forward genetics identifies and links a mutation to an observed disease etiology, reverse genetics induces mutations in model organisms to study their role in disease. In most cases, causality for mutations identified by forward genetics is confirmed by reverse genetics through the development of genetically engineered animal models and an assessment of whether the model can recapitulate the disease. While many technological advances have helped improve these approaches, some gaps still remain. CRISPR/Cas (clustered regularly interspaced short palindromic repeats/CRISPR-associated), which has emerged as a revolutionary genetic engineering tool, holds great promise for closing such gaps. By combining the benefits of forward and reverse genetics, it has dramatically expedited human genetics research. We provide a perspective on the power of CRISPR-based forward and reverse genetics tools in human genetics and discuss its applications using some disease examples. PMID:27384229

  8. Process of formulating USDA's Expanded Flavonoid Database for the Assessment of Dietary intakes: a new tool for epidemiological research.

    PubMed

    Bhagwat, Seema A; Haytowitz, David B; Wasswa-Kintu, Shirley I; Pehrsson, Pamela R

    2015-08-14

    The scientific community continues to be interested in potential links between flavonoid intakes and beneficial health effects associated with certain chronic diseases such as CVD, some cancers and type 2 diabetes. Three separate flavonoid databases (Flavonoids, Isoflavones and Proanthocyanidins) developed by the USDA Agricultural Research Service since 1999 with frequent updates have been used to estimate dietary flavonoid intakes, and investigate their health effects. However, each of these databases contains only a limited number of foods. The USDA has constructed a new Expanded Flavonoids Database for approximately 2900 commonly consumed foods, using analytical values from their existing flavonoid databases (Flavonoid Release 3.1 and Isoflavone Release 2.0) as the foundation to calculate values for all the twenty-nine flavonoid compounds included in these two databases. Thus, the new database provides full flavonoid profiles for twenty-nine predominant dietary flavonoid compounds for every food in the database. Original analytical values in Flavonoid Release 3.1 and Isoflavone Release 2.0 for corresponding foods were retained in the newly constructed database. Proanthocyanidins are not included in the expanded database. The process of formulating the new database includes various calculation techniques. This article describes the process of populating values for the twenty-nine flavonoid compounds for every food in the dataset, along with challenges encountered and resolutions suggested. The new expanded flavonoid database released on the Nutrient Data Laboratory's website will provide uniformity in estimations of flavonoid content in foods and will be a valuable tool for epidemiological studies to assess dietary intakes.
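
    One common calculation technique in food-composition work is an ingredient-weighted estimate for multi-ingredient foods: the food's profile is the sum, over ingredients, of each ingredient's profile scaled by its weight fraction. The sketch below illustrates the arithmetic with invented ingredient values; it is not USDA's documented procedure.

      # Ingredient-weighted flavonoid profile for a mixed food (values invented).
      ingredient_profiles = {  # mg flavonoid per 100 g ingredient
          "onion":  {"quercetin": 21.0, "kaempferol": 0.6},
          "tomato": {"quercetin": 0.6,  "kaempferol": 0.1},
      }
      recipe = {"onion": 0.30, "tomato": 0.70}  # weight fractions summing to 1

      profile = {}
      for ingredient, fraction in recipe.items():
          for compound, mg_per_100g in ingredient_profiles[ingredient].items():
              profile[compound] = profile.get(compound, 0.0) + fraction * mg_per_100g

      print({k: round(v, 2) for k, v in profile.items()})
      # {'quercetin': 6.72, 'kaempferol': 0.25}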

  9. Open Virtual Worlds as Pedagogical Research Tools: Learning from the Schome Park Programme

    NASA Astrophysics Data System (ADS)

    Twining, Peter; Peachey, Anna

    This paper introduces the term Open Virtual Worlds and argues that they are ‘unclaimed educational spaces’, which provide a valuable tool for researching pedagogy. Having explored these claims, the way in which the Teen Second Life® virtual world was used for pedagogical experimentation in the initial phases of the Schome Park Programme is described. Four sets of pedagogical dimensions that emerged are presented and illustrated with examples from the Schome Park Programme.

  10. Research on analytical model and design formulas of permanent magnetic bearings based on Halbach array with arbitrary segmented magnetized angle

    NASA Astrophysics Data System (ADS)

    Wang, Nianxian; Wang, Dongxiong; Chen, Kuisheng; Wu, Huachun

    2016-07-01

    The bearing capacity of permanent magnetic bearings (PMBs) can be improved efficiently by using Halbach array magnetization. However, an analytical model of Halbach array PMBs with arbitrary segmented magnetization angle has not previously been developed, and the application of such PMBs has been limited by the absence of analytical models and design formulas. In this research, Halbach array PMBs with arbitrary segmented magnetization angle are studied. The magnetization model of the bearings is established, and the magnetic field distribution of the permanent magnet array is derived using the scalar magnetic potential model. On this basis, the bearing force and bearing stiffness models of the PMBs are established using the virtual displacement method. The influence of the number of magnetic ring pairs per cycle and of the structural parameters of the PMBs on the maximal bearing capacity and support stiffness characteristics is studied, and reference factors for the PMB design process are given. Finally, the theoretical model and the conclusions are verified by finite element analysis.
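
    In the virtual displacement approach, the bearing force follows from differentiating the field energy with respect to a small displacement, and the stiffness from differentiating the force. A minimal numerical sketch of the latter step, using a made-up force law in place of the paper's scalar-magnetic-potential solution:

      # Stiffness as K = -dF/dx via a central finite difference.
      # The force law below is a made-up placeholder, not the paper's model.

      def bearing_force(x):
          """Hypothetical restoring force (N) vs. radial displacement x (m)."""
          return -1.0e6 * x - 4.0e9 * x**3

      def stiffness(force, x, h=1e-6):
          return -(force(x + h) - force(x - h)) / (2 * h)

      print(f"K at x = 0:      {stiffness(bearing_force, 0.0):.3e} N/m")
      print(f"K at x = 0.5 mm: {stiffness(bearing_force, 5e-4):.3e} N/m")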

  11. DataUp: A tool to help researchers describe and share tabular data

    PubMed Central

    Strasser, Carly; Kunze, John; Abrams, Stephen; Cruse, Patricia

    2014-01-01

    Scientific datasets have immeasurable value, but they lose their value over time without proper documentation, long-term storage, and easy discovery and access. Across disciplines as diverse as astronomy, demography, archeology, and ecology, large numbers of small heterogeneous datasets (i.e., the long tail of data) are especially at risk unless they are properly documented, saved, and shared. One unifying factor for many of these at-risk datasets is that they reside in spreadsheets. In response to this need, the California Digital Library (CDL) partnered with Microsoft Research Connections and the Gordon and Betty Moore Foundation to create the DataUp data management tool for Microsoft Excel. Many researchers creating these small, heterogeneous datasets use Excel at some point in their data collection and analysis workflow, so we were interested in developing a data management tool that fits easily into those work flows and minimizes the learning curve for researchers. The DataUp project began in August 2011. We first formally assessed the needs of researchers by conducting surveys and interviews of our target research groups: earth, environmental, and ecological scientists. We found that, on average, researchers had very poor data management practices, were not aware of data centers or metadata standards, and did not understand the benefits of data management or sharing. Based on our survey results, we composed a list of desirable components and requirements and solicited feedback from the community to prioritize potential features of the DataUp tool. These requirements were then relayed to the software developers, and DataUp was successfully launched in October 2012. PMID:25653834

  13. [Research and application progress of near infrared spectroscopy analytical technology in China in the past five years].

    PubMed

    Chu, Xiao-Li; Lu, Wan-Zhen

    2014-10-01

    In the past decade, near infrared spectroscopy (NIR) has expanded rapidly and been applied widely in many fields in China. The recent progress in research and application of NIR analytical technology in China, especially in the past five years, is reviewed. This includes hardware and software R&D, chemometric algorithms and experimental methods research, and quantitative and qualitative applications in typical fields such as food, agriculture, pharmaceuticals, petrochemicals, forestry, and medical diagnosis. A total of 209 references are cited, mainly published in national journals, professional magazines, and book chapters. Development trends in near infrared spectroscopy and strategies to further promote its innovation and development in China in the near future are put forward and discussed. PMID:25739193

  14. Easily configured real-time CPOE Pick Off Tool supporting focused clinical research and quality improvement

    PubMed Central

    Rosenbaum, Benjamin P; Silkin, Nikolay; Miller, Randolph A

    2014-01-01

    Real-time alerting systems typically warn providers about abnormal laboratory results or medication interactions. For more complex tasks, institutions create site-wide ‘data warehouses’ to support quality audits and longitudinal research. Sophisticated systems like i2b2 or Stanford's STRIDE utilize data warehouses to identify cohorts for research and quality monitoring. However, substantial resources are required to install and maintain such systems. For more modest goals, an organization desiring merely to identify patients with ‘isolation’ orders, or to determine patients’ eligibility for clinical trials, may adopt a simpler, limited approach based on processing the output of one clinical system, and not a data warehouse. We describe a limited, order-entry-based, real-time ‘pick off’ tool, utilizing public domain software (PHP, MySQL). Through a web interface the tool assists users in constructing complex order-related queries and auto-generates corresponding database queries that can be executed at recurring intervals. We describe successful application of the tool for research and quality monitoring. PMID:24287172
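
    The core pattern, turning a user-built order filter into a single parameterized query that a scheduler runs at recurring intervals, can be sketched as follows (in Python for brevity; the actual tool is a PHP/MySQL application, and the table and column names here are hypothetical):

      # Build one parameterized SQL query from a simple filter spec.
      # Table and column names are hypothetical; in production the column
      # names would also be validated against a whitelist.

      def build_query(filters):
          """filters: list of (column, operator, value); returns (sql, params)."""
          allowed_ops = {"=", "<>", ">", "<", "LIKE"}
          clauses, params = [], []
          for column, op, value in filters:
              if op not in allowed_ops:
                  raise ValueError(f"operator not allowed: {op}")
              clauses.append(f"{column} {op} %s")  # placeholders keep values out of the SQL text
              params.append(value)
          sql = "SELECT patient_id, order_time FROM orders WHERE " + " AND ".join(clauses)
          return sql, params

      sql, params = build_query([
          ("order_type", "=", "isolation"),
          ("order_time", ">", "2014-01-01"),
      ])
      print(sql)     # the real tool executes such queries at recurring intervals
      print(params)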

  15. CCMC: Serving research and space weather communities with unique space weather services, innovative tools and resources

    NASA Astrophysics Data System (ADS)

    Zheng, Yihua; Kuznetsova, Maria M.; Pulkkinen, Antti; Maddox, Marlo

    2015-04-01

    With the addition of the Space Weather Research Center (a sub-team within CCMC) in 2010 to address NASA’s own space weather needs, CCMC has become a unique entity that not only facilitates research by providing access to state-of-the-art space science and space weather models, but also plays a critical role in providing unique space weather services to NASA robotic missions, developing innovative tools and transitioning research to operations via user feedback. With scientists, forecasters and software developers working together within one team, through close and direct connection with space weather customers and a trusted relationship with model developers, CCMC is flexible, nimble and effective in meeting customer needs. In this presentation, we highlight a few unique aspects of CCMC/SWRC’s space weather services, such as addressing space weather throughout the solar system, pushing the frontier of space weather forecasting via the ensemble approach, providing direct personnel and tool support for spacecraft anomaly resolution, prompting development of multi-purpose tools and knowledge bases, and educating and engaging the next generation of space weather scientists.

  16. Balancing research and funding using value of information and portfolio tools for nanomaterial risk classification

    NASA Astrophysics Data System (ADS)

    Bates, Matthew E.; Keisler, Jeffrey M.; Zussblatt, Niels P.; Plourde, Kenton J.; Wender, Ben A.; Linkov, Igor

    2016-02-01

    Risk research for nanomaterials is currently prioritized by means of expert workshops and other deliberative processes. However, analytical techniques that quantify and compare alternative research investments are increasingly recommended. Here, we apply value of information and portfolio decision analysis—methods commonly applied in financial and operations management—to prioritize risk research for multiwalled carbon nanotubes and nanoparticulate silver and titanium dioxide. We modify the widely accepted CB Nanotool hazard evaluation framework, which combines nano- and bulk-material properties into a hazard score, to operate probabilistically with uncertain inputs. Literature is reviewed to develop uncertain estimates for each input parameter, and a Monte Carlo simulation is applied to assess how different research strategies can improve hazard classification. The relative cost of each research experiment is elicited from experts, which enables identification of efficient research portfolios—combinations of experiments that lead to the greatest improvement in hazard classification at the lowest cost. Nanoparticle shape, diameter, solubility and surface reactivity were most frequently identified within efficient portfolios in our results.
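
    A stripped-down sketch of the Monte Carlo step described above: propagate uncertain inputs through a hazard score and compare classifications before and after a hypothetical experiment pins down one input. The score function, parameter ranges, threshold, and "measured" value are all invented, not CB Nanotool's.

      # Toy Monte Carlo: how often does resolving one uncertain input change
      # the hazard classification? All numbers are invented for illustration.
      import random

      def hazard_score(shape, diameter_nm, solubility, reactivity):
          return 40 * reactivity + 30 * (1 - solubility) + 20 * shape + 10 * (diameter_nm < 50)

      def classify(score, threshold=55):
          return "high" if score >= threshold else "medium"

      def draw():
          return dict(shape=random.uniform(0, 1), diameter_nm=random.uniform(10, 100),
                      solubility=random.uniform(0, 1), reactivity=random.uniform(0, 1))

      random.seed(1)
      n = 10_000
      baseline = [classify(hazard_score(**draw())) for _ in range(n)]

      resolved = []
      for _ in range(n):
          p = draw()
          p["reactivity"] = 0.3  # hypothetical measured value from new research
          resolved.append(classify(hazard_score(**p)))

      print("high-hazard rate before/after resolving reactivity:",
            baseline.count("high") / n, resolved.count("high") / n)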

  18. The virtual supermarket: An innovative research tool to study consumer food purchasing behaviour

    PubMed Central

    2011-01-01

    Background: Economic interventions in the food environment are expected to effectively promote healthier food choices. However, before introducing them on a large scale, it is important to gain insight into the effectiveness of economic interventions and people's genuine reactions to price changes. Nonetheless, because of complex implementation issues, studies on price interventions are virtually non-existent. This is especially true for experiments undertaken in a retail setting. We have developed a research tool to study the effects of retail price interventions in a virtual-reality setting: the Virtual Supermarket. This paper aims to inform researchers about the features and utilization of this new software application. Results: The Virtual Supermarket is a Dutch-developed three-dimensional software application in which study participants can shop in a manner comparable to a real supermarket. The tool can be used to study several food pricing and labelling strategies. The application base can be used to build future extensions and could be translated into, for example, an English-language version. The Virtual Supermarket contains a front-end which is seen by the participants, and a back-end that enables researchers to easily manipulate research conditions. The application keeps track of time spent shopping, number of products purchased, shopping budget, total expenditures and answers on configurable questionnaires. All data is digitally stored and automatically sent to a web server. A pilot study among Dutch consumers (n = 66) revealed that the application accurately collected and stored all data. Results from participant feedback revealed that 83% of the respondents considered the Virtual Supermarket easy to understand and 79% found that their virtual grocery purchases resembled their regular groceries. Conclusions: The Virtual Supermarket is an innovative research tool with great potential to assist in gaining insight into food purchasing behaviour. The
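
    A minimal sketch of the kind of session record such a tool might store and ship to a web server; the field names, values, and JSON serialization below are hypothetical, not the application's actual schema.

      # Hypothetical shopping-session record serialized for upload.
      import json
      from dataclasses import dataclass, asdict

      @dataclass
      class ShoppingSession:
          participant_id: str
          seconds_spent: int
          products_purchased: int
          budget_eur: float
          total_spent_eur: float
          questionnaire: dict

      session = ShoppingSession("p042", 780, 23, 60.0, 57.35, {"q1": "agree"})
      payload = json.dumps(asdict(session))  # what would be POSTed to the server
      print(payload)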

  19. Metaphors and Drawings as Research Tools of Head Teachers' Perceptions on Their Management and Leadership Roles and Responsibilities

    ERIC Educational Resources Information Center

    Argyropoulou, Eleftheria; Hatira, Kalliopi

    2014-01-01

    This article introduces an alternative qualitative research tool: metaphor and drawing, as projections of personality features, to explore underlying concepts and values, thoughts and beliefs, fears and hesitations, aspirations and ambitions of the research subjects. These two projective tools are used to explore Greek state kindergarten head…

  20. Tools for Citizen-Science Recruitment and Student Engagement in Your Research and in Your Classroom

    PubMed Central

    Council, Sarah E.; Horvath, Julie E.

    2016-01-01

    The field of citizen science is exploding and offers not only a great way to engage the general public in science literacy through primary research, but also an avenue for teaching professionals to engage their students in meaningful community research experiences. Though this field is expanding, there are many hurdles for researchers and participants, as well as challenges for teaching professionals who want to engage their students. Here we highlight one of our projects that engaged many citizens in Raleigh, NC, and across the world, and we use this as a case study to highlight ways to engage citizens in all kinds of research. Through the use of numerous tools to engage the public, we gathered citizen scientists to study skin microbes and their associated odors, and we offer valuable ideas for teachers to tap into resources for their own students and potential citizen-science projects. PMID:27047587