Science.gov

Sample records for analytical tool research

  1. Environmental equity research: review with focus on outdoor air pollution research methods and analytic tools.

    PubMed

    Miao, Qun; Chen, Dongmei; Buzzelli, Michael; Aronson, Kristan J

    2015-01-01

    The objective of this study was to review environmental equity research on outdoor air pollution, and specifically the methods and analytic tools used in that research, published in English, with the aim of recommending the best methods and analytic tools. English language publications from 2000 to 2012 were identified in Google Scholar, Ovid MEDLINE, and PubMed. Research methodologies and results were reviewed and potential deficiencies and knowledge gaps identified. The publications show that exposure to outdoor air pollution differs by social factors, but findings are inconsistent in Canada. In terms of study designs, most were small and ecological and therefore prone to the ecological fallacy. Newer tools such as geographic information systems, modeling, and biomarkers offer improved precision in exposure measurement. Higher-quality research using large, individual-based samples and more precise analytic tools is needed to provide better evidence for policy-making to reduce environmental inequities.

  2. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    ERIC Educational Resources Information Center

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  3. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software
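
    For readers who want to see the shape of such a provenance record, below is a minimal conceptual sketch in Python: it checksums the inputs and outputs of one script execution and emits a PROV-style trace. This illustrates the idea only; the actual `recordr` and `matlab-dataone` APIs differ, and all names in the sketch are invented.

```python
# Conceptual sketch of automated provenance capture, in the spirit of the
# 'recordr' (R) and 'matlab-dataone' (Matlab) libraries described above.
# This is NOT their actual API; every name here is illustrative.
import hashlib
import json
import platform
import time
from pathlib import Path

def file_checksum(path: Path) -> str:
    """SHA-256 of a file, used to version inputs and outputs immutably."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def record_run(script: str, inputs: list, outputs: list) -> dict:
    """Capture a PROV-style trace for one execution of an analysis script."""
    trace = {
        "activity": {"script": script, "started": time.time()},
        "agent": {"platform": platform.platform(),
                  "python": platform.python_version()},
        "used": [{"file": str(p), "sha256": file_checksum(p)} for p in inputs],
    }
    # ... in a real tool the script would be executed (or wrapped) here ...
    trace["generated"] = [{"file": str(p), "sha256": file_checksum(p)}
                          for p in outputs]
    return trace

if __name__ == "__main__":
    Path("raw.csv").write_text("x,y\n1,2\n")          # stand-in input
    Path("figure.png").write_bytes(b"placeholder")    # stand-in output
    print(json.dumps(record_run("analyze.py",
                                [Path("raw.csv")],
                                [Path("figure.png")]), indent=2))
```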

  4. Narrative health research: exploring big and small stories as analytical tools.

    PubMed

    Sools, Anneke

    2013-01-01

    In qualitative health research many researchers use a narrative approach to study lay health concepts and experiences. In this article, I explore the theoretical linkages between the concepts narrative and health, which are used in a variety of ways. The article builds on previous work that conceptualizes health as a multidimensional, positive, dynamic and morally dilemmatic yet meaningful practice. I compare big and small stories as analytical tools to explore what narrative has to offer to address, nuance and complicate five challenges in narrative health research: (1) the interplay between health and other life issues; (2) the taken-for-granted yet rare character of the experience of good health; (3) coherence or incoherence as norms for good health; (4) temporal issues; (5) health as moral practice. In this article, I do not present research findings per se; rather, I use two interview excerpts for methodological and theoretical reflections. These interview excerpts are derived from a health promotion study in the Netherlands, which was partly based on peer-to-peer interviews. I conclude with a proposal to advance narrative health research by sensitizing researchers to different usages of both narrative and health, and the interrelationship(s) between the two.

  5. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research

    PubMed Central

    Alaidi, Osama; Rames, Matthew J.

    2016-01-01

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. PMID:26087941
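
    As background for the reconstruction step these reviews describe, the sketch below shows 2D tomographic reconstruction from a simulated tilt series using scikit-image's radon/iradon (filtered back-projection). Real electron tomography reconstructs 3D volumes, aligns the tilt series, and contends with noise and a missing wedge; the phantom and the +/- 70 degree tilt range below are illustrative only.

```python
# Minimal 2D sketch of reconstruction from a tilt series, the core operation
# behind electron tomography. Requires numpy and scikit-image (>= 0.19 for
# the filter_name keyword).
import numpy as np
from skimage.transform import radon, iradon

# Synthetic "specimen slice": a disk with an off-centre dense inclusion.
n = 128
yy, xx = np.mgrid[:n, :n] - n / 2
phantom = ((xx**2 + yy**2) < (0.4 * n) ** 2).astype(float)
phantom += (((xx - 15) ** 2 + (yy + 10) ** 2) < 36).astype(float)

# Simulate projections over a limited tilt range (+/- 70 degrees), as in a
# TEM tilt series; the missing wedge degrades resolution along one axis.
angles = np.linspace(-70, 70, 71)
sinogram = radon(phantom, theta=angles)

# Filtered back-projection recovers an estimate of the original slice.
reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")
print(f"mean absolute error: {np.abs(reconstruction - phantom).mean():.3f}")
```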

  6. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research

    SciTech Connect

    Ercius, Peter; Alaidi, Osama; Rames, Matthew J.; Ren, Gang

    2015-06-18

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. Here, this review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. Electron tomography produces quantitative 3D reconstructions for biological and physical sciences from sets of 2D projections acquired at different tilting angles in a transmission electron microscope. Finally, state-of-the-art techniques capable of producing 3D representations such as Pt-Pd core-shell nanoparticles and IgG1 antibody molecules are reviewed.

  7. The BIAS FREE Framework: a new analytical tool for global health research.

    PubMed

    Eichler, Margrit; Burke, Mary Anne

    2006-01-01

    The objective was to test the applicability of the BIAS FREE Framework in African settings. Participants were researchers from the Tanzanian National Institute for Medical Research, along with university and community-based researchers from Tanzania, the Gambia, and South Africa, meeting at the National Institute for Medical Research in Dar es Salaam, Tanzania. The intervention was an intensive two-day workshop to examine the applicability of the BIAS FREE Framework within an African setting. This involved clarification of the following concepts: construction of knowledge, objectivity, logic of domination, hierarchy, power, sex and gender, disability, and race/ethnicity. The Framework identifies three types of bias problems that derive from social hierarchies based on gender, race and disability: maintaining hierarchy, failing to examine differences, and using double standards. Participants used the 20 diagnostic questions at the heart of the Framework to analyze various research publications, including some authored by participants. Participants uniformly stated that the Framework is useful for uncovering bias in public health research, policy and programs; that it is immediately applicable in their work settings; and that applying it would improve equity in research and, ultimately, in health. One participant re-analyzed published data using the Framework and submitted a supplementary report with some new recommendations. The applicability of the BIAS FREE Framework has been demonstrated in diverse settings. It is now being offered for broader application as a tool for uncovering and eliminating biases in health research that derive from social hierarchies and for addressing the persistence of global health inequities.

  8. Researching and Doing Professional Development Using a Shared Discursive Resource and an Analytic Tool

    ERIC Educational Resources Information Center

    Adler, Jill

    2015-01-01

    Linked research and development forms the central pillar of the 5-year Wits Maths Connect Secondary Project in South Africa. Our empirical data emphasised the need for teaching that mediates towards mathematics viewed as a network of scientific concepts, and the development of the notion of 'mathematical discourse in instruction' (MDI), as an…

  9. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and features of real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can use the shared notebooks to perform analysis tasks or reproduce research results much more easily.
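
    As a flavor of the practice described, the sketch below wraps one step of an analysis pipeline in interactive controls inside a Jupyter Notebook. It uses ipywidgets, a common choice rather than necessarily the authors' stack, and the dataset and cohort filter are invented for illustration.

```python
# Sketch: turning a pipeline step into an interactive notebook "APP".
# Run inside Jupyter; requires pandas and ipywidgets.
import pandas as pd
from ipywidgets import IntSlider, interact

# Toy dataset standing in for a real healthcare extract.
df = pd.DataFrame({
    "age": [34, 51, 67, 45, 72, 58],
    "systolic_bp": [118, 135, 150, 128, 160, 142],
})

def summarize(min_age: int = 40):
    """One pipeline step, re-run live whenever the control changes."""
    cohort = df[df["age"] >= min_age]
    print(f"n = {len(cohort)}, mean systolic BP = "
          f"{cohort['systolic_bp'].mean():.1f}")

# Renders a slider in the notebook; moving it re-executes the analysis.
interact(summarize, min_age=IntSlider(min=20, max=80, step=5, value=40))
```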

  10. Analytic tools for information warfare

    SciTech Connect

    Vandewart, R.L.; Craft, R.L.

    1996-05-01

    Information warfare and system surety (tradeoffs between system functionality, security, safety, reliability, cost, usability) have many mechanisms in common. Sandia's experience has shown that an information system must be assessed from a system perspective in order to adequately identify and mitigate the risks present in the system. While some tools are available to help in this work, the process is largely manual. An integrated, extensible set of assessment tools would help the surety analyst. This paper describes one approach to surety assessment used at Sandia, identifies the difficulties in this process, and proposes a set of features desirable in an automated environment to support this process.

  11. Guidance for the Design and Adoption of Analytic Tools.

    SciTech Connect

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned, and existing research explaining what is currently known about what analysts want, and how to better understand which tools they do and do not need.

  12. Analytical tool requirements for power system restoration

    SciTech Connect

    Adibi, M.M.; Borkoski, J.N.; Kafka, R.J.

    1994-08-01

    This paper is one of a series presented by the Power System Restoration Working Group (SRWG) on behalf of the System Operation Subcommittee, with the intent of focusing industry attention on power system restoration. In this paper a set of analytical tools is specified which together describe the static, transient and dynamic behavior of a power system during restoration. These tools are identified and described for restoration planning, training and operation. Their applications cover all stages of restoration including pre-disturbance condition, post-disturbance status, post-restoration target system, and minimization of unserved loads. The paper draws on previous reports by the SRWG.

  13. Using the Conceptual Change Model of Learning as An Analytic Tool in Researching Teacher Preparation for Student Diversity

    ERIC Educational Resources Information Center

    Larkin, Douglas

    2012-01-01

    Background/Context: In regard to preparing prospective teachers for diverse classrooms, the agenda for teacher education research has been primarily concerned with identifying desired outcomes and promising strategies. Scholarship in multicultural education has been crucial for identifying the knowledge, skills, and attitudes needed by teachers to…

  14. Social Network Analysis as an Analytic Tool for Task Group Research: A Case Study of an Interdisciplinary Community of Practice

    ERIC Educational Resources Information Center

    Lockhart, Naorah C.

    2017-01-01

    Group counselors commonly collaborate in interdisciplinary settings in health care, substance abuse, and juvenile justice. Social network analysis is a methodology rarely used in counseling research yet has potential to examine task group dynamics in new ways. This case study explores the scholarly relationships among 36 members of an…

  15. Analytical Tools for Behavioral Influences Operations

    DTIC Science & Technology

    2003-12-01

    This project is envisioned as a foundation for future work by NASIC analysts. They will use the tools identified in this study to … Though this study took all three categories into account, most (90%) of the focus for the SRA team's effort was on identifying and analyzing …

  16. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated by the level of detail of the process or plant: 1) plant level; 2) process-group level; and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011, and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as companion documentation for the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and water
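
    The benchmarking arithmetic at the heart of such a tool reduces to comparing a plant's measured energy use against a best-practice reference intensity for each product. Below is a hedged Python sketch of that comparison; the reference intensities are placeholders, not values from BEST-Dairy.

```python
# Sketch of benchmark-based savings estimation. Reference intensities below
# are invented placeholders, not figures from the BEST-Dairy tool.
REFERENCE_INTENSITY_KWH_PER_TONNE = {
    "cheese": 900.0,
    "fluid milk": 150.0,
    "butter": 450.0,
    "milk powder": 1800.0,
}

def savings_estimate(product: str, annual_tonnes: float,
                     actual_kwh: float) -> float:
    """Potential annual savings (kWh) versus the best-practice benchmark."""
    benchmark_kwh = REFERENCE_INTENSITY_KWH_PER_TONNE[product] * annual_tonnes
    return max(actual_kwh - benchmark_kwh, 0.0)

# A plant producing 20,000 t of fluid milk on 3.6 GWh/yr would show
# 600,000 kWh of potential savings against the assumed reference.
print(savings_estimate("fluid milk", annual_tonnes=20_000,
                       actual_kwh=3_600_000))
```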

  17. Using Visual Analytics Tool for Improving Data Comprehension

    ERIC Educational Resources Information Center

    Géryk, Jan

    2015-01-01

    The efficacy of animated data visualizations in comparison with static data visualizations is still inconclusive. Some studies have found that failures to demonstrate the benefits of animation may relate to the way animations are constructed and perceived. In this paper, we present a visual analytics (VA) tool which makes use of enhanced animated…

  18. Chemometrics tools used in analytical chemistry: an overview.

    PubMed

    Kumar, Naveen; Bansal, Ankit; Sarma, G S; Rawal, Ravindra K

    2014-06-01

    This article presents various important chemometric tools utilized to evaluate the data generated by various hyphenated analytical techniques, covering their applications from the advent of the field to today. The work is divided into sections that include various multivariate regression methods and multivariate resolution methods. The last section deals with the applicability of chemometric tools in analytical chemistry. The main objective of this article is to review the chemometric methods used in analytical chemistry (qualitative/quantitative) to determine elution sequences, classify various data sets, assess peak purity, and estimate the number of chemical components. The reviewed methods can further be used for treating n-way data obtained by hyphenation of LC with multi-channel detectors. We provide a detailed view of the important methods and their algorithms so that researchers not very familiar with chemometrics can employ and understand them.
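
    To make one of the reviewed method families concrete, the sketch below fits a partial least squares (PLS) regression, a standard multivariate calibration method in chemometrics, to synthetic "spectra" using scikit-learn. The data and dimensions are invented for illustration.

```python
# PLS regression on synthetic spectra: relate a multichannel signal (X) to
# an analyte concentration (y). Requires numpy and scikit-learn.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_channels = 40, 200

# Two latent components generate the spectra; the concentration depends on
# the first component only, plus measurement noise.
latent = rng.normal(size=(n_samples, 2))
loadings = rng.normal(size=(2, n_channels))
X = latent @ loadings + 0.05 * rng.normal(size=(n_samples, n_channels))
y = 3.0 * latent[:, 0] + 0.1 * rng.normal(size=n_samples)

pls = PLSRegression(n_components=2).fit(X, y)
print(f"R^2 on training spectra: {pls.score(X, y):.3f}")
```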

  19. Analytical tools and isolation of TOF events

    NASA Technical Reports Server (NTRS)

    Wolf, H.

    1974-01-01

    Analytical tools are presented in two reports. The first is a probability analysis of the orbital distribution of events in relation to dust flux density observed in Pioneer 8 and 9 distributions. A distinction is drawn between asymmetries caused by random fluctuations and systematic variations, by calculating the probability of any particular asymmetry. The second article discusses particle trajectories for a repulsive force field. The force on a particle due to solar radiation pressure is directed along the particle's radius vector, from the sun, and is inversely proportional to its distance from the sun. Equations of motion which describe both solar radiation pressure and gravitational attraction are presented.
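
    The first report's distinction between random fluctuation and systematic variation can be illustrated with a simple binomial calculation: under a symmetric model, the split of n events between two orbital sectors is Binomial(n, 1/2), and a small tail probability flags a systematic asymmetry. The counts in the sketch below are invented.

```python
# Probability that an observed asymmetry in event counts is a random
# fluctuation under a symmetric (uniform) model. Requires scipy.
from scipy.stats import binom

n_events, observed_in_sector = 60, 40

# Two-sided tail probability of a split at least this lopsided.
k = max(observed_in_sector, n_events - observed_in_sector)
p_value = 2 * binom.sf(k - 1, n_events, 0.5)   # sf(k-1) = P(X >= k)
print(f"P(split >= {k}/{n_events} by chance) = {p_value:.4f}")
```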

  20. Analytical Web Tool for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

    The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year data set has proven invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health, and environmental companies, has shown interest in CERES derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that web-2.0 technologies can offer to both quality control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions that address the above. Displayed in an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms to display the distribution of a data field and identify outliers, useful for QC purposes; ii) an "Anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can identify either potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, execution was done on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open source applications, the multitude of CERES products, and the seamless transition from previous development. For the future, we plan on expanding the analytical capabilities of the
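
    Of the functions listed, the "Anomaly" map is the simplest to state precisely: subtract the climatological monthly mean from the current month, per grid cell. The sketch below does exactly that with random stand-ins for gridded CERES fluxes.

```python
# Sketch of the "Anomaly" map computation: current month minus the
# climatological monthly mean, per grid cell. Data are random stand-ins.
import numpy as np

rng = np.random.default_rng(1)
n_years, n_lat, n_lon = 11, 90, 180

# Eleven Julys of a TOA flux field (W m^-2) plus the July under inspection.
julys = 240 + 5 * rng.normal(size=(n_years, n_lat, n_lon))
current_july = 240 + 5 * rng.normal(size=(n_lat, n_lon))

climatology = julys.mean(axis=0)          # climatological monthly mean
anomaly = current_july - climatology      # regional differences to display
print(f"largest positive anomaly: {anomaly.max():.1f} W m^-2")
```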

  21. Cryogenic Propellant Feed System Analytical Tool Development

    NASA Technical Reports Server (NTRS)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its number-crunching power and its capability to directly access real fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front-end user interface, built with a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA), provides convenient portability of PFSAT among a wide variety of potential users. The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
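
    To give a sense of the kind of estimate PFSAT automates, the sketch below computes the steady radial conduction heat leak through an insulation layer on a line, Q = 2*pi*k*L*(T_hot - T_cold)/ln(r_out/r_in). This is textbook conduction with illustrative property values, not the actual PFSAT model.

```python
# Steady radial conduction through cylindrical insulation: a first-cut heat
# leak estimate for a cryogenic feed line. All values are illustrative only.
import math

def line_heat_leak(k: float, length: float, r_in: float, r_out: float,
                   t_hot: float, t_cold: float) -> float:
    """Heat leak (W) through an insulation layer of conductivity k (W/m-K)."""
    return 2 * math.pi * k * length * (t_hot - t_cold) / math.log(r_out / r_in)

# 5 m line at 20 K in a 300 K bay, 1 cm line radius, 3 cm insulated radius,
# effective insulation conductivity 0.001 W/m-K (assumed).
q = line_heat_leak(k=1e-3, length=5.0, r_in=0.01, r_out=0.03,
                   t_hot=300.0, t_cold=20.0)
print(f"estimated heat leak: {q:.2f} W")   # ~8 W for these assumptions
```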

  22. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  23. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  24. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-08-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirty-sixth month of development activities.

  25. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  26. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  27. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-02-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirtieth month of development activities.

  28. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase 1.

  29. Measurement and Research Tools.

    ERIC Educational Resources Information Center

    1997

    This document contains four papers from a symposium on measurement and research tools for human resource development (HRD). "The 'Best Fit' Training: Measure Employee Learning Style Strengths" (Daniel L. Parry) discusses a study of the physiological aspect of sensory intake known as modality, more specifically, modality as measured by…

  30. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  31. A Qualitative Evaluation of Evolution of a Learning Analytics Tool

    ERIC Educational Resources Information Center

    Ali, Liaqat; Hatala, Marek; Gasevic, Dragan; Jovanovic, Jelena

    2012-01-01

    LOCO-Analyst is a learning analytics tool we developed to provide educators with feedback on students' learning activities and performance. Evaluation of the first version of the tool led to the enhancement of the tool's data visualization, user interface, and supported feedback types. The second evaluation of the improved tool allowed us to see…

  32. Aircraft as Research Tools

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Aeronautical research usually begins with computers, wind tunnels, and flight simulators, but eventually the theories must fly. This is when flight research begins, and aircraft are the primary tools of the trade. Flight research involves doing precision maneuvers in either a specially built experimental aircraft or an existing production airplane that has been modified. For example, the AD-1 was a unique airplane made only for flight research, while the NASA F-18 High Alpha Research Vehicle (HARV) was a standard fighter aircraft that was transformed into a one-of-a-kind aircraft as it was fitted with new propulsion systems, flight controls, and scientific equipment. All research aircraft are able to perform scientific experiments because of the onboard instruments that record data about their systems, aerodynamics, and the outside environment. Since the 1970's, NASA flight research has become more comprehensive, with flights involving everything from Space Shuttles to ultralights. NASA now flies not only the fastest airplanes, but some of the slowest. Flying machines continue to evolve with new wing designs, propulsion systems, and flight controls. As always, a look at today's experimental research aircraft is a preview of the future.

  33. ANALYTICAL CHEMISTRY RESEARCH NEEDS FOR ...

    EPA Pesticide Factsheets

    The consensus among environmental scientists and risk assessors is that the fate and effects of pharmaceutical and personal care products (PPCPs) in the environment are poorly understood. Many classes of PPCPs have yet to be investigated. Acquisition of trends data for a suite of PPCPs (representatives from each of numerous significant classes), shown to recur amongst municipal wastewater treatment plants across the country, may prove of key importance. The focus of this paper is an overview of some of the analytical methods being developed at the Environmental Protection Agency and their application to wastewater and surface water samples. Because PPCPs are generally micro-pollutants, emphasis is on the development of enrichment and pre-concentration techniques using various means of solid-phase extraction. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of water quality. Located in the subtasks are the various research projects being performed in support of this task, with more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCP

  34. ANALYTICAL TOOLS FOR GROUNDWATER POLLUTION ASSESSMENT

    EPA Science Inventory

    This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of groundwater buffer strips. The indices describe the leaching of solutes below the root zone (mass f...

  35. Analytical Tools for Space Suit Design

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  36. Analytical tools for groundwater pollution assessment

    SciTech Connect

    Hantush, M.M.; Islam, M.R.; Marino, M.A.

    1998-06-01

    This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of ground water buffer strips. The indices describe the leaching of solutes below the root zone (mass fraction), emissions to the water table, and mass fraction of the contaminant intercepted by a well or a surface water body.
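
    An index of this kind can be written compactly: the mass fraction leaching below the root zone behaves like first-order decay over the advective travel time. The sketch below implements a generic version of that screening formula; it is illustrative, not necessarily the authors' exact index, and all parameter values are assumed.

```python
# Generic leaching screening index: fraction of applied solute mass surviving
# first-order decay during travel through the root zone,
#   f = exp(-mu * t),  with travel time  t = depth * theta * R / q.
import math

def leached_fraction(depth_m: float, theta: float, retardation: float,
                     recharge_m_per_d: float, decay_per_d: float) -> float:
    """Mass fraction of solute leaching below the root zone."""
    travel_time_d = depth_m * theta * retardation / recharge_m_per_d
    return math.exp(-decay_per_d * travel_time_d)

# 1 m root zone, 30% moisture, retardation factor 2, 5 mm/day recharge,
# decay rate 0.02 per day (half-life ~35 days) -- all illustrative values.
print(f"{leached_fraction(1.0, 0.30, 2.0, 0.005, 0.02):.3f}")  # ~0.091
```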

  37. Trial analytics: a tool for clinical trial management.

    PubMed

    Bose, Anindya; Das, Suman

    2012-01-01

    Prolonged timelines and large expenses associated with clinical trials have prompted a new focus on improving the operational efficiency of clinical trials by use of Clinical Trial Management Systems (CTMS) in order to improve managerial control in trial conduct. However, current CTMS systems are not able to meet expectations due to various shortcomings, such as the inability to provide timely reporting and trend visualization within and beyond an organization. To overcome these shortcomings of CTMS, clinical researchers can apply a business intelligence (BI) framework to create Clinical Research Intelligence (CLRI) for optimization of data collection and analytics. This paper proposes the usage of an innovative and collaborative visualization tool (CTA) as a CTMS add-on to help overcome these deficiencies of traditional CTMS, with suitable examples.

  38. Empire: An Analytical Category for Educational Research

    ERIC Educational Resources Information Center

    Coloma, Roland Sintos

    2013-01-01

    In this article Roland Sintos Coloma argues for the relevance of empire as an analytical category in educational research. He points out the silence in mainstream studies of education on the subject of empire, the various interpretive approaches to deploying empire as an analytic, and the importance of indigeneity in research on empire and…

  39. Electronic tongue: An analytical gustatory tool.

    PubMed

    Latha, Rewanthwar Swathi; Lakshmi, P K

    2012-01-01

    Taste is an important organoleptic property governing acceptance of products for administration through the mouth, but the majority of drugs available are bitter in taste. For patient acceptability and compliance, bitter-tasting drugs are masked by adding several flavoring agents. Taste assessment is therefore one important quality control parameter for evaluating taste-masked formulations. The primary method for taste measurement of drug substances and formulations is by human panelists. The use of sensory panelists is difficult and problematic in industry due to the potential toxicity of drugs and the subjectivity of taste panelists; recruiting panelists, maintaining motivation, and panel maintenance are significantly difficult when working with unpleasant products. Furthermore, Food and Drug Administration (FDA)-unapproved molecules cannot be tested. Therefore, an analytical taste-sensing multichannel sensory system, called the electronic tongue (e-tongue or artificial tongue), which can assess taste, has been replacing sensory panelists. The e-tongue thus offers benefits such as reduced reliance on human panels. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields.

  40. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sources has become more prevalent, ushered in by the plethora of Earth science data generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  41. Informetrics: Exploring Databases as Analytical Tools.

    ERIC Educational Resources Information Center

    Wormell, Irene

    1998-01-01

    Advanced online search facilities and information retrieval techniques have increased the potential of bibliometric research. Discusses three case studies carried out by the Centre for Informetric Studies at the Royal School of Library Science (Denmark) on the internationality of international journals, informetric analyses on the World Wide Web,…

  42. Tool for Ranking Research Options

    NASA Technical Reports Server (NTRS)

    Ortiz, James N.; Scott, Kelly; Smith, Harold

    2005-01-01

    Tool for Research Enhancement Decision Support (TREDS) is a computer program developed to assist managers in ranking options for research aboard the International Space Station (ISS). It could likely also be adapted to perform similar decision-support functions in industrial and academic settings. TREDS provides a ranking of the options, based on a quantifiable assessment of all the relevant programmatic decision factors of benefit, cost, and risk. The computation of the benefit for each option is based on a figure of merit (FOM) for ISS research capacity that incorporates both quantitative and qualitative inputs. Qualitative inputs are gathered and partly quantified by use of the time-tested analytical hierarchical process and used to set weighting factors in the FOM corresponding to priorities determined by the cognizant decision maker(s). Then by use of algorithms developed specifically for this application, TREDS adjusts the projected benefit for each option on the basis of levels of technical implementation, cost, and schedule risk. Based partly on Excel spreadsheets, TREDS provides screens for entering cost, benefit, and risk information. Drop-down boxes are provided for entry of qualitative information. TREDS produces graphical output in multiple formats that can be tailored by users.
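
    The ranking logic such a tool implements is, at its core, a weighted figure of merit adjusted for cost and risk. The sketch below shows one way that arithmetic can look; the factor names, weights, and adjustment rule are invented, and TREDS itself derives its weights via the analytical hierarchical process.

```python
# Weighted figure-of-merit ranking: reward benefit, penalize cost and risk.
# Factors, weights, and scores are invented for illustration.
OPTIONS = {
    # option: (benefit, cost, risk), each scored 0-1 (cost: higher = costlier)
    "experiment A": (0.9, 0.7, 0.3),
    "experiment B": (0.6, 0.3, 0.1),
    "experiment C": (0.8, 0.5, 0.6),
}
WEIGHTS = {"benefit": 0.5, "cost": 0.3, "risk": 0.2}  # assumed priorities

def figure_of_merit(benefit: float, cost: float, risk: float) -> float:
    """Higher is better: reward benefit, discount cost and risk exposure."""
    return (WEIGHTS["benefit"] * benefit
            + WEIGHTS["cost"] * (1.0 - cost)
            + WEIGHTS["risk"] * (1.0 - risk))

ranked = sorted(OPTIONS, key=lambda o: figure_of_merit(*OPTIONS[o]),
                reverse=True)
print("ranking:", ranked)   # -> ['experiment B', 'experiment A', ...]
```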

  43. Analytical Tools for Cloudscope Ice Measurement

    NASA Technical Reports Server (NTRS)

    Arnott, W. Patrick

    1998-01-01

    The cloudscope is a ground or aircraft instrument for viewing ice crystals impacted on a sapphire window. It is essentially a simple optical microscope with an attached compact CCD video camera whose output is recorded on a Hi-8 mm video cassette recorder equipped with digital time and date recording capability. In aircraft operation the window is at a stagnation point of the flow, so adiabatic compression heats the window to sublimate the ice crystals so that later impacting crystals can be imaged as well. A film heater is used for ground-based operation to provide sublimation, and it can also be used to provide extra heat for aircraft operation. The compact video camera can be focused manually by the operator, and a beam splitter and miniature bulb combination provides illumination for night operation. Several shutter speeds are available to accommodate daytime illumination conditions in direct sunlight. The video images can be used directly to qualitatively assess the crystal content of cirrus clouds and contrails. Quantitative size spectra are obtained with the tools described in this report. Selected portions of the video images are digitized using a PCI bus frame grabber to form a short movie segment or stack using NIH (National Institutes of Health) Image software with custom macros developed at DRI. The stack can be Fourier-transform filtered with custom, easy-to-design filters to reduce most objectionable video artifacts. Particle quantification of each slice of the stack is performed using digital image analysis. Data recorded for each particle include particle number and centroid, frame number in the stack, particle area, perimeter, equivalent ellipse maximum and minimum radii, ellipse angle, and pixel number. Each valid particle in the stack is stamped with a unique number. This output can be used to obtain a semiquantitative appreciation of the crystal content. The particle information becomes the raw input for a subsequent program (FORTRAN) that
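
    The per-particle measurements listed (area, perimeter, centroid, equivalent-ellipse axes) map directly onto standard digital image analysis. The sketch below reproduces that quantification step with scikit-image on a synthetic binary frame; the real pipeline operates on digitized, Fourier-filtered video stacks.

```python
# Particle quantification on a thresholded frame: label connected regions
# and record per-particle measurements. Requires numpy and scikit-image.
import numpy as np
from skimage.measure import label, regionprops

# Synthetic binary frame with two "ice crystals".
frame = np.zeros((64, 64), dtype=bool)
frame[10:20, 12:30] = True          # an elongated particle
frame[40:52, 40:50] = True          # a blockier particle

for particle in regionprops(label(frame)):
    cy, cx = particle.centroid
    print(f"particle {particle.label}: area={particle.area} px, "
          f"perimeter={particle.perimeter:.1f}, "
          f"centroid=({cy:.1f}, {cx:.1f}), "
          f"ellipse axes=({particle.major_axis_length:.1f}, "
          f"{particle.minor_axis_length:.1f})")
```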

  10. Analytical Ultrasonics in Materials Research and Testing

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1986-01-01

    Research results in analytical ultrasonics for characterizing structural materials from metals and ceramics to composites are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.

  11. Pre-analytical workstations: a tool for reducing laboratory errors.

    PubMed

    Da Rin, Giorgio

    2009-06-01

    Laboratory testing, a highly complex process commonly called the total testing process (TTP), is usually subdivided into three traditional (pre-, intra-, and post-) analytical phases. The majority of errors in TTP originate in the pre-analytical phase, being due to individual or system design defects. In order to reduce errors in TTP, the pre-analytical phase should therefore be prioritized. In addition to developing procedures, providing training, and improving interdepartmental cooperation, information technology and robotics can serve as tools to reduce errors in specimen collection and pre-analytical sample handling. It has been estimated that >2000 clinical laboratories worldwide use total or subtotal automation supporting pre-analytic activities, with a high rate of increase compared to 2007; the need to reduce errors seems to be the catalyst for the increasing use of robotics. Automated systems to prevent medical personnel from drawing blood from the wrong patient were introduced commercially in the early 1990s. Correct patient identification and test tube labelling before phlebotomy are of extreme importance for patient safety in TTP, but currently few laboratories are interested in such products. At San Bassiano hospital, the implementation of advanced information technology and robotics in the pre-analytical phase (specimen collection and pre-analytical sample handling) has improved the accuracy and clinical efficiency of the laboratory process and created a TTP that minimizes errors.

  12. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    PubMed Central

    Hsieh, Helen V.; Dantzler, Jeffrey L.; Weigl, Bernhard H.

    2017-01-01

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor's office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter, outlining analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness. PMID:28555034

  13. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays.

    PubMed

    Hsieh, Helen V; Dantzler, Jeffrey L; Weigl, Bernhard H

    2017-05-28

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor's office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter, outlining analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.
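
    The mechanistic models mentioned above are not specified in the abstract; as a minimal illustration of the kind of model involved, a 1:1 analyte-antibody binding calculation at the test line, with hypothetical rate constants and transport effects neglected:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal 1:1 binding model at the test line: A + B <-> AB.
# Hypothetical rate constants and concentrations; a real LFA model would
# also include convective transport along the membrane.
k_on, k_off = 1e5, 1e-3          # 1/(M*s), 1/s
A0, B0 = 1e-9, 1e-8              # analyte and immobilized antibody, M

def rhs(t, y):
    ab = y[0]                    # bound complex concentration
    return [k_on * (A0 - ab) * (B0 - ab) - k_off * ab]

sol = solve_ivp(rhs, (0, 900), [0.0])
print(f"bound complex after 15 min: {sol.y[0, -1]:.2e} M")
```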

  14. ANALYTICAL TOOL DEVELOPMENT FOR AFTERTREATMENT SUB-SYSTEMS INTEGRATION

    SciTech Connect

    Bolton, B; Fan, A; Goney, K; Pavlova-MacKinnon, Z; Sisken, K; Zhang, H

    2003-08-24

    The stringent emissions standards of 2007 and beyond require complex engine, aftertreatment and vehicle systems with a high degree of sub-system interaction and flexible control solutions. This necessitates a system-based approach to technology development, in addition to individual sub-system optimization. Analytical tools can provide an effective means to evaluate and develop such complex technology interactions, as well as to understand phenomena that are either too expensive or impossible to study with conventional experimental means. The analytical effort can also guide experimental development and thus lead to efficient utilization of available experimental resources. A suite of analytical models has been developed to represent PM and NOx aftertreatment sub-systems. These models range from computationally inexpensive zero-dimensional models for real-time control applications to CFD-based, multi-dimensional models with detailed temporal and spatial resolution. Such models, in conjunction with well-established engine modeling tools such as engine cycle simulation, engine controls modeling, CFD models of non-combusting and combusting flow, and vehicle models, provide a comprehensive analytical toolbox for complete engine, aftertreatment and vehicle sub-systems development and system integration applications. However, the fidelity of aftertreatment models and their application going forward is limited by the lack of fundamental kinetic data.
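
    As a hedged illustration of what a computationally inexpensive zero-dimensional aftertreatment model can look like (not the project's own models, whose details are not given here), a first-order NOx conversion estimate with assumed Arrhenius parameters:

```python
import numpy as np

# Zero-dimensional, steady-state catalyst model: first-order conversion
# of NOx in an ideal plug-flow brick, X = 1 - exp(-k(T)/SV).
# All parameter values are hypothetical.
def conversion(T_kelvin, space_velocity_per_s, k0=5e4, Ea=60e3):
    R = 8.314                                  # J/(mol*K)
    k = k0 * np.exp(-Ea / (R * T_kelvin))      # Arrhenius rate constant, 1/s
    return 1.0 - np.exp(-k / space_velocity_per_s)

for T in (450.0, 550.0, 650.0):
    print(f"T = {T:.0f} K -> NOx conversion = {conversion(T, 2.0):.2%}")
```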

  15. Promoting Efficacy Research on Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Maitland, Daniel W. M.; Gaynor, Scott T.

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a form of therapy grounded in behavioral principles that utilizes therapist reactions to shape target behavior. Despite a growing literature base, there is a paucity of research to establish the efficacy of FAP. As a general approach to psychotherapy, and how the therapeutic relationship produces change,…

  16. Analytical Research on Developmental Aspects of Metamemory.

    ERIC Educational Resources Information Center

    Plude, Dana J.; Nelson, Thomas O.; Scholnick, Ellin K.

    1998-01-01

    Reviews selected pioneering findings in the child-developmental and adulthood-aging literature and evaluates them within the framework of Nelson (Thomas O.) and Narens' (Louis) (1990) theory of metamemory. Makes suggestions for conceptually-based analytical research to help specify the mechanisms that underlie developmental differences in…

  17. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted great interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have proven to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and applications carried out by the Spanish scientific community is presented. A brief explanation of the theoretical aspects that underlie their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551
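
    One standard result underlying the favorable behavior of such arrays is the steady-state diffusion-limited current at a microdisc; in the usual notation (assumed here, not taken from the paper):

```latex
% Steady-state diffusion-limited current at a single microdisc electrode of
% radius a, and at an array of N non-interacting discs (standard result):
\[
  i_{\mathrm{ss}} = 4\, n F D c^{*} a,
  \qquad
  I_{\mathrm{array}} \approx N\, i_{\mathrm{ss}},
\]
% n: electrons transferred, F: Faraday constant, D: diffusion coefficient,
% c*: bulk analyte concentration. The array approximation holds only while
% the individual discs' diffusion layers do not overlap.
```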

  18. An analytic decision support tool for resident allocation.

    PubMed

    Talay-Değirmenci, Işılay; Holmes, Casey J; Kuo, Paul C; Jennings, Otis B

    2013-01-01

    Moving residents through an academic residency program is complicated by a number of factors. Across all residency programs, the percentage of residents who leave for any reason is between 3.4% and 3.8%.(1) A number of residents also participate in research. To avoid discrepancies in the number of residents at all levels, programs must limit either the number of residents who go into the lab, the number who return to clinical duties, or the number of interns to hire. Traditionally this process consists of random selection and trial and error, with names on magnetic strips moved around a board. With the matrix we have developed, this process is optimized and aided by a Microsoft Excel macro (Microsoft Corp, Redmond, Washington). The goal is for a residency program to maintain the same number of residents at each clinical stage, as well as a steady number of residents at each research stage. The program consists of two phases. The first phase, an Excel sheet called the "Brain Sheet," contains simple formulas we have prepared to determine the number of interns to recruit, the number of residents in the research phase, and the number of residents who advance to the next stage of training. The second phase, the macro, takes the list of current resident names along with their residency level and, according to the formulas, allocates them to the relevant stages for future years, creating a resident matrix. Our macro for resident allocation saves residency program administrators time by simplifying the movement of residents through the program. It also provides a tool for planning the number of new interns to recruit and for program expansion. The application of our macro illustrates that analytical techniques can be used to minimize the time spent, and to avoid trial and error, in planning resident movement through a program.
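
    The published Brain Sheet formulas are not reproduced in this record; the sketch below is a hypothetical steady-state head-count calculation in the same spirit, with all numbers invented except the attrition range cited above:

```python
# Minimal steady-state head-count sketch in the spirit of the "Brain Sheet"
# (the published formulas are not reproduced here; all numbers hypothetical).
clinical_years = 5         # PGY-1 through PGY-5
per_level = 6              # target residents at each clinical level
lab_entry_per_year = 2     # residents entering the lab each year
lab_years = 2              # duration of the research phase
attrition = 0.036          # mid-range of the 3.4-3.8% figure cited above

# Hires must cover the target cohort plus expected attrition over training.
interns_to_recruit = per_level * (1 + attrition * clinical_years)
lab_headcount = lab_entry_per_year * lab_years   # steady-state lab population

print(f"interns to recruit: {interns_to_recruit:.1f}")
print(f"steady-state residents in the lab: {lab_headcount}")
print(f"total complement: {clinical_years * per_level + lab_headcount}")
```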

  19. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models that have emerged in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: for example, they do not support real-time co-design; they cannot track how a workflow evolves over time as multiple Earth scientists contribute changing designs; and they do not capture and retrieve collaboration knowledge about workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management as a key component of big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists in collaboratively designing and composing data analytics workflows through the Internet. Instead of reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  20. Hawkeye: an interactive visual analytics tool for genome assemblies

    PubMed Central

    2007-01-01

    Genome sequencing remains an inexact science, and genome sequences can contain significant errors if they are not carefully examined. Hawkeye is our new visual analytics tool for genome assemblies, designed to aid in identifying and correcting assembly errors. Users can analyze all levels of an assembly along with summary statistics and assembly metrics, and are guided by a ranking component towards likely mis-assemblies. Hawkeye is freely available and released as part of the open source AMOS project http://amos.sourceforge.net/hawkeye. PMID:17349036

  1. Geographical Information Systems: A Tool for Institutional Research.

    ERIC Educational Resources Information Center

    Prather, James E.; Carlson, Christina E.

    This paper addresses the application of Geographical Information Systems (GIS), a computerized tool for associating key information by geographical location, to the institutional research function at institutions of higher education. The first section investigates the potential of GIS as an analytical and planning tool for institutional…

  2. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and shall propose improvements in system-wide models and analytical tools required for the evaluation and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In...

  3. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and shall propose improvements in system-wide models and analytical tools required for the evaluation and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In...

  4. Mapping healthcare systems: a policy relevant analytic tool.

    PubMed

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L V

    2017-07-01

    In the past decade, an international consensus on the value of well-functioning health systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool - the University of California, San Francisco mapping tool (the Tool) - which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and the amount spent through each source, purchasers, populations covered, provider categories, and the relationships between these entities. We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. As part of the systems-strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare system structural designs, using a common framework that fosters multi-country comparative analyses.

  5. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    ERIC Educational Resources Information Center

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities in the development of the emerging field of learning analytics, led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified; the journal is the most significant new initiative of SoLAR.

  6. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    SciTech Connect

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both the integration of multiple data sources and the development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application enforced need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome, as well as integrated security and role-based access-control for communicating between
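
    The project's pattern-recognition and GANN algorithms are not described in detail here; as a minimal sketch of the space-time clustering idea, a Knox-style statistic on synthetic case data (both closeness thresholds are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical case reports: (x, y) location and day of onset.
xy = rng.uniform(0, 100, size=(200, 2))
day = rng.integers(0, 365, size=200)

# Knox-style statistic: count case pairs close in BOTH space and time.
# An excess over the independence expectation suggests clustering.
d_space = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
d_time = np.abs(day[:, None] - day[None, :])
iu = np.triu_indices(len(xy), k=1)

close_space = d_space[iu] < 5.0       # km, hypothetical threshold
close_time = d_time[iu] < 7           # days, hypothetical threshold
observed = np.sum(close_space & close_time)
expected = close_space.mean() * close_time.mean() * len(iu[0])
print(f"observed close pairs: {observed}, "
      f"expected under independence: {expected:.1f}")
```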

  7. A Tool for Medical Research

    NASA Technical Reports Server (NTRS)

    1992-01-01

    California Measurements, Inc.'s PC-2 Aerosol Particle Analyzer, developed by William Chiang, a former Jet Propulsion Laboratory (JPL) engineer, was used in a study to measure the size of particles in the medical environment. Chiang has a NASA license for the JPL crystal oscillator technology and originally built the instrument for atmospheric research. In the operating room, it enabled researchers from the University of California to obtain multiple sets of data repeatedly and accurately. The study concluded that significant amounts of aerosols are generated during surgery when power tools are employed, and most of these are in the respirable size range. Almost all contain blood and are small enough to pass through surgical masks. Research on the presence of blood aerosols during oral surgery had similar results. Further studies are planned to determine the possibility of HIV transmission during surgery, and the PC-2H will be used to quantify blood aerosols.

  8. Experimental and analytical ion thruster research

    NASA Technical Reports Server (NTRS)

    Ruyten, Wilhelmus M.; Friedly, V. J.; Peng, Xiaohang; Keefer, Dennis

    1993-01-01

    The results of further spectroscopic studies on the plume from a 3 cm ion source operated on an argon propellant are reported. In particular, it is shown that it should be possible to use the spectroscopic technique to measure the plasma density of the ion plume close to the grids, where it is difficult to use electrical probe measurements. How the technique, along with electrical probe measurements in the far downstream region of the plume, can be used to characterize the operation of a three-grid, 15 cm diameter thruster from NASA JPL is outlined. Pumping speed measurements on the Vacuum Research Facility have shown that this facility should be adequate for testing the JPL thruster at pressures in the low 10(exp -5) Torr range. Finally, we describe a simple analytical model which can be used to calculate the grid impingement current that results from charge-exchange collisions in the ion plume.
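
    A common first-order estimate for such charge-exchange models, in notation assumed here rather than taken from the paper, is the thin-target expression:

```latex
% First-order (thin-target) estimate of the charge-exchange (CEX) ion
% current produced in the plume, of the kind used in simple
% grid-impingement models:
\[
  I_{\mathrm{cex}} \approx I_{b}\, n_{0}\, \sigma_{\mathrm{ce}}\, L,
\]
% I_b: beam current, n_0: neutral background density,
% sigma_ce: charge-exchange cross-section, L: effective path length over
% which beam ions and background neutrals interact.
```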

  9. Network analytical tool for monitoring global food safety highlights China.

    PubMed

    Nepusz, Tamás; Petróczi, Andrea; Naughton, Declan P

    2009-08-18

    The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles, allowing poor opportunities for integration. We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 and August 2008 were processed using network analysis to i) capture complexity, ii) analyze trends, and iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships are readily identifiable between countries, which are ranked using i) Google's PageRank algorithm and ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios.
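
    Both ranking algorithms named above are available off the shelf; a minimal sketch with networkx on an invented alert network (country pairs illustrative only, not taken from the RASFF data):

```python
import networkx as nx

# Toy alert network: one edge (reporter -> transgressor) per alert.
alerts = [("DE", "CN"), ("DE", "TR"), ("UK", "CN"), ("FR", "IR"),
          ("IT", "CN"), ("NL", "TR"), ("UK", "IR")]
G = nx.DiGraph()
G.add_edges_from(alerts)

# PageRank ranks countries by overall prominence in the alert network;
# HITS separates "detectors" (hubs) from "transgressors" (authorities).
pagerank = nx.pagerank(G)
hubs, authorities = nx.hits(G)
print("top transgressor:", max(authorities, key=authorities.get))
print("top detector:", max(hubs, key=hubs.get))
```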

  10. Analytical tools for single-molecule fluorescence imaging in cellulo.

    PubMed

    Leake, M C

    2014-07-07

    Recent technological advances in cutting-edge ultrasensitive fluorescence microscopy have allowed single-molecule imaging experiments in living cells across all three domains of life to become commonplace. Single-molecule live-cell data is typically obtained in a low signal-to-noise ratio (SNR) regime sometimes only marginally in excess of 1, in which a combination of detector shot noise, sub-optimal probe photophysics, native cell autofluorescence and intrinsically underlying stochasticity of molecules result in highly noisy datasets for which underlying true molecular behaviour is non-trivial to discern. The ability to elucidate real molecular phenomena is essential in relating experimental single-molecule observations to both the biological system under study as well as offering insight into the fine details of the physical and chemical environments of the living cell. To confront this problem of faithful signal extraction and analysis in a noise-dominated regime, the 'needle in a haystack' challenge, such experiments benefit enormously from a suite of objective, automated, high-throughput analysis tools that can home in on the underlying 'molecular signature' and generate meaningful statistics across a large population of individual cells and molecules. Here, I discuss the development and application of several analytical methods applied to real case studies, including objective methods of segmenting cellular images from light microscopy data, tools to robustly localize and track single fluorescently-labelled molecules, algorithms to objectively interpret molecular mobility, analysis protocols to reliably estimate molecular stoichiometry and turnover, and methods to objectively render distributions of molecular parameters.
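
    As one concrete example of objectively interpreting molecular mobility, a minimal mean-squared-displacement (MSD) calculation on a synthetic 2-D track (step size and frame count hypothetical):

```python
import numpy as np

def msd(track, max_lag=None):
    """Mean-squared displacement of one 2-D trajectory (n_frames x 2)."""
    n = len(track)
    max_lag = max_lag or n // 4
    return np.array([np.mean(np.sum((track[lag:] - track[:-lag])**2, axis=1))
                     for lag in range(1, max_lag + 1)])

# Synthetic Brownian track; for pure diffusion, MSD(lag) ~ 4*D*lag*dt in 2-D.
rng = np.random.default_rng(1)
track = np.cumsum(rng.normal(0, 0.05, size=(400, 2)), axis=0)  # um per step

curve = msd(track)
D_est = curve[0] / 4.0    # estimate from the first lag (dt = 1 frame)
print(f"estimated D: {D_est:.4f} um^2/frame")
```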

  11. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

    Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example, the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  12. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are readily available, but for our research purposes a tool that can be customized and enhanced further is more suitable. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of system effectiveness as a performance measure.

  13. Development of computer-based analytical tool for assessing physical protection system

    SciTech Connect

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-22

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are readily available, but for our research purposes a tool that can be customized and enhanced further is more suitable. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of system effectiveness as a performance measure.
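
    The tool's own path algorithms are not given in this record; as a hedged sketch of the EASI-style logic it builds on, a probability-of-interruption calculation along one hypothetical adversary path:

```python
# EASI-style estimate of interruption probability along one adversary path:
# the path is interrupted if some sensor detects the adversary early enough
# for the response force to arrive before task completion. All detection
# probabilities and timeliness flags below are hypothetical inputs.
path = [
    # (detection probability, response arrives before task completion?)
    (0.5, True),    # perimeter sensor
    (0.7, True),    # door alarm
    (0.9, False),   # vault sensor: detection here comes too late
]

p_interrupt = 0.0
p_undetected = 1.0
for p_detect, timely in path:
    if timely:
        # First detection at this layer, given no earlier detection.
        p_interrupt += p_undetected * p_detect
    p_undetected *= (1.0 - p_detect)

print(f"probability of interruption along this path: {p_interrupt:.3f}")
```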

  14. MATRIICES - Mass Analytical Tool for Reactions in Interstellar ICES

    NASA Astrophysics Data System (ADS)

    Isokoski, K.; Bossa, J. B.; Linnartz, H.

    2011-05-01

    The formation of complex organic molecules (COMs) observed in the inter- and circumstellar medium (ISCM) is driven by a complex chemical network yet to be fully characterized. Interstellar dust grains and the surrounding ice mantles, subject to atom bombardment, UV irradiation, and thermal processing, are believed to provide catalytic sites for such chemistry. However, the solid state chemical processes and the level of complexity reachable under astronomical conditions remain poorly understood. The conventional laboratory techniques used to characterize the solid state reaction pathways - RAIRS (Reflection Absorption IR Spectroscopy) and TPD (Temperature-Programmed Desorption) - are suitable for the analysis of reactions in ices made of relatively small molecules. For more complex ices comprising a series of different components, as relevant to the interstellar medium, spectral overlap prevents unambiguous identification of reaction schemes, and these techniques start to fail. Therefore, we have constructed a new and innovative experimental set-up for the study of complex interstellar ices, featuring a highly sensitive and unambiguous detection method. MATRIICES (Mass Analytical Tool for Reactions in Interstellar ICES) combines the laser ablation technique with a molecular beam experiment and time-of-flight mass spectrometry (LA-TOF-MS) to sample and analyze ice analogues in situ, at native temperatures, under clean ultra-high vacuum conditions. The method allows direct sampling and analysis of the ice constituents in real time, by using a pulsed UV ablation laser (355-nm Nd:YAG) to vaporize the products in a MALDI-TOF-like detection scheme. The ablated material is caught in a synchronously pulsed molecular beam of inert carrier gas (He) from a supersonic valve, and analysed in a reflectron time-of-flight mass spectrometer. The sensitivity of the method is expected to substantially exceed that of the regular surface techniques. The ultimate goal is to fully

  15. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data.

    PubMed

    L'Yi, Sehi; Ko, Bongkyung; Shin, DongHwa; Cho, Young-Joon; Lee, Jaeyong; Kim, Bohyoung; Seo, Jinwook

    2015-01-01

    Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results.

  16. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    PubMed Central

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893
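
    XCluSim's comparisons are visual and interactive; a minimal numerical counterpart for comparing two clustering results is the adjusted Rand index, sketched here with invented label vectors:

```python
from sklearn.metrics import adjusted_rand_score

# Hypothetical results of two clustering algorithms on the same 7 items.
kmeans_labels = [0, 0, 1, 1, 2, 2, 2]
hier_labels   = [1, 1, 0, 0, 2, 2, 0]

# ARI is 1.0 for identical partitions and near 0 for random agreement,
# regardless of how the cluster IDs themselves are numbered.
ari = adjusted_rand_score(kmeans_labels, hier_labels)
print(f"agreement between the two clusterings (ARI): {ari:.2f}")
```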

  17. Scalable Combinatorial Tools for Health Disparities Research

    PubMed Central

    Langston, Michael A.; Levine, Robert S.; Kilbourne, Barbara J.; Rogers, Gary L.; Kershenbaum, Anne D.; Baktash, Suzanne H.; Coughlin, Steven S.; Saxton, Arnold M.; Agboto, Vincent K.; Hood, Darryl B.; Litchveld, Maureen Y.; Oyana, Tonny J.; Matthews-Juarez, Patricia; Juarez, Paul D.

    2014-01-01

    Despite staggering investments made in unraveling the human genome, current estimates suggest that as much as 90% of the variance in cancer and chronic diseases can be attributed to factors outside an individual’s genetic endowment, particularly to environmental exposures experienced across his or her life course. New analytical approaches are clearly required as investigators turn to complicated systems theory and ecological, place-based and life-history perspectives in order to understand more clearly the relationships between social determinants, environmental exposures and health disparities. While traditional data analysis techniques remain foundational to health disparities research, they are easily overwhelmed by the ever-increasing size and heterogeneity of available data needed to illuminate latent gene x environment interactions. This has prompted the adaptation and application of scalable combinatorial methods, many from genome science research, to the study of population health. Most of these powerful tools are algorithmically sophisticated, highly automated and mathematically abstract. Their utility motivates the main theme of this paper, which is to describe real applications of innovative transdisciplinary models and analyses in an effort to help move the research community closer toward identifying the causal mechanisms and associated environmental contexts underlying health disparities. The public health exposome is used as a contemporary focus for addressing the complex nature of this subject. PMID:25310540

  18. Scalable combinatorial tools for health disparities research.

    PubMed

    Langston, Michael A; Levine, Robert S; Kilbourne, Barbara J; Rogers, Gary L; Kershenbaum, Anne D; Baktash, Suzanne H; Coughlin, Steven S; Saxton, Arnold M; Agboto, Vincent K; Hood, Darryl B; Litchveld, Maureen Y; Oyana, Tonny J; Matthews-Juarez, Patricia; Juarez, Paul D

    2014-10-10

    Despite staggering investments made in unraveling the human genome, current estimates suggest that as much as 90% of the variance in cancer and chronic diseases can be attributed to factors outside an individual's genetic endowment, particularly to environmental exposures experienced across his or her life course. New analytical approaches are clearly required as investigators turn to complicated systems theory and ecological, place-based and life-history perspectives in order to understand more clearly the relationships between social determinants, environmental exposures and health disparities. While traditional data analysis techniques remain foundational to health disparities research, they are easily overwhelmed by the ever-increasing size and heterogeneity of available data needed to illuminate latent gene x environment interactions. This has prompted the adaptation and application of scalable combinatorial methods, many from genome science research, to the study of population health. Most of these powerful tools are algorithmically sophisticated, highly automated and mathematically abstract. Their utility motivates the main theme of this paper, which is to describe real applications of innovative transdisciplinary models and analyses in an effort to help move the research community closer toward identifying the causal mechanisms and associated environmental contexts underlying health disparities. The public health exposome is used as a contemporary focus for addressing the complex nature of this subject.

  19. DDN: a caBIG® analytical tool for differential network analysis.

    PubMed

    Zhang, Bai; Tian, Ye; Jin, Lu; Li, Huai; Shih, Ie-Ming; Madhavan, Subha; Clarke, Robert; Hoffman, Eric P; Xuan, Jianhua; Hilakivi-Clarke, Leena; Wang, Yue

    2011-04-01

    Differential dependency network (DDN) is a caBIG® (cancer Biomedical Informatics Grid) analytical tool for detecting and visualizing statistically significant topological changes in transcriptional networks representing two biological conditions. Developed under caBIG®'s In Silico Research Centers of Excellence (ISRCE) Program, DDN enables differential network analysis and provides an alternative way for defining network biomarkers predictive of phenotypes. DDN also serves as a useful systems biology tool for users across biomedical research communities to infer how genetic, epigenetic or environment variables may affect biological networks and clinical phenotypes. Besides the standalone Java application, we have also developed a Cytoscape plug-in, CytoDDN, to integrate network analysis and visualization seamlessly. The Java and MATLAB source code can be downloaded at the authors' web site http://www.cbil.ece.vt.edu/software.htm.
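
    DDN's lasso-based statistics are more involved than this; as a toy sketch of the differential-network idea, flagging gene-gene correlation edges that differ between two conditions (all data synthetic, threshold hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy differential network: compare gene-gene correlations between two
# conditions and flag edges whose strength changes markedly.
expr_a = rng.normal(size=(50, 4))           # condition A: 50 samples x 4 genes
expr_b = rng.normal(size=(60, 4))           # condition B: 60 samples x 4 genes
expr_b[:, 1] += 0.9 * expr_b[:, 0]          # induce an edge only in B

corr_a = np.corrcoef(expr_a, rowvar=False)
corr_b = np.corrcoef(expr_b, rowvar=False)
delta = np.abs(corr_a - corr_b)

i, j = np.triu_indices(4, k=1)
for a, b in zip(i, j):
    if delta[a, b] > 0.5:                   # hypothetical threshold
        print(f"edge ({a}, {b}) differs: "
              f"{corr_a[a, b]:.2f} vs {corr_b[a, b]:.2f}")
```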

  20. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  1. Assessment Tools' Indicators for Sustainability in Universities: An Analytical Overview

    ERIC Educational Resources Information Center

    Alghamdi, Naif; den Heijer, Alexandra; de Jonge, Hans

    2017-01-01

    Purpose: The purpose of this paper is to analyse 12 assessment tools of sustainability in universities and develop the structure and the contents of these tools to be more intelligible. The configuration of the tools reviewed highlight indicators that clearly communicate only the essential information. This paper explores how the theoretical…

  3. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results, owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper has been written to discuss different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also to correlate it with product quality by design and process analytical technology (PAT).

  4. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results, owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper has been written to discuss different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also to correlate it with product quality by design and process analytical technology (PAT). PMID:25722723

  5. MHK Research, Tools, and Methods

    SciTech Connect

    Jepsen, Richard

    2011-11-02

    Presentation from the 2011 Water Peer Review in which principal investigator discusses improved testing, analysis, and design tools needed to more accurately model operational conditions, to optimize design parameters, and predict technology viability.

  6. Usefulness of analytical research. Rethinking analytical R&D&T strategies.

    PubMed

    Valcarcel, Miguel

    2017-09-27

    This opinion article is intended to help foster true innovation in Research & Development & Transfer (R&D&T) in Analytical Chemistry in the form of advances that are primarily useful for analytical purposes rather than solely for publishing. Devising effective means to strengthen the crucial contribution of Analytical Chemistry to progress in Chemistry, Science & Technology, and Society requires carefully examining the present status of our discipline, and also identifying internal and external driving forces with a potential adverse impact on its development. The diagnostic process should be followed by administration of an effective therapy and supported by adoption of a theragnostic strategy if Analytical Chemistry is to enjoy a better future.

  7. Academic Analytics: A New Tool for a New Era

    ERIC Educational Resources Information Center

    Campbell, John P.; DeBlois, Peter B.; Oblinger, Diana G.

    2007-01-01

    In responding to internal and external pressures for accountability in higher education, especially in the areas of improved learning outcomes and student success, IT leaders may soon become critical partners with academic and student affairs. IT can help answer this call for accountability through "academic analytics," which is emerging…

  8. Plant Aquaporins: Genome-Wide Identification, Transcriptomics, Proteomics, and Advanced Analytical Tools

    PubMed Central

    Deshmukh, Rupesh K.; Sonah, Humira; Bélanger, Richard R.

    2016-01-01

    Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of the AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic-scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches rests on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation to recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is

  9. Analytical tools for characterizing biopharmaceuticals and the implications for biosimilars

    PubMed Central

    Berkowitz, Steven A.; Engen, John R.; Mazzeo, Jeffrey R.; Jones, Graham B.

    2013-01-01

    Biologics such as monoclonal antibodies are much more complex than small-molecule drugs, which raises challenging questions for the development and regulatory evaluation of follow-on versions of such biopharmaceutical products (also known as biosimilars) and their clinical use once patent protection for the pioneering biologic has expired. With the recent introduction of regulatory pathways for follow-on versions of complex biologics, the role of analytical technologies in comparing biosimilars with the corresponding reference product is attracting substantial interest in establishing the development requirements for biosimilars. Here, we discuss the current state of the art in analytical technologies to assess three characteristics of protein biopharmaceuticals that regulatory authorities have identified as being important in development strategies for biosimilars: post-translational modifications, three-dimensional structures and protein aggregation. PMID:22743980

  10. Single cell analytic tools for drug discovery and development

    PubMed Central

    Heath, James R.; Ribas, Antoni; Mischel, Paul S.

    2016-01-01

    The genetic, functional, or compositional heterogeneity of healthy and diseased tissues presents major challenges in drug discovery and development (1-3). In cancers, heterogeneity may be essential for tumor stability (4), but its precise role in tumor biology is poorly resolved. This challenges the design of accurate disease models for use in drug development, and can confound the interpretation of biomarker levels, and of patient responses to specific therapies. The complex nature of heterogeneous tissues has motivated the development of tools for single cell genomic, transcriptomic, and multiplex proteomic analysis. We review these tools, assess their advantages and limitations, and explore their potential applications in drug discovery and development. PMID:26669673

  11. Observatory Bibliographies as Research Tools

    NASA Astrophysics Data System (ADS)

    Rots, Arnold H.; Winkelman, S. L.

    2013-01-01

    Traditionally, observatory bibliographies were maintained to provide insight into how successful an observatory is, as measured by its prominence in the (refereed) literature. When we set up the bibliographic database for the Chandra X-ray Observatory (http://cxc.harvard.edu/cgi-gen/cda/bibliography) as part of the Chandra Data Archive (http://cxc.harvard.edu/cda/), very early in the mission, our objective was to make it primarily a useful tool for our user community. To achieve this we are: (1) casting a very wide net in collecting Chandra-related publications; (2) including for each literature reference in the database a wealth of metadata that is useful for the users; and (3) providing specific links between the articles and the datasets in the archive that they use. As a result our users are able to browse the literature and the data archive simultaneously. As an added bonus, the rich metadata content and data links have also allowed us to assemble more meaningful statistics about the scientific efficacy of the observatory. In all this we collaborate closely with the Astrophysics Data System (ADS). Among the plans for future enhancement are the inclusion of press releases and the Chandra image gallery, linking with ADS semantic searching tools, full-text metadata mining, and linking with other observatories' bibliographies. This work is supported by NASA contract NAS8-03060 (CXC) and depends critically on the services provided by the ADS.

  12. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  13. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines.The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  14. A collaborative visual analytics suite for protein folding research.

    PubMed

    Harvey, William; Park, In-Hee; Rübel, Oliver; Pascucci, Valerio; Bremer, Peer-Timo; Li, Chenglong; Wang, Yusu

    2014-09-01

    Molecular dynamics (MD) simulation is a crucial tool for understanding principles behind important biochemical processes such as protein folding and molecular interaction. With the rapidly increasing power of modern computers, large-scale MD simulation experiments can be performed regularly, generating huge amounts of MD data. An important question is how to analyze and interpret such massive and complex data. One of the (many) challenges involved in analyzing MD simulation data computationally is the high-dimensionality of such data. Given a massive collection of molecular conformations, researchers typically need to rely on their expertise and prior domain knowledge in order to retrieve certain conformations of interest. It is not easy to make and test hypotheses as the data set as a whole is somewhat "invisible" due to its high dimensionality. In other words, it is hard to directly access and examine individual conformations from a sea of molecular structures, and to further explore the entire data set. There is also no easy and convenient way to obtain a global view of the data or its various modalities of biochemical information. To this end, we present an interactive, collaborative visual analytics tool for exploring massive, high-dimensional molecular dynamics simulation data sets. The most important utility of our tool is to provide a platform where researchers can easily and effectively navigate through the otherwise "invisible" simulation data sets, exploring and examining molecular conformations both as a whole and at individual levels. The visualization is based on the concept of a topological landscape, which is a 2D terrain metaphor preserving certain topological and geometric properties of the high dimensional protein energy landscape. In addition to facilitating easy exploration of conformations, this 2D terrain metaphor also provides a platform where researchers can visualize and analyze various properties (such as contact density) overlayed on the
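
    The topological landscape itself is beyond a short sketch; as a minimal stand-in for getting a navigable 2-D view of a high-dimensional conformation set, a PCA projection via SVD on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for an MD data set: 500 conformations x 90 coordinates
# (30 atoms x 3). A topological landscape preserves more structure;
# PCA simply gives a first navigable 2-D view of the conformation cloud.
X = rng.normal(size=(500, 90))
X[:250] += 2.0                     # a second, hypothetical conformational basin

Xc = X - X.mean(axis=0)
# Principal components via SVD of the centered coordinate matrix.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
proj = Xc @ Vt[:2].T               # each conformation as a 2-D point

print("projected shape:", proj.shape)
print("variance explained by 2 PCs:",
      round(float((S[:2]**2).sum() / (S**2).sum()), 3))
```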

  15. Improving web site performance using commercially available analytical tools.

    PubMed

    Ogle, James A

    2010-10-01

    It is easy to accurately measure web site usage and to quantify key parameters such as page views, site visits, and more complex variables using commercially available tools that analyze web site log files and search engine use. This information can be used strategically to guide the design or redesign of a web site (templates, look-and-feel, and navigation infrastructure) to improve overall usability. The data can also be used tactically to assess the popularity and use of new pages and modules that are added and to rectify problems that surface. This paper describes software tools used to: (1) inventory search terms that lead to available content; (2) propose synonyms for commonly used search terms; (3) evaluate the effectiveness of calls to action; (4) conduct path analyses to targeted content. The American Academy of Orthopaedic Surgeons (AAOS) uses SurfRay's Behavior Tracking software (Santa Clara CA, USA, and Copenhagen, Denmark) to capture and archive the search terms that have been entered into the site's Google Mini search engine. The AAOS also uses Unica's NetInsight program to analyze its web site log files. These tools provide the AAOS with information that quantifies how well its web sites are operating and insights for making improvements to them. Although it is easy to quantify many aspects of an association's web presence, it also takes human involvement to analyze the results and then recommend changes. Without a dedicated resource to do this, the work often is accomplished only sporadically and on an ad hoc basis.
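
    The SurfRay and NetInsight products are commercial; as a toy sketch of the first task, inventorying search terms from log lines (the log format and the "q" parameter name are assumptions):

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Toy inventory of search terms from request lines; real tools parse full
# web-server logs, and the "q" query parameter name is an assumption.
log_lines = [
    "GET /search?q=rotator+cuff+repair HTTP/1.1",
    "GET /search?q=acl+tear HTTP/1.1",
    "GET /search?q=rotator+cuff+repair HTTP/1.1",
]

terms = Counter()
for line in log_lines:
    path = line.split()[1]
    query = parse_qs(urlparse(path).query)   # decodes '+' as spaces
    for term in query.get("q", []):
        terms[term] += 1

print(terms.most_common(2))
```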

  16. Galileo's Discorsi as a Tool for the Analytical Art.

    PubMed

    Raphael, Renee Jennifer

    2015-01-01

    A heretofore overlooked response to Galileo's 1638 Discorsi is described by examining two extant, heavily annotated copies of the text (one of which has received little attention in the historiography, the other apparently unknown). It is first demonstrated that these copies contain annotations made by Seth Ward and Sir Christopher Wren. This article then examines one feature of Ward's and Wren's responses to the Discorsi, namely their decision to re-write several of Galileo's geometrical demonstrations into the language of symbolic algebra. It is argued that this type of active reading of period mathematical texts may have been part of the regular scholarly and pedagogical practices of early modern British mathematicians like Ward and Wren. A set of Appendices contains a transcription and translation of the analytical solutions found in these annotated copies.

  17. Polymerase chain reaction technology as analytical tool in agricultural biotechnology.

    PubMed

    Lipp, Markus; Shillito, Raymond; Giroux, Randal; Spiegelhalter, Frank; Charlton, Stacy; Pinero, David; Song, Ping

    2005-01-01

    The agricultural biotechnology industry applies polymerase chain reaction (PCR) technology at numerous points in product development. Commodity and food companies as well as third-party diagnostic testing companies also rely on PCR technology for a number of purposes. The primary use of the technology is to verify the presence or absence of genetically modified (GM) material in a product or to quantify the amount of GM material present in a product. This article describes the fundamental elements of PCR analysis and its application to the testing of grains. The document highlights the many areas to which attention must be paid in order to produce reliable test results. These include sample preparation, method validation, choice of appropriate reference materials, and biological and instrumental sources of error. The article also discusses issues related to the analysis of different matrixes and the effect they may have on the accuracy of the PCR analytical results.
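
    As one concrete illustration of the quantitative side of such testing, real-time PCR results are commonly converted to copy numbers through a standard curve built from serial dilutions, with GM content reported as the copy ratio of the transgene to a taxon-specific reference gene. A sketch under those assumptions; all numbers are invented for illustration.

        import numpy as np

        def fit_standard_curve(log10_copies, ct):
            """Fit Ct = slope * log10(copies) + intercept from a dilution series."""
            slope, intercept = np.polyfit(log10_copies, ct, 1)
            efficiency = 10 ** (-1.0 / slope) - 1.0    # 1.0 means 100% efficiency
            return slope, intercept, efficiency

        def copies_from_ct(ct, slope, intercept):
            return 10 ** ((ct - intercept) / slope)

        # Invented dilution series: 10^2 .. 10^6 copies of the target sequence
        logc = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
        cts  = np.array([30.1, 26.8, 23.4, 20.1, 16.7])
        slope, intercept, eff = fit_standard_curve(logc, cts)

        # GM content as the copy ratio of transgene to taxon-specific reference
        gm_percent = 100 * copies_from_ct(24.0, slope, intercept) \
                         / copies_from_ct(19.0, slope, intercept)
        print(round(gm_percent, 2))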

  18. Quality management system for application of the analytical quality assurance cycle in a research project

    NASA Astrophysics Data System (ADS)

    Camargo, R. S.; Olivares, I. R. B.

    2016-07-01

    The lack of quality assurance and quality control in academic activities has been exposed by an inability to demonstrate reproducibility. This paper aims to apply a quality tool called the Analytical Quality Assurance Cycle to a specific research project, supported by a Verification Programme for equipment and an adapted Quality Management System based on international standards, in order to provide traceability for the data generated.

  19. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    A brief description of many of the widely used tools is presented. From this list, those that offer the highest applicability for elemental and/or compound analysis in problems of interest to tribology, while being truly surface sensitive (that is, probing fewer than 10 atomic layers), are singled out. This latter group is critiqued in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under real conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  20. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    PubMed

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open-source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end-user satisfaction.

  1. Analytical and Semi-Analytical Tools for the Design of Oscillatory Pumping Tests.

    PubMed

    Cardiff, Michael; Barrash, Warren

    2015-01-01

    Oscillatory pumping tests, in which flow is varied in a periodic fashion, provide a method for understanding aquifer heterogeneity that is complementary to strategies such as slug testing and constant-rate pumping tests. During oscillatory testing, pressure data collected at non-pumping wells can be processed to extract metrics, such as signal amplitude and phase lag, from a time series. These metrics are robust against common sensor problems (including drift and noise) and have been shown to provide information about aquifer heterogeneity. Field implementations of oscillatory pumping tests for characterization, however, are not common, and thus there are few guidelines for their design and implementation. Here, we use available analytical solutions from the literature to develop design guidelines for oscillatory pumping tests while considering practical field constraints. We present two key analytical results for the design and analysis of oscillatory pumping tests. First, we provide methods for choosing testing frequencies and flow rates that maximize the signal amplitude that can be expected at a distance from an oscillating pumping well, given design constraints such as maximum/minimum oscillator frequency and maximum volume cycled. Preliminary data from field testing help to validate the methodology. Second, we develop a semi-analytical method for computing the sensitivity of oscillatory signals to spatially distributed aquifer flow parameters. This method can be quickly applied to understand the "sensed" extent of an aquifer at a given testing frequency. Both results can be applied given only bulk aquifer parameter estimates, and can help to optimize the design of oscillatory pumping test campaigns.
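
    The amplitude and phase-lag metrics described above can be extracted from an observation-well record by a least-squares fit of a sinusoid at the known testing frequency. This is a generic sketch of that step, not the authors' code; the synthetic record and testing frequency are assumptions.

        import numpy as np

        def amplitude_phase(t, h, freq_hz):
            """Least-squares fit of h(t) ~ A*cos(wt) + B*sin(wt) + C at a known
            testing frequency; returns signal amplitude and phase lag (radians)."""
            w = 2 * np.pi * freq_hz
            G = np.column_stack([np.cos(w * t), np.sin(w * t), np.ones_like(t)])
            (a, b, _), *rest = np.linalg.lstsq(G, h, rcond=None)
            return np.hypot(a, b), np.arctan2(b, a)

        # Synthetic observation-well record: 0.01 Hz signal buried in noise
        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 600.0, 3000)
        h = 0.05 * np.cos(2 * np.pi * 0.01 * t - 0.8) + 0.01 * rng.normal(size=t.size)
        print(amplitude_phase(t, h, 0.01))             # approximately (0.05, 0.8)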

  2. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, Detlef P.; Lucas, Paul L.; Häyhä, Tiina; Cornell, Sarah E.; Stafford-Smith, Mark

    2016-03-01

    There is a need for more integrated research on sustainable development and global environmental change. In this paper, we focus on the planetary boundaries framework to provide a systematic categorization of key research questions in relation to avoiding severe global environmental degradation. The four categories of key questions are those that relate to (1) the underlying processes and selection of key indicators for planetary boundaries, (2) understanding the impacts of environmental pressure and connections between different types of impacts, (3) better understanding of different response strategies to avoid further degradation, and (4) the available instruments to implement such strategies. Clearly, different categories of scientific disciplines and associated model types exist that can accommodate answering these questions. We identify the strengths and weaknesses of different research areas in relation to the question categories, focusing specifically on different types of models. We argue that more interdisciplinary research is needed to increase our understanding by better linking human drivers and social and biophysical impacts. This requires better collaboration between relevant disciplines (associated with the model types), either by exchanging information or by fully linking or integrating them. As fully integrated models can become too complex, the appropriate type of model (the racehorse) should be applied to answer the target research question (the race course).

  3. Research as an educational tool

    SciTech Connect

    Neff, R.; Perlmutter, D.; Klaczynski, P.

    1994-12-31

    Our students have participated in original group research projects focused on the natural environment, which culminate in a written manuscript published in-house and an oral presentation to peers, faculty, and the university community. Our goal has been to develop their critical thinking skills so that they will be more successful in high school and college. We have served ninety-three students (47.1% White, 44.1% Black, 5.4% Hispanic, 2.2% American Indian, 1.2% Asian) from an eight-state region in the southeast over the past three years. Thirty-one students have graduated from high school, with over 70% enrolled in college, and another thirty-four are seniors this year. We are tracking students' progress in college and are developing our own critical thinking test to measure the impact of our program. Although preliminary, the results from the critical thinking test indicated that students are often prone to logical errors; however, higher levels of critical thinking were observed on items which raised issues that conflicted with students' pre-existing beliefs.

  4. Process models: analytical tools for managing industrial energy systems

    SciTech Connect

    Howe, S O; Pilati, D A; Balzer, C; Sparrow, F T

    1980-01-01

    How the process models developed at BNL are used to analyze industrial energy systems is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for managing industrial energy systems.

  5. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, D. P.; Lucas, P. L.; Häyhä, T.; Cornell, S. E.; Stafford-Smith, M.

    2015-09-01

    There is a need for further integrated research on developing a set of sustainable development objectives, based on the proposed framework of planetary boundaries indicators. The relevant research questions are divided in this paper into four key categories, related to the underlying processes and selection of key indicators, understanding the impacts of different exposure levels and the influence of connections between different types of impacts, a better understanding of different response strategies, and the available options to implement changes. Clearly, different categories of scientific disciplines and associated models exist that can contribute to the necessary analysis, noting that the distinctions between them are fuzzy. In the paper, we indicate both how different models relate to the four categories of questions and how further insights can be obtained by connecting the different disciplines (without necessarily fully integrating them). Research on integration can support planetary boundary quantification in a credible way, linking human drivers and social and biophysical impacts.

  6. Parallel software tools at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Tennille, Geoffrey M.; Lakeotes, Christopher D.; Randall, Donald P.; Arthur, Jarvis J.; Hammond, Dana P.; Mall, Gerald H.

    1993-01-01

    This document gives a brief overview of parallel software tools available on the Intel iPSC/860 parallel computer at Langley Research Center. It is intended to provide a source of information that is somewhat more concise than vendor-supplied material on the purpose and use of various tools. Each of the chapters on tools is organized in a similar manner covering an overview of the functionality, access information, how to effectively use the tool, observations about the tool and how it compares to similar software, known problems or shortfalls with the software, and reference documentation. It is primarily intended for users of the iPSC/860 at Langley Research Center and is appropriate for both the experienced and novice user.

  7. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes measured on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable source of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple, and reliable tool to monitor the steadiness of analytical practice.
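
    The core of the method is simple enough to sketch: compute the median of patient results per month and flag months whose deviation from a long-term reference median exceeds the allowable bias derived from biological variation. The column names, simulated results, and 2% limit below are illustrative assumptions, not the paper's settings.

        import numpy as np
        import pandas as pd

        # Illustrative daily patient results for one analyte
        rng = np.random.default_rng(0)
        df = pd.DataFrame({
            "date": pd.date_range("2014-01-01", periods=365, freq="D"),
            "result": rng.normal(loc=5.0, scale=0.4, size=365),
        })

        target = df["result"].median()                 # long-term reference median
        monthly = df.groupby(df["date"].dt.to_period("M"))["result"].median()
        bias_pct = 100 * (monthly - target) / target
        print(bias_pct[bias_pct.abs() > 2.0])          # months beyond allowable bias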

  8. Planning Research on Student Services: Variety in Research Tools.

    ERIC Educational Resources Information Center

    Hom, Willard C.

    This paper discusses the seven types of research tools that have potential for advancing knowledge about student services in California Community Colleges. The seven tools are the following: literature review, data validation, survey research, case study, quasi-experiment, meta-analysis, and statistical modeling. The report gives reasons why each…

  9. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    SciTech Connect

    Preisler, Jan

    1997-01-01

    In this work, we use lasers to enhance the possibilities of laser desorption methods and to optimize a coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, a 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan, via absorption, plumes desorbed at atmospheric pressure. All absorbing species, including neutral molecules, are monitored. Interesting features, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot, are observed. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals the negative influence of particle spallation on the MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of the insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In the CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone, excited by a 488-nm Ar-ion laser, along the length of the capillary. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7. The increase of p

  10. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    PubMed Central

    2012-01-01

    Background There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working on public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods EPIPOI is freely available software developed in Matlab (The Mathworks Inc) that runs both on PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses including the comparison of spatial patterns in temporal parameters. Results EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts from didactic use in public health workshops to the main analytical tool in published research. Conclusions EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit by enabling a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics. PMID:23153033
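
    EPIPOI itself is a Matlab GUI, but the kind of parameter extraction it performs, separating trend, seasonality, and residual anomalies, can be illustrated with a plain harmonic regression. A sketch under that assumption; the synthetic weekly series and the two-harmonic choice are illustrative, not EPIPOI's internals.

        import numpy as np

        def harmonic_fit(t, y, period=52.0, n_harmonics=2):
            """Fit intercept + linear trend + seasonal harmonics to a weekly series."""
            cols = [np.ones_like(t), t]                # intercept and trend terms
            for k in range(1, n_harmonics + 1):
                w = 2 * np.pi * k / period
                cols += [np.cos(w * t), np.sin(w * t)]
            X = np.column_stack(cols)
            beta, *rest = np.linalg.lstsq(X, y, rcond=None)
            return X @ beta

        t = np.arange(520.0)                           # ten years of weekly counts
        rng = np.random.default_rng(0)
        y = 100 + 0.05 * t + 30 * np.sin(2 * np.pi * t / 52) + rng.normal(size=t.size)
        fitted = harmonic_fit(t, y)
        print(int(np.argmax(fitted[:52])))             # week of the seasonal peak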

  11. Enabling Research Tools for Sustained Climate Assessment

    NASA Technical Reports Server (NTRS)

    Leidner, Allison K.; Bosilovich, Michael G.; Jasinski, Michael F.; Nemani, Ramakrishna R.; Waliser, Duane Edward; Lee, Tsengdar J.

    2016-01-01

    The U.S. Global Change Research Program Sustained Assessment process benefits from long-term investments in Earth science research that enable the scientific community to conduct assessment-relevant science. To this end, NASA initiated several research programs over the past five years to support the Earth observation community in developing indicators, datasets, research products, and tools to support ongoing and future National Climate Assessments. These activities complement NASA's ongoing Earth science research programs. One aspect of the assessment portfolio funds four "enabling tools" projects at NASA research centers. Each tool leverages existing capacity within the center, but has developed tailored applications and products for National Climate Assessments. The four projects build on the capabilities of a global atmospheric reanalysis (MERRA-2), a continental U.S. land surface reanalysis (NCA-LDAS), the NASA Earth Exchange (NEX), and a Regional Climate Model Evaluation System (RCMES). Here, we provide a brief overview of each enabling tool, highlighting the ways in which it has advanced assessment science to date. We also discuss how the assessment community can access and utilize these tools for National Climate Assessments and other sustained assessment activities.

  12. Enabling Research Tools for Sustained Climate Assessment

    NASA Astrophysics Data System (ADS)

    Leidner, A. K.; Bosilovich, M. G.; Jasinski, M. F.; Nemani, R. R.; Waliser, D. E.; Lee, T. J.

    2016-12-01

    The U.S. Global Change Research Program Sustained Assessment process benefits from long-term investments in Earth science research that enable the scientific community to conduct assessment-relevant science. To this end, NASA initiated several research programs over the past five years to support the Earth observation community in developing indicators, datasets, research products, and tools to support ongoing and future National Climate Assessments. These activities complement NASA's ongoing Earth science research programs. One aspect of the assessment portfolio funds four "enabling tools" projects at NASA research centers. Each tool leverages existing capacity within the center, but has developed tailored applications and products for National Climate Assessments. The four projects build on the capabilities of a global atmospheric reanalysis (MERRA-2), a continental U.S. land surface reanalysis (NCA-LDAS), the NASA Earth Exchange (NEX), and a Regional Climate Model Evaluation System (RCMES). Here, we provide a brief overview of each enabling tool, highlighting the ways in which it has advanced assessment science to date. We also discuss how the assessment community can access and utilize these tools for National Climate Assessments and other sustained assessment activities.

  13. Bringing Research Tools into the Classroom

    ERIC Educational Resources Information Center

    Shubert, Charles; Ceraj, Ivica; Riley, Justin

    2009-01-01

    The advancement of computer technology used for research is creating the need to change the way classes are taught in higher education. "Bringing Research Tools into the Classroom" has become a major focus of the work of the Office of Educational Innovation and Technology (OEIT) for the Dean of Undergraduate Education (DUE) at the…

  14. Bringing Research Tools into the Classroom

    ERIC Educational Resources Information Center

    Shubert, Charles; Ceraj, Ivica; Riley, Justin

    2009-01-01

    The advancement of computer technology used for research is creating the need to change the way classes are taught in higher education. "Bringing Research Tools into the Classroom" has become a major focus of the work of the Office of Educational Innovation and Technology (OEIT) for the Dean of Undergraduate Education (DUE) at the…

  15. Army Ants as Research and Collection Tools

    PubMed Central

    Smith, Adrian A.; Haight, Kevin L.

    2008-01-01

    Ants that fall prey to the raids of army ants commonly respond by evacuating their nests. This documented behavior has been underexploited by researchers as an efficient research tool. This study focuses on the evacuation response of the southwestern desert ant Aphaenogaster cockerelli André (Hymenoptera: Formicidae) to the army ant Neivamyrmex nigrescens Cresson. It is shown that army ants can be used to collect mature colonies of ants. The applicability of this tool to ecologically meaningful areas of research is discussed. PMID:20302457

  16. Chemometric classification techniques as a tool for solving problems in analytical chemistry.

    PubMed

    Bevilacqua, Marta; Nescatelli, Riccardo; Bucci, Remo; Magrì, Andrea D; Magrì, Antonio L; Marini, Federico

    2014-01-01

    Supervised pattern recognition (classification) techniques, i.e., the family of chemometric methods whose aim is the prediction of a qualitative response on a set of samples, represent a very important assortment of tools for solving problems in several areas of applied analytical chemistry. This paper describes the theory behind the chemometric classification techniques most frequently used in analytical chemistry together with some examples of their application to real-world problems.

  17. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false What analytical and... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including...

  18. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false What analytical and... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including...

  19. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false What analytical and... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including...

  20. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What analytical and... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including...

  1. High fidelity simulation as a research tool.

    PubMed

    Littlewood, Keith E

    2011-12-01

    Medical simulation has grown explosively over the last decade. Simulation is becoming commonplace in clinical education but can also be used as an investigative clinical tool in its own right. There are thus two arms of simulation in clinical research. The first is investigation of the clinical impact of simulation as an educational tool and the second as an instrument to assess the function of clinical practitioners and systems. This article reviews the terminology, current practice and current research in simulation. The use of simulation in assessment of the clinical performance of devices, people and systems will then be discussed and some current work in these areas presented. Finally, medical simulation will be discussed within the paradigm of translational research. Early examples of this 'tool-bench to bedside' model will be presented as possible prototypes for future work directed towards patient safety.

  2. Biomedical research tools from the seabed.

    PubMed

    Folmer, Florence; Houssen, Wael E; Scott, Roderick H; Jaspars, Marcel

    2007-03-01

    This review covers the applications of small-molecule and peptidic compounds isolated from marine organisms for biomedical research. Enzymes and proteins from marine sources are already on the market for biomedical applications, but the use of small-molecule biomedical research tools of marine origin is less developed. For many studies involving these molecules the ultimate goal is the application of small-molecule therapeutics in the clinic, but those that do not succeed in the clinic still have clearly defined biological activities, which may be of use as biomedical research tools. In other cases, the investigation of marine-derived compounds has led directly to the discovery of therapeutics with clinical applications. Both as tools and therapeutics, these small-molecule compounds are effective for investigating biological processes, and in this review the authors have chosen to concentrate on the ability of marine natural products to affect membrane processes, ion channels and intracellular processes.

  3. The use of meta-analytical tools in risk assessment for food safety.

    PubMed

    Gonzales-Barron, Ursula; Butler, Francis

    2011-06-01

    This communication deals with the use of meta-analysis as a valuable tool for the synthesis of food safety research and for quantitative risk assessment modelling. A common methodology for conducting meta-analysis (i.e., systematic review and data extraction, parameterisation of effect size, estimation of overall effect size, assessment of heterogeneity, and presentation of results) is explained by reviewing two meta-analyses derived from separate sets of primary studies of Salmonella in pork. Integrating different primary studies, the first meta-analysis elucidated for the first time a relationship between the proportion of Salmonella-carrier slaughter pigs entering the slaughter lines and the resulting proportion of contaminated carcasses at the point of evisceration; a finding that the individual studies on their own could not reveal. The second application showed that meta-analysis can be used to estimate the overall effect of a critical process stage (chilling) on the incidence of the pathogen under study. The derivation of a relationship between variables and of a probabilistic distribution are illustrations of the valuable quantitative information synthesised by meta-analytical tools, which can be incorporated into risk assessment modelling. Strengths and weaknesses of meta-analysis within the context of food safety are also discussed.
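
    For readers unfamiliar with the machinery, the pooling step of such a meta-analysis can be stated compactly: weight each study's effect by inverse variance, estimate between-study heterogeneity, and recompute the pooled effect. A sketch of the standard DerSimonian-Laird random-effects estimator; the five effect sizes and variances are invented, not taken from the Salmonella studies.

        import numpy as np

        def dersimonian_laird(effects, variances):
            """Random-effects pooling of study effect sizes (DerSimonian-Laird)."""
            y, v = np.asarray(effects, float), np.asarray(variances, float)
            w = 1.0 / v                                # fixed-effect weights
            y_fe = np.sum(w * y) / np.sum(w)
            Q = np.sum(w * (y - y_fe) ** 2)            # Cochran's Q heterogeneity
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (Q - (len(y) - 1)) / c)    # between-study variance
            w_re = 1.0 / (v + tau2)
            y_re = np.sum(w_re * y) / np.sum(w_re)
            se = np.sqrt(1.0 / np.sum(w_re))
            return y_re, se, tau2, Q

        # Illustrative log odds ratios from five hypothetical studies
        pooled, se, tau2, Q = dersimonian_laird(
            [0.42, 0.61, 0.30, 0.55, 0.48], [0.04, 0.02, 0.05, 0.03, 0.06])
        print(pooled, pooled - 1.96 * se, pooled + 1.96 * se)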

  4. Software tool for portal dosimetry research.

    PubMed

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open-beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: (i) reading the MLC file and the PDIP from the TPS; (ii) calculating the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves; (iii) interpolating correction factors from look-up tables; (iv) creating a corrected PDIP image from the product of the original PDIP and the correction factors and writing the corrected image to file; (v) displaying, analysing, and exporting various image datasets. The software tool was developed using the Microsoft Visual Studio.NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
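
    Steps (ii)-(iv) of that pipeline amount to per-pixel interpolation of a correction factor against MLC-shielded fraction, followed by an image product. A minimal numpy sketch under assumed array shapes and an invented look-up table; it is not the actual TPS or EPID file handling (the paper's tool is C#).

        import numpy as np

        rng = np.random.default_rng(0)
        pdip = rng.random((256, 256))                  # predicted EPID image from TPS
        shielded_frac = rng.random((256, 256))         # beam-on fraction under MLC leaves

        # Hypothetical correction factor vs. MLC-shielded fraction
        lut_frac = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
        lut_cf   = np.array([1.00, 0.98, 0.95, 0.91, 0.86])

        cf = np.interp(shielded_frac, lut_frac, lut_cf)   # per-pixel interpolation
        corrected_pdip = pdip * cf                        # step (iv) of the workflow
        np.save("corrected_pdip.npy", corrected_pdip)     # write corrected image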

  5. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    USGS Publications Warehouse

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad-scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research-oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision-making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable for improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.

  6. Subject Specific Databases: A Powerful Research Tool

    ERIC Educational Resources Information Center

    Young, Terrence E., Jr.

    2004-01-01

    Subject specific databases, or vortals (vertical portals), are databases that provide highly detailed research information on a particular topic. They are the smallest, most focused search tools on the Internet and, in recent years, they've been on the rise. Currently, more of the so-called "mainstream" search engines, subject directories, and…

  7. Next generation analytic tools for large scale genetic epidemiology studies of complex diseases.

    PubMed

    Mechanic, Leah E; Chen, Huann-Sheng; Amos, Christopher I; Chatterjee, Nilanjan; Cox, Nancy J; Divi, Rao L; Fan, Ruzong; Harris, Emily L; Jacobs, Kevin; Kraft, Peter; Leal, Suzanne M; McAllister, Kimberly; Moore, Jason H; Paltoo, Dina N; Province, Michael A; Ramos, Erin M; Ritchie, Marylyn D; Roeder, Kathryn; Schaid, Daniel J; Stephens, Matthew; Thomas, Duncan C; Weinberg, Clarice R; Witte, John S; Zhang, Shunpu; Zöllner, Sebastian; Feuer, Eric J; Gillanders, Elizabeth M

    2012-01-01

    Over the past several years, genome-wide association studies (GWAS) have succeeded in identifying hundreds of genetic markers associated with common diseases. However, most of these markers confer relatively small increments of risk and explain only a small proportion of familial clustering. To identify obstacles to future progress in genetic epidemiology research and provide recommendations to NIH for overcoming these barriers, the National Cancer Institute sponsored a workshop entitled "Next Generation Analytic Tools for Large-Scale Genetic Epidemiology Studies of Complex Diseases" on September 15-16, 2010. The goal of the workshop was to facilitate discussions on (1) statistical strategies and methods to efficiently identify genetic and environmental factors contributing to the risk of complex disease; and (2) how to develop, apply, and evaluate these strategies for the design, analysis, and interpretation of large-scale complex disease association studies in order to guide NIH in setting the future agenda in this area of research. The workshop was organized as a series of short presentations covering scientific (gene-gene and gene-environment interaction, complex phenotypes, and rare variants and next generation sequencing) and methodological (simulation modeling and computational resources and data management) topic areas. Specific needs to advance the field were identified during each session and are summarized.

  8. Next Generation Analytic Tools for Large Scale Genetic Epidemiology Studies of Complex Diseases

    PubMed Central

    Mechanic, Leah E.; Chen, Huann-Sheng; Amos, Christopher I.; Chatterjee, Nilanjan; Cox, Nancy J.; Divi, Rao L.; Fan, Ruzong; Harris, Emily L.; Jacobs, Kevin; Kraft, Peter; Leal, Suzanne M.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Province, Michael A.; Ramos, Erin M.; Ritchie, Marylyn D.; Roeder, Kathryn; Schaid, Daniel J.; Stephens, Matthew; Thomas, Duncan C.; Weinberg, Clarice R.; Witte, John S.; Zhang, Shunpu; Zöllner, Sebastian; Feuer, Eric J.; Gillanders, Elizabeth M.

    2012-01-01

    Over the past several years, genome-wide association studies (GWAS) have succeeded in identifying hundreds of genetic markers associated with common diseases. However, most of these markers confer relatively small increments of risk and explain only a small proportion of familial clustering. To identify obstacles to future progress in genetic epidemiology research and provide recommendations to NIH for overcoming these barriers, the National Cancer Institute sponsored a workshop entitled “Next Generation Analytic Tools for Large-Scale Genetic Epidemiology Studies of Complex Diseases” on September 15–16, 2010. The goal of the workshop was to facilitate discussions on (1) statistical strategies and methods to efficiently identify genetic and environmental factors contributing to the risk of complex disease; and (2) how to develop, apply, and evaluate these strategies for the design, analysis, and interpretation of large-scale complex disease association studies in order to guide NIH in setting the future agenda in this area of research. The workshop was organized as a series of short presentations covering scientific (gene-gene and gene-environment interaction, complex phenotypes, and rare variants and next generation sequencing) and methodological (simulation modeling and computational resources and data management) topic areas. Specific needs to advance the field were identified during each session and are summarized. PMID:22147673

  9. The Metaphorical Department Head: Using Metaphors as Analytic Tools to Investigate the Role of Department Head

    ERIC Educational Resources Information Center

    Paranosic, Nikola; Riveros, Augusto

    2017-01-01

    This paper reports the results of a study that examined the ways a group of department heads in Ontario, Canada, describe their role. Despite their ubiquity and importance, department heads have been seldom investigated in the educational leadership literature. The study uses the metaphor as an analytic tool to examine the ways participants talked…

  10. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    ERIC Educational Resources Information Center

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  11. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 3 2014-07-01 2014-07-01 false Revisions to models and analytical tools. 385.33 Section 385.33 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROGRAMMATIC REGULATIONS FOR THE COMPREHENSIVE EVERGLADES RESTORATION PLAN...

  12. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 3 2011-07-01 2011-07-01 false Revisions to models and analytical tools. 385.33 Section 385.33 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROGRAMMATIC REGULATIONS FOR THE COMPREHENSIVE EVERGLADES RESTORATION PLAN...

  13. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Revisions to models and analytical tools. 385.33 Section 385.33 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROGRAMMATIC REGULATIONS FOR THE COMPREHENSIVE EVERGLADES RESTORATION PLAN...

  14. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  15. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less-known techniques that may also prove useful.

  16. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  17. Using Virtual Observatory Tools for Astronomical Research

    NASA Astrophysics Data System (ADS)

    Kim, Sang Chul; Taylor, John D.; Panter, Benjamin; Sohn, Sangmo Tony; Heavens, Alan F.; Mann, Robert G.

    2005-06-01

    Construction of the Virtual Observatory (VO) is of great interest to the astronomical community in the 21st century. We present an outline of the concept and necessity of the VO and the current status of various VO projects, including the 15 national ones and the International Virtual Observatory Alliance (IVOA). We summarize the possible science cases that could be solved by using VO data and tools, real science cases that have resulted from current VO tools, and our own work using AstroGrid, the United Kingdom's national VO, for research on the star formation history of galaxies.

  18. Measuring the bright side of being blue: a new tool for assessing analytical rumination in depression.

    PubMed

    Barbic, Skye P; Durisko, Zachary; Andrews, Paul W

    2014-01-01

    Diagnosis and management of depression occurs frequently in the primary care setting. Current diagnostic and treatment practices across clinical populations focus on eliminating signs and symptoms of depression. However, there is debate that some interventions may pathologize normal, adaptive responses to stressors. Analytical rumination (AR) is an example of an adaptive response in depression that is characterized by enhanced cognitive function to help an individual focus on, analyze, and solve problems. To date, research on AR has been hampered by the lack of theoretically derived and psychometrically sound instruments. This study developed and tested a clinically meaningful measure of AR. Using expert panels and an extensive literature review, we developed a conceptual framework for AR and 22 candidate items. Items were field tested with 579 young adults, 140 of whom completed the items at a second time point. We used Rasch measurement methods to construct and test the item set, and traditional psychometric analyses to compare items to existing rating scales. Data were of high quality (<1% missing; high reliability: Cronbach's alpha = 0.92, test-retest intraclass correlations >0.81; evidence for divergent validity). Evidence of misfit for 2 items suggested that a 20-item scale with 4-point response categories best captured the concept of AR, fitting the Rasch model (χ2 = 95.26; df = 76, p = 0.07), with high reliability (rp = 0.86), an ordered response scale structure, and no item bias (gender, age, time). Our study provides evidence for a 20-item Analytical Rumination Questionnaire (ARQ) that can be used to quantify AR in adults who experience symptoms of depression. The ARQ is psychometrically robust and a clinically useful tool for the assessment and improvement of depression in the primary care setting. Future work is needed to establish the validity of this measure in people with major depression.
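
    Of the psychometrics reported, the internal-consistency figure is the easiest to reproduce: Cronbach's alpha compares the sum of item variances with the variance of the total score. A sketch with simulated 4-point responses; the data generation is purely illustrative, not the ARQ field data.

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
            X = np.asarray(items, float)
            k = X.shape[1]
            item_vars = X.var(axis=0, ddof=1).sum()
            total_var = X.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        # 579 hypothetical respondents answering 20 items on a 4-point scale
        rng = np.random.default_rng(1)
        trait = rng.normal(size=(579, 1))              # shared latent trait
        scores = np.clip(np.round(2.5 + trait + 0.8 * rng.normal(size=(579, 20))), 1, 4)
        print(cronbach_alpha(scores))                  # high when items share the trait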

  19. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    SciTech Connect

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  20. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities for the enhancement of knowledge and the facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Given the scarcity of published material on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available and still needed to support ESDA.

  1. Group analytic psychotherapy (im)possibilities to research

    PubMed Central

    Vlastelica, Mirela

    2011-01-01

    In the course of group analytic psychotherapy, as the power of its therapeutic effects became evident, the need for research on group analytic psychotherapy arose. Psychotherapeutic work in general, and group psychotherapy in particular, is hard to measure and put into objective frames. Research, i.e., the measurement of change in psychotherapy, is a complex task, and there is considerable disagreement about how to do it. For a long time, the empirical-descriptive method was the only way of conducting research in the field of group psychotherapy. Problems of research in group psychotherapy in general, and in group analytic psychotherapy in particular, can be viewed first of all as problems of methodology, especially due to the unrepeatability of the therapeutic process. The basic debate about measuring change in psychotherapy concerns whether change should be measured through overt behaviour or evaluated more finely by monitoring inner psychological dimensions. Following up therapy results, besides providing additional information on the patient's improvement, strengthens the psychotherapist's self-respect, as well as his respectability and credibility as a scientist. PMID:25478094

  2. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool for selecting the most appropriate analytical procedures according to their greenness or their analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool for choosing the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
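
    The heart of the HDT is a partial order: procedure a precedes procedure b only if a is at least as good on every criterion and strictly better on at least one. A toy sketch of building that order and finding the top-ranked (maximal) procedures; the four procedures and three criteria scores are invented, and a full Hasse diagram would additionally reduce the relation to its transitive cover before drawing.

        import numpy as np

        def dominates(a, b):
            """a dominates b on criteria already rescaled so larger is better."""
            return bool(np.all(a >= b) and np.any(a > b))

        # Hypothetical scores for four procedures on three "green" criteria
        scores = np.array([[3, 2, 3],
                           [2, 2, 1],
                           [3, 3, 3],
                           [1, 3, 2]])
        n = len(scores)
        edges = [(i, j) for i in range(n) for j in range(n)
                 if i != j and dominates(scores[i], scores[j])]
        maximal = [i for i in range(n)
                   if not any(dominates(scores[j], scores[i])
                              for j in range(n) if j != i)]
        print(edges)     # dominance relations underlying the diagram
        print(maximal)   # top-ranked procedures (dominated by none)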

  3. SEACOIN--an investigative tool for biomedical informatics researchers.

    PubMed

    Lee, Eva K; Lee, Hee-Rin; Quarshie, Alexander

    2011-01-01

    Peer-reviewed scientific literature is a prime source for accessing knowledge in the biomedical field. Its rapid growth and diverse domain coverage require systematic efforts in developing interactive tools for efficiently searching and summarizing current advances for acquiring knowledge and referencing, and for furthering scientific discovery. Although information retrieval systems exist, the conventional tools and systems remain difficult for biomedical investigators to use. There remain gaps even in the state-of-the-art systems as little attention has been devoted to understanding the needs of biomedical researchers. Our work attempts to bridge the gap between the needs of biomedical users and systems design efforts. We first study the needs of users and then design a simple visual analytic application tool, SEACOIN. A key motivation stems from biomedical researchers' request for a "simple interface" that is suitable for novice users in information technology. The system minimizes information overload, and allows users to search easily even in time-constrained situations. Users can manipulate the depth of information according to the purpose of usage. SEACOIN enables interactive exploration and filtering of search results via "metamorphose topological visualization" and "tag cloud," visualization tools that are commonly used in social network sites. We illustrate SEACOIN's usage through applications on PubMed publications on heart disease, cancer, Alzheimer's disease, diabetes, and asthma.

  4. Metabolomics, a Powerful Tool for Agricultural Research.

    PubMed

    Tian, He; Lam, Sin Man; Shui, Guanghou

    2016-11-17

    Metabolomics, which is based mainly on nuclear magnetic resonance (NMR), gas-chromatography (GC) or liquid-chromatography (LC) coupled to mass spectrometry (MS) analytical technologies to systematically acquire the qualitative and quantitative information of low-molecular-mass endogenous metabolites, provides a direct snapshot of the physiological condition in biological samples. As complements to transcriptomics and proteomics, it has played pivotal roles in agricultural and food science research. In this review, we discuss the capacities of NMR, GC/LC-MS in the acquisition of plant metabolome, and address the potential promise and diverse applications of metabolomics, particularly lipidomics, to investigate the responses of Arabidopsis thaliana, a primary plant model for agricultural research, to environmental stressors including heat, freezing, drought, and salinity.

  5. Metabolomics, a Powerful Tool for Agricultural Research

    PubMed Central

    Tian, He; Lam, Sin Man; Shui, Guanghou

    2016-01-01

    Metabolomics, which is based mainly on nuclear magnetic resonance (NMR), gas-chromatography (GC) or liquid-chromatography (LC) coupled to mass spectrometry (MS) analytical technologies to systematically acquire the qualitative and quantitative information of low-molecular-mass endogenous metabolites, provides a direct snapshot of the physiological condition in biological samples. As complements to transcriptomics and proteomics, it has played pivotal roles in agricultural and food science research. In this review, we discuss the capacities of NMR, GC/LC-MS in the acquisition of plant metabolome, and address the potential promise and diverse applications of metabolomics, particularly lipidomics, to investigate the responses of Arabidopsis thaliana, a primary plant model for agricultural research, to environmental stressors including heat, freezing, drought, and salinity. PMID:27869667

  6. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    SciTech Connect

    Brown, Forrest B.

    2016-06-17

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.

  7. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    NASA Astrophysics Data System (ADS)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the ongoing development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space is experienced as a dynamic entity, with spatial properties that might vary from one part of a space to another; therefore, the representation of space through standard architectural drawings is sometimes not sufficient. The representation of space as a series of slices, with certain properties in each slice, thus becomes important, so that the different characteristics in each part of a space can inform the design process. The analytical tool is developed for use as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool would be useful for assisting a design development process that applies BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. The tool allows the identification of how spatial properties change dynamically throughout a space and allows the prediction of potential design problems. Integrating the multi-slice analytical tool into a BIM-based design process could thereby assist architects in generating better designs and avoiding unnecessary costs that are often caused by failure to identify problems during design development stages.

  8. Contemporary outcomes research: tools of the trade.

    PubMed

    Calkins, Casey M

    2008-05-01

    Outcomes are, simply put, why a surgeon comes to work each day. For decades, surgeons have insisted on a regular self-examination of outcomes to ensure the optimal treatment of our patients. Clinical research in pediatric surgery has largely subsisted on outcome analysis as it relates to the rudimentary end-result of an operation, utilizing variables such as mortality, operative time, specific complication rates, and hospital length of stay to name a few. Recently, outcomes research has become a more complex endeavor. This issue of Seminars in Pediatric Surgery addresses a wide array of these newfound complexities in contemporary outcomes research. The purpose of this review is to assist the pediatric surgeon in understanding the tools that are used in contemporary outcomes research and to be able to use this information to ask new questions of our patients and ourselves as we continue to strive for excellence in caring for sick infants and children.

  9. Operations other than war: Requirements for analysis tools research report

    SciTech Connect

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  10. An Analytical Thermal Model for Autonomous Soaring Research

    NASA Technical Reports Server (NTRS)

    Allen, Michael

    2006-01-01

    A viewgraph presentation describing an analytical thermal model used to enable research on autonomous soaring for a small UAV aircraft is given. The topics include: 1) Purpose; 2) Approach; 3) SURFRAD Data; 4) Convective Layer Thickness; 5) Surface Heat Budget; 6) Surface Virtual Potential Temperature Flux; 7) Convective Scaling Velocity; 8) Other Calculations; 9) Yearly trends; 10) Scale Factors; 11) Scale Factor Test Matrix; 12) Statistical Model; 13) Updraft Strength Calculation; 14) Updraft Diameter; 15) Updraft Shape; 16) Smoothed Updraft Shape; 17) Updraft Spacing; 18) Environment Sink; 19) Updraft Lifespan; 20) Autonomous Soaring Research; 21) Planned Flight Test; and 22) Mixing Ratio.
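
    The "Convective Scaling Velocity" item in the outline has a standard textbook form, the Deardorff velocity scale; since the presentation itself provides no formulas here, the sketch below simply evaluates that well-known expression with illustrative values.

      def convective_scaling_velocity(g, z_i, wthv, theta_v):
          """Deardorff velocity scale w* = (g * z_i * <w'thv'> / theta_v)**(1/3).

          g       : gravitational acceleration [m s^-2]
          z_i     : convective boundary-layer depth [m]
          wthv    : surface virtual potential temperature flux [K m s^-1]
          theta_v : mean virtual potential temperature [K]
          """
          return (g * z_i * wthv / theta_v) ** (1.0 / 3.0)

      # Illustrative mid-day values, not taken from the presentation
      w_star = convective_scaling_velocity(g=9.81, z_i=1500.0, wthv=0.2, theta_v=300.0)
      print(f"w* = {w_star:.2f} m/s")  # roughly 2 m/s, a typical thermal strength scale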

  11. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    NASA Astrophysics Data System (ADS)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel with hardness above 45 HRC (Hardness-Rockwell C) is referred to as hard turning. Numerous models must be scrutinized and implemented to gain optimum performance in hard turning. Various models for hard turning with cubic boron nitride (CBN) tools have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces from oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear are covered in this review. Effort has been made to understand the relationship between tool wear and tool forces under different cutting conditions and tool geometries, so that an appropriate model can be selected according to user requirements in hard turning.
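
    Usui's wear model, one of the models reviewed, is commonly written as dW/dt = A * sigma_n * V_s * exp(-B/T). The sketch below evaluates that characteristic with placeholder constants; in practice A and B must be calibrated for a specific tool-workpiece pair, and none of the numbers here come from the paper.

      import math

      def usui_wear_rate(sigma_n, v_s, temp_k, A=1e-8, B=5000.0):
          """Usui's wear-rate characteristic: dW/dt = A * sigma_n * v_s * exp(-B / T).

          sigma_n : normal stress on the tool face [Pa]
          v_s     : chip sliding velocity [m/s]
          temp_k  : tool-chip interface temperature [K]
          A, B    : empirical constants (placeholders, not calibrated for
                    any particular CBN tool / hardened steel pair)
          """
          return A * sigma_n * v_s * math.exp(-B / temp_k)

      # Illustrative hard-turning conditions
      rate = usui_wear_rate(sigma_n=1.5e9, v_s=2.0, temp_k=1200.0)
      print(f"wear rate ~ {rate:.3e} (arbitrary units per unit time)")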

  12. FOSS Tools for Research Data Management

    NASA Astrophysics Data System (ADS)

    Stender, Vivien; Jankowski, Cedric; Hammitzsch, Martin; Wächter, Joachim

    2017-04-01

    Established initiatives and organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. These infrastructures aim to provide services that support scientists in searching, visualizing, and accessing data, in collaborating and exchanging information, and in publishing data and other results. In this regard, Research Data Management (RDM) gains importance and thus requires support by appropriate tools integrated in these infrastructures. Different projects have provided their own solutions to manage research data. Within two projects - SUMARIO for land and water management and TERENO for environmental monitoring - solutions to manage research data have been developed based on Free and Open Source Software (FOSS) components. The resulting framework provides essential components for harvesting, storing and documenting research data, as well as for discovering, visualizing and downloading these data on the basis of standardized services, stimulated considerably by the enhanced data management approaches of Spatial Data Infrastructures (SDI). In order to fully exploit the potential of these developments for enhancing data management in the Geosciences, the publication of software components, e.g. via GitHub, is not sufficient. We will use our experience to move these solutions into the cloud, e.g. as PaaS or SaaS offerings. Our contribution will present data management solutions for the Geosciences developed in two projects. A construction kit of FOSS components builds the backbone for the assembly and implementation of project-specific platforms. Furthermore, an approach is presented to stimulate the reuse of FOSS RDM solutions with cloud concepts. In further projects, specific RDM platforms can then be set up much faster, customized to individual needs, with tools added at run-time.

  13. Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel

    DTIC Science & Technology

    2011-03-28

    Learning (CAOCL); Analytical Tools; Marine Corps Planning Process. ...offers a clear opportunity for the enhanced integration of culture in planning. "Design" thinking sets the stage for broad consideration and early... Staff College planning exercise. Either of these would be excellent sites to observe the actual use of the concept, and will set the stage for...

  14. An analytical tool that quantifies cellular morphology changes from three-dimensional fluorescence images.

    PubMed

    Haass-Koffler, Carolina L; Naeemuddin, Mohammad; Bartlett, Selena E

    2012-08-31

    ...detection of 3D neuronal filament-like structures; however, this module has been developed to measure defined structures such as neurons, which are comprised of dendrites, axons, and spines (a tree-like structure). The module has been ingeniously utilized to make morphological measurements of non-neuronal cells, but the output data describe an extended cellular network, because the software depends on a defined cell shape rather than an amorphous-shaped cellular model. To overcome the issue of analyzing amorphous-shaped cells and to make the software more suitable for biological applications, Imaris developed Imaris Cell, a scientific project with the Eidgenössische Technische Hochschule developed to calculate the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be utilized to analyze fluorescence data that are not continuous, because it ideally builds cell surfaces without void spaces. To our knowledge, no user-modifiable automated approach has yet been developed that provides morphometric information from 3D fluorescence images of cells of undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.). These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, this method will allow researchers who have extensive expertise in biological systems, but little familiarity with computer applications, to quantify morphological changes in cell dynamics.
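
    The authors' platform is built on the proprietary Imaris/MATLAB stack. As a hedged open-source analogue of the same idea - shape-free 3D morphometrics - the sketch below thresholds a synthetic 3D stack, labels connected components without assuming any cell shape, and measures per-object volumes with scikit-image. It is not the published pipeline.

      import numpy as np
      from skimage import filters, measure

      # Synthetic stand-in for a 3D fluorescence stack (z, y, x); real data
      # would be loaded from microscopy files, e.g. with tifffile.imread().
      rng = np.random.default_rng(0)
      stack = rng.normal(100, 10, size=(20, 128, 128))
      stack[5:15, 30:60, 30:70] += 80    # one bright amorphous "cell"
      stack[8:12, 80:110, 90:120] += 60  # another

      threshold = filters.threshold_otsu(stack)   # global Otsu threshold
      labels = measure.label(stack > threshold)   # connected components, any shape

      for region in measure.regionprops(labels):
          print(f"object {region.label}: volume = {region.area} voxels, "
                f"centroid = {tuple(round(c, 1) for c in region.centroid)}")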

  15. METABOLOMICS AS A DIAGNOSTIC TOOL FOR SMALL FISH TOXICOLOGY RESEARCH

    EPA Science Inventory

    Metabolomics involves the application of advanced analytical and statistical tools to profile changes in levels of endogenous metabolites in tissues and biofluids resulting from disease onset or stress. While certain metabolites are being specifically targeted in these studies, w...

  16. SEACOIN – An Investigative Tool for Biomedical Informatics Researchers

    PubMed Central

    Lee, Eva K.; Lee, Hee-Rin; Quarshie, Alexander

    2011-01-01

    Peer-reviewed scientific literature is a prime source for accessing knowledge in the biomedical field. Its rapid growth and diverse domain coverage require systematic efforts in developing interactive tools for efficiently searching and summarizing current advances for acquiring knowledge and referencing, and for furthering scientific discovery. Although information retrieval systems exist, the conventional tools and systems remain difficult for biomedical investigators to use. There remain gaps even in the state-of-the-art systems as little attention has been devoted to understanding the needs of biomedical researchers. Our work attempts to bridge the gap between the needs of biomedical users and systems design efforts. We first study the needs of users and then design a simple visual analytic application tool, SEACOIN. A key motivation stems from biomedical researchers’ request for a “simple interface” that is suitable for novice users in information technology. The system minimizes information overload, and allows users to search easily even in time-constrained situations. Users can manipulate the depth of information according to the purpose of usage. SEACOIN enables interactive exploration and filtering of search results via “metamorphose topological visualization” and “tag cloud,” visualization tools that are commonly used in social network sites. We illustrate SEACOIN’s usage through applications on PubMed publications on heart disease, cancer, Alzheimer’s disease, diabetes, and asthma. PMID:22195132

  17. Analytical tools for the analysis of fire debris. A review: 2008-2015.

    PubMed

    Martín-Alberca, Carlos; Ortega-Ojeda, Fernando Ernesto; García-Ruiz, Carmen

    2016-07-20

    The analysis of fire debris evidence might offer crucial information to a forensic investigation when, for instance, the intentional use of ignitable liquids to initiate a fire is suspected. Although evidence analysis in the laboratory is mainly conducted with a handful of well-established methodologies, over the last eight years several authors have proposed noteworthy improvements to these methodologies and suggested interesting new approaches. This review critically outlines the most up-to-date and suitable tools for the analysis and interpretation of fire debris evidence. The survey of analytical tools covers works published in the 2008-2015 period. It includes sources of consensus-classified reference samples, current standard procedures, new proposals for sample extraction and analysis, and the most novel statistical tools. In addition, this review provides relevant knowledge on the distortion effects on the chemical fingerprints of ignitable liquids, which have to be considered during interpretation of results.

  18. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    PubMed

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.
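
    The paper does not publish its implementation. As a minimal sketch of the kind of throughput monitoring and naive forecasting such a BA tool might perform, the following uses pandas on hypothetical daily study counts; every name and number is invented.

      import numpy as np
      import pandas as pd

      # Hypothetical daily study counts for a radiology department
      rng = np.random.default_rng(42)
      days = pd.date_range("2016-01-04", periods=90, freq="D")
      counts = pd.Series(200 + 30 * np.sin(np.arange(90) / 7)
                         + rng.normal(0, 10, 90), index=days, name="studies")

      # Simple decision-support outputs: weekly trend and a naive forecast
      weekly_mean = counts.resample("W").mean()
      forecast = counts.rolling(window=7).mean().iloc[-1]
      print(weekly_mean.tail(3))
      print(f"naive 7-day moving-average forecast for tomorrow: {forecast:.0f} studies")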

  19. Contextual and Analytic Qualities of Research Methods Exemplified in Research on Teaching

    ERIC Educational Resources Information Center

    Svensson, Lennart; Doumas, Kyriaki

    2013-01-01

    The aim of the present article is to discuss contextual and analytic qualities of research methods. The arguments are specified in relation to research on teaching. A specific investigation is used as an example to illustrate the general methodological approach. It is argued that research methods should be carefully grounded in an understanding of…

  1. Tools and collaborative environments for bioinformatics research

    PubMed Central

    Giugno, Rosalba; Pulvirenti, Alfredo

    2011-01-01

    Advanced research requires intensive interaction among a multitude of actors, often possessing different expertise and usually working at a distance from each other. The field of collaborative research aims to establish suitable models and technologies to properly support these interactions. In this article, we first present the reasons for an interest of Bioinformatics in this context by also suggesting some research domains that could benefit from collaborative research. We then review the principles and some of the most relevant applications of social networking, with a special attention to networks supporting scientific collaboration, by also highlighting some critical issues, such as identification of users and standardization of formats. We then introduce some systems for collaborative document creation, including wiki systems and tools for ontology development, and review some of the most interesting biological wikis. We also review the principles of Collaborative Development Environments for software and show some examples in Bioinformatics. Finally, we present the principles and some examples of Learning Management Systems. In conclusion, we try to devise some of the goals to be achieved in the short term for the exploitation of these technologies. PMID:21984743

  2. A Climate Data Analytical Tool Used to Validate CERES Clouds Pixel Level Data

    NASA Astrophysics Data System (ADS)

    Chu, C.

    2016-12-01

    CERES provides top-of-the-atmosphere (TOA) measurements of both reflected solar and emitted thermal radiation. There are now 35 data-years of Clouds and the Earth's Radiant Energy System (CERES) observations across three satellites (Terra, Aqua, and Suomi NPP). These measurements are combined with clear-sky and cloud properties derived from imager data (MODIS for Terra and Aqua, VIIRS for Suomi NPP) to obtain flux profiles through the atmosphere and at the Earth's surface, yielding products for each satellite overpass as well as gridded products at hourly, daily, monthly, or climatological resolution. Because there are multiple versions of the various data products (Edition3 and Edition4 for Terra and Aqua, Edition1 for NPP), an efficient climate data analytical tool is needed to support validation and analysis. The characteristics of this environment are (1) a large number of parameters (around 1000), (2) a large data volume that increases daily (approaching a petabyte in total), (3) preprocessing of final or intermediate products before they are used for validation, (4) the ability for users to manipulate the data, such as filtering and differencing, to identify issues, and (5) a display capability to show work in progress. The tool has allowed CERES algorithm developers to browse large amounts of data and identify issues and trends in a short period of time. This poster will demonstrate how this analytical tool meets the above requirements through a search interface and a feature called "math" that processes and filters the data on the fly.
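
    The filtering and "math" features described map naturally onto array operations. The sketch below mimics them on hypothetical gridded flux arrays; it is not the CERES tool, and the edition labels are just stand-ins.

      import numpy as np

      # Hypothetical 1-degree gridded TOA shortwave flux from two product editions
      rng = np.random.default_rng(7)
      edition3 = rng.normal(100, 20, size=(180, 360))
      edition4 = edition3 + rng.normal(0.5, 2.0, size=(180, 360))

      diff = edition4 - edition3   # "math": difference two versions on the fly
      cloudy = edition3 > 120      # "filter": keep a subset of grid cells

      print(f"global mean Ed4-Ed3 difference: {diff.mean():+.2f} W/m^2")
      print(f"difference over 'cloudy' cells only: {diff[cloudy].mean():+.2f} W/m^2")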

  3. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs and analyses, and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called 'Analytical Tools for Thermal InfraRed Engineering' (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), and Noise Equivalent Temperature Difference (NETD). This paper describes the uses of the package and the physics used to derive the performance parameters.
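
    ATTIRE itself is not reproduced here, but one of the performance parameters it reports, NETD, follows from a standard relation: NETD = NER / (dL/dT), where the radiance derivative comes from the Planck function. A sketch under that assumption, with a placeholder NER value rather than a real sensor specification:

      import numpy as np

      H = 6.626e-34  # Planck constant [J s]
      C = 2.998e8    # speed of light [m/s]
      K = 1.381e-23  # Boltzmann constant [J/K]

      def planck_radiance(wl_m, temp_k):
          """Blackbody spectral radiance [W m^-2 sr^-1 m^-1]."""
          return (2 * H * C**2 / wl_m**5) / np.expm1(H * C / (wl_m * K * temp_k))

      def netd(ner, wl_m, temp_k, dT=0.01):
          """NETD = NER / (dL/dT), with dL/dT taken numerically at temp_k."""
          dL_dT = (planck_radiance(wl_m, temp_k + dT)
                   - planck_radiance(wl_m, temp_k - dT)) / (2 * dT)
          return ner / dL_dT

      # Illustrative thermal-IR band (10 um) and scene temperature (300 K)
      print(f"NETD = {netd(ner=2.0e4, wl_m=10e-6, temp_k=300.0):.3f} K")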

  4. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  5. Interactive Poster: A Proposal for Sharing User Requirements for Visual Analytic Tools

    SciTech Connect

    Scholtz, Jean

    2009-10-11

    Although many in the community have advocated user-centered evaluations for visual analytic environments, a significant barrier exists. The users targeted by the visual analytics community (law enforcement personnel, professional information analysts, financial analysts, health care analysts, etc.) are often inaccessible to researchers. These analysts are extremely busy and their work environments and data are often classified or at least confidential. Furthermore, their tasks often last weeks or even months. It is simply not feasible to do such long-term observations to understand their jobs. How then can we hope to gather enough information about the diverse user populations to understand their needs? Some researchers have been successful in working with different end-users, including the author. A reasonable approach, therefore, would be to find a way to share user information. This paper outlines a proposal for developing a handbook of user profiles for use by researchers, developers, and evaluators.

  6. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.

  7. An analytical procedure to assist decision-making in a government research organization

    Treesearch

    H. Dean Claxton; Giuseppe Rensi

    1972-01-01

    An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...

  8. RNA "traffic lights": an analytical tool to monitor siRNA integrity.

    PubMed

    Holzhauser, Carolin; Liebl, Renate; Goepferich, Achim; Wagenknecht, Hans-Achim; Breunig, Miriam

    2013-05-17

    The combination of thiazole orange and thiazole red as an internal energy transfer-based fluorophore pair in oligonucleotides provides an outstanding analytical tool to follow DNA/RNA hybridization through a distinct fluorescence color change from red to green. Herein, we demonstrate that this concept can be applied to small interfering RNA (siRNA) to monitor RNA integrity in living cells in real time with a remarkable dynamic range and excellent contrast ratios in cellular media. Furthermore, we show that our siRNA-sensors still possess their gene silencing function toward the knockdown of enhanced green fluorescent protein in CHO-K1 cells.
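
    The readout described is a two-channel color change. A trivial sketch of the kind of green/red contrast ratio one might compute from imaging data follows; all intensities are hypothetical, and which state dominates which channel depends on the probe design.

      def green_red_ratio(green, red):
          """Two-channel readout: a shift in the green/red intensity ratio
          signals a change of hybridization/integrity state (the direction
          of the shift depends on the probe design)."""
          return green / red

      # Hypothetical channel intensities (arbitrary units) from imaging
      state_a = green_red_ratio(green=120.0, red=900.0)  # e.g. energy transfer active
      state_b = green_red_ratio(green=850.0, red=140.0)  # e.g. energy transfer lost
      print(f"state A: G/R = {state_a:.2f}   state B: G/R = {state_b:.2f}")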

  9. Analytical tools for the analysis of β-carotene and its degradation products.

    PubMed

    Stutz, H; Bresgen, N; Eckl, P M

    2015-05-01

    β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has focused attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects in two proband groups, that is, cigarette smokers and asbestos-exposed workers. Besides other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene, which is beneficial for radical scavenging but also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies for standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography ((U)HPLC), and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method validation.

  10. Analytical tools for the analysis of β-carotene and its degradation products

    PubMed Central

    Stutz, H.; Bresgen, N.; Eckl, P. M.

    2015-01-01

    β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has focused attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects in two proband groups, that is, cigarette smokers and asbestos-exposed workers. Besides other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene, which is beneficial for radical scavenging but also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies for standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography ((U)HPLC), and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method

  11. miRTarVis+: Web-based interactive visual analytics tool for microRNA target predictions.

    PubMed

    L'Yi, Sehi; Jung, Daekyoung; Oh, Minsik; Kim, Bohyoung; Freishtat, Robert J; Giri, Mamta; Hoffman, Eric; Seo, Jinwook

    2017-07-15

    In this paper, we present miRTarVis+, a Web-based interactive visual analytics tool for miRNA target predictions and integrative analyses of multiple prediction results. Various microRNA (miRNA) target prediction algorithms have been developed to improve sequence-based miRNA target prediction by exploiting miRNA-mRNA expression profile data. There are also a few analytics tools to help researchers predict targets of miRNAs. However, there is still a need to improve the performance of miRNA prediction algorithms and, more importantly, to provide interactive visualization tools for integrative analysis of multiple prediction results. miRTarVis+ has an intuitive interface to support the analysis pipeline of load, filter, predict, and visualize. It can predict targets of miRNA by adopting Bayesian inference and maximal information-based nonparametric exploration (MINE) analyses, as well as conventional correlation and mutual information analyses. miRTarVis+ supports an integrative analysis of multiple prediction results by providing an overview of multiple prediction results and then allowing users to examine a selected miRNA-mRNA network in an interactive treemap and node-link diagram. To evaluate the effectiveness of miRTarVis+, we conducted two case studies using miRNA-mRNA expression profile data of asthma and breast cancer patients and demonstrated that miRTarVis+ helps users more comprehensively analyze targets of miRNA from miRNA-mRNA expression profile data. miRTarVis+ is available at http://hcil.snu.ac.kr/research/mirtarvisplus. Copyright © 2017 Elsevier Inc. All rights reserved.
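
    Among the methods the tool supports is conventional correlation analysis. A minimal sketch of that single step - treating a strong negative Pearson correlation between miRNA and mRNA expression as weak evidence of targeting - is shown below on synthetic profiles; it is not the tool's code.

      import numpy as np
      from scipy import stats

      # Hypothetical expression profiles across 12 matched samples
      rng = np.random.default_rng(3)
      mirna = rng.normal(5, 1, 12)
      mrna_target = 10 - 0.8 * mirna + rng.normal(0, 0.3, 12)  # repressed by the miRNA
      mrna_unrelated = rng.normal(7, 1, 12)

      for name, mrna in [("candidate", mrna_target), ("unrelated", mrna_unrelated)]:
          r, p = stats.pearsonr(mirna, mrna)
          note = "  <- negative correlation, consistent with targeting" if r < -0.5 else ""
          print(f"{name}: r = {r:+.2f}, p = {p:.3f}{note}")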

  12. Capillary electrophoresis as an analytical tool for monitoring nicotine in ATF regulated tobacco products.

    PubMed

    Ralapati, S

    1997-07-18

    Tobacco products are classified at different excise tax rates according to the Code of Federal Regulations. These include cigars, cigarettes, pipe tobacco, roll-your-own tobacco, chewing tobacco, and snuff. Nicotine is the primary determinant of what constitutes a tobacco product from a regulatory standpoint. Determination of nicotine, therefore, is of primary importance and interest to ATF. Since nicotine is also the most abundant alkaloid found in tobacco, comprising about 98% of the total alkaloid content, a rapid method for the determination of nicotine in ATF-regulated products is desirable. Capillary electrophoresis (CE), as an analytical technique, is rapidly gaining importance, capturing the interest of analysts in several areas. Its unique and powerful capabilities, including high resolution and short analysis times, make it a powerful analytical tool in the regulatory area as well. Preliminary studies using a 25 mM sodium phosphate buffer, pH 2.5, with detection at 260 nm have yielded promising results for the analysis of nicotine in tobacco products. The application of an analytical method for the determination of nicotine by CE to ATF-regulated tobacco products will be presented.
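
    The quantitation step implied by such an assay, an external calibration curve relating peak area to concentration, can be sketched in a few lines; all peak areas and concentrations below are hypothetical.

      import numpy as np

      # Hypothetical calibration: nicotine standards (ug/mL) vs. CE peak area
      conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
      area = np.array([1210.0, 2380.0, 6050.0, 12100.0, 24300.0])

      slope, intercept = np.polyfit(conc, area, 1)  # linear least squares
      unknown_area = 8900.0                         # measured sample peak
      estimate = (unknown_area - intercept) / slope
      print(f"estimated nicotine concentration: {estimate:.1f} ug/mL")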

  13. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Technical Reports Server (NTRS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry, and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address the scientific community's needs in utilizing large quantities of data. BDA analyzes different scientific domain applications (e.g., earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to different algorithms (e.g., machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, with detailed insights into a few of them.

  14. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Astrophysics Data System (ADS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-05-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry, and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address the scientific community's needs in utilizing large quantities of data. BDA analyzes different scientific domain applications (e.g., earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to different algorithms (e.g., machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, with detailed insights into a few of them.
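
    As a toy instance of one technique named above, support vector machines for classification, the following scikit-learn sketch trains an SVM on synthetic data; it is illustrative only and not an RDA recommendation or deliverable.

      from sklearn.datasets import make_classification
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC

      # Synthetic stand-in for an earth-science classification task
      X, y = make_classification(n_samples=500, n_features=8, n_informative=5,
                                 random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

      clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
      print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")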

  15. New research tools for urogenital schistosomiasis.

    PubMed

    Rinaldi, Gabriel; Young, Neil D; Honeycutt, Jared D; Brindley, Paul J; Gasser, Robin B; Hsieh, Michael H

    2015-03-15

    Approximately 200,000,000 people have schistosomiasis (schistosome infection). Among the schistosomes, Schistosoma haematobium is responsible for the most infections, which are present in 110 million people globally, mostly in sub-Saharan Africa. This pathogen causes an astonishing breadth of sequelae: hematuria, anemia, dysuria, stunting, uremia, bladder cancer, urosepsis, and human immunodeficiency virus coinfection. Refined estimates of the impact of schistosomiasis on quality of life suggest that it rivals malaria. Despite S. haematobium's importance, relevant research has lagged. Here, we review advances that will deepen knowledge of S. haematobium. Three sets of breakthroughs will accelerate discoveries in the pathogenesis of urogenital schistosomiasis (UGS): (1) comparative genomics, (2) the development of functional genomic tools, and (3) the use of animal models to explore S. haematobium-host interactions. Comparative genomics for S. haematobium is feasible, given the sequencing of multiple schistosome genomes. Features of the S. haematobium genome that are conserved among platyhelminth species and others that are unique to S. haematobium may provide novel diagnostic and drug targets for UGS. Although there are technical hurdles, the integrated use of these approaches can elucidate host-pathogen interactions during this infection and can inform the development of techniques for investigating schistosomes in their human and snail hosts and the development of therapeutics and vaccines for the control of UGS. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. VAO Tools Enhance CANDELS Research Productivity

    NASA Astrophysics Data System (ADS)

    Greene, Gretchen; Donley, J.; Rodney, S.; LAZIO, J.; Koekemoer, A. M.; Busko, I.; Hanisch, R. J.; VAO Team; CANDELS Team

    2013-01-01

    The formation of galaxies and their co-evolution with black holes through cosmic time are prominent areas in current extragalactic astronomy. New methods in science research are building upon collaborations between scientists and archive data centers which span large volumes of multi-wavelength and heterogeneous data. A successful example of this form of teamwork is demonstrated by the CANDELS (Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey) and the Virtual Astronomical Observatory (VAO) collaboration. The CANDELS project archive data provider services are registered and discoverable in the VAO through an innovative web based Data Discovery Tool, providing a drill down capability and cross-referencing with other co-spatially located astronomical catalogs, images and spectra. The CANDELS team is working together with the VAO to define new methods for analyzing Spectral Energy Distributions of galaxies containing active galactic nuclei, and helping to evolve advanced catalog matching methods for exploring images of variable depths, wavelengths and resolution. Through the publication of VOEvents, the CANDELS project is publishing data streams for newly discovered supernovae that are bright enough to be followed from the ground.
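
    The catalog cross-matching mentioned above can be approximated with astropy's nearest-neighbour sky matching; the sketch below uses randomly placed hypothetical sources rather than CANDELS data.

      import numpy as np
      from astropy import units as u
      from astropy.coordinates import SkyCoord

      # Hypothetical source positions from two catalogs (degrees)
      rng = np.random.default_rng(1)
      cat_a = SkyCoord(ra=rng.uniform(150.0, 150.2, 50) * u.deg,
                       dec=rng.uniform(2.0, 2.2, 50) * u.deg)
      cat_b = SkyCoord(ra=rng.uniform(150.0, 150.2, 200) * u.deg,
                       dec=rng.uniform(2.0, 2.2, 200) * u.deg)

      idx, sep2d, _ = cat_a.match_to_catalog_sky(cat_b)  # nearest neighbour in cat_b
      matched = sep2d < 1.0 * u.arcsec                   # accept only close pairs
      print(f"{matched.sum()} of {len(cat_a)} sources matched within 1 arcsec")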

  17. Some Tooling for Manufacturing Research Reactor Fuel Plates

    SciTech Connect

    Knight, R.W.

    1999-10-03

    This paper will discuss some of the tooling necessary to manufacture aluminum-based research reactor fuel plates. Most of this tooling is intended for use in a high-production facility. Some of the tools shown have manufactured more than 150,000 pieces. The only maintenance has been sharpening. With careful design, tools can be made to accommodate the manufacture of several different fuel elements, thus, reducing tooling costs and maintaining tools that the operators are trained to use. An important feature is to design the tools using materials with good lasting quality. Good tools can increase return on investment.

  18. Analytical Tools and Databases for Metagenomics in the Next-Generation Sequencing Era

    PubMed Central

    Kim, Mincheol; Lee, Ki-Hyun; Yoon, Seok-Whan; Kim, Bong-Soo; Chun, Jongsik

    2013-01-01

    Metagenomics has become one of the indispensable tools in microbial ecology for the last few decades, and a new revolution in metagenomic studies is now about to begin, with the help of recent advances of sequencing techniques. The massive data production and substantial cost reduction in next-generation sequencing have led to the rapid growth of metagenomic research both quantitatively and qualitatively. It is evident that metagenomics will be a standard tool for studying the diversity and function of microbes in the near future, as fingerprinting methods did previously. As the speed of data accumulation is accelerating, bioinformatic tools and associated databases for handling those datasets have become more urgent and necessary. To facilitate the bioinformatics analysis of metagenomic data, we review some recent tools and databases that are used widely in this field and give insights into the current challenges and future of metagenomics from a bioinformatics perspective. PMID:24124405

  19. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining

    PubMed Central

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-01-01

    Background New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform driven by database management systems to perform bi-directional data processing-to-visualizations with declarative querying capabilities is needed. Results We developed ProteoLens as a JAVA-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Languages (DDL) and Data Manipulation languages (DML) may be specified. The robust query languages embedded directly within the visualization software help users to bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network represented data in standard Graph Modeling Language (GML) formats, and this enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens enables the de-coupling of complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions etc; 2) applying network data association rules to build the network and perform the visual annotation of graph

  20. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining.

    PubMed

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-08-12

    New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform driven by database management systems to perform bi-directional data processing-to-visualizations with declarative querying capabilities is needed. We developed ProteoLens as a JAVA-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Languages (DDL) and Data Manipulation languages (DML) may be specified. The robust query languages embedded directly within the visualization software help users to bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network represented data in standard Graph Modeling Language (GML) formats, and this enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens enables the de-coupling of complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions etc; 2) applying network data association rules to build the network and perform the visual annotation of graph nodes and edges
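
    ProteoLens itself is a JAVA tool. As a hedged Python analogue of its core pattern - mapping database rows and attribute tables onto an annotated network and emitting standard GML - the sketch below uses networkx, with invented rows standing in for SQL query results.

      import networkx as nx

      # Hypothetical rows as they might come back from a SQL query:
      # (protein_a, protein_b, interaction_score)
      rows = [("TP53", "MDM2", 0.98), ("TP53", "EP300", 0.91), ("MDM2", "UBE2D1", 0.72)]

      G = nx.Graph()
      for a, b, score in rows:
          G.add_edge(a, b, score=score)  # data association: edge attribute

      # Annotate nodes, e.g. with expression levels from another table
      expression = {"TP53": 2.4, "MDM2": 1.1, "EP300": 0.7, "UBE2D1": 1.9}
      nx.set_node_attributes(G, expression, name="expression")

      nx.write_gml(G, "network.gml")  # GML preserves the annotations
      print(nx.read_gml("network.gml").nodes(data=True))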

  1. Online Analytical Processing (OLAP): A Fast and Effective Data Mining Tool for Gene Expression Databases

    PubMed Central

    2005-01-01

    Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD). A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB. PMID:16046824
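
    The OLAP-style aggregation described (build a cube over gene, time point, and treatment dimensions, then slice it) can be mimicked in pandas. The sketch below does so on a hypothetical long-format expression table; it is unrelated to SGMD or Analysis Services 2000.

      import pandas as pd

      # Hypothetical long-format expression measurements
      df = pd.DataFrame({
          "gene":      ["G1", "G1", "G2", "G2", "G1", "G2"],
          "timepoint": ["6h", "12h", "6h", "12h", "6h", "12h"],
          "treatment": ["infected"] * 4 + ["control"] * 2,
          "expr":      [2.1, 3.5, 0.9, 0.4, 1.0, 1.1],
      })

      # OLAP-style cube: aggregate expr over gene x treatment x timepoint,
      # then "slice" one treatment (missing cells appear as NaN)
      cube = df.pivot_table(values="expr", index="gene",
                            columns=["treatment", "timepoint"], aggfunc="mean")
      print(cube)
      print(cube["infected"])  # slice: infected samples only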

  2. ENTVis: A Visual Analytic Tool for Entropy-Based Network Traffic Anomaly Detection.

    PubMed

    Zhou, Fangfang; Huang, Wei; Zhao, Ying; Shi, Yang; Liang, Xing; Fan, Xiaoping

    2015-01-01

    Entropy-based traffic metrics have received substantial attention in network traffic anomaly detection because entropy can provide fine-grained metrics of traffic distribution characteristics. However, some practical issues--such as ambiguity, lack of detailed distribution information, and a large number of false positives--affect the application of entropy-based traffic anomaly detection. In this work, we introduce a visual analytic tool called ENTVis to help users understand entropy-based traffic metrics and achieve accurate traffic anomaly detection. ENTVis provides three coordinated views and rich interactions to support a coherent visual analysis on multiple perspectives: the timeline group view for perceiving situations and finding hints of anomalies, the Radviz view for clustering similar anomalies in a period, and the matrix view for understanding traffic distributions and diagnosing anomalies in detail. Several case studies have been performed to verify the usability and effectiveness of our method. A further evaluation was conducted via expert review.
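
    The metric at the heart of entropy-based detection is the (often normalized) Shannon entropy of a traffic feature distribution. The sketch below computes it for hypothetical per-minute source-IP counts, showing the entropy drop that a scan-like skew produces.

      import math
      from collections import Counter

      def normalized_entropy(counts):
          """Shannon entropy of a count distribution, normalized to [0, 1]."""
          total = sum(counts)
          probs = [c / total for c in counts if c > 0]
          h = -sum(p * math.log2(p) for p in probs)
          return h / math.log2(len(probs)) if len(probs) > 1 else 0.0

      # Hypothetical per-minute source-IP counts: balanced vs. skewed traffic
      normal = Counter({"10.0.0.1": 120, "10.0.0.2": 95, "10.0.0.3": 110})
      attack = Counter({"10.0.0.9": 950, "10.0.0.2": 20, "10.0.0.3": 30})

      print(f"normal minute: H = {normalized_entropy(normal.values()):.2f}")
      print(f"attack minute: H = {normalized_entropy(attack.values()):.2f}")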

  3. Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy

    PubMed Central

    Giraudeau, Patrick; Frydman, Lucio

    2016-01-01

    Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan manner. A decade ago a so-called “ultrafast” (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlations in a single scan. During the intervening years the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool with an expanding scope of applications. The present review summarizes the principles and the main developments that have contributed to the success of this approach, and focuses on applications recently demonstrated in various areas of analytical chemistry, from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and quantitative applications. PMID:25014342

  4. The combined use of analytical tools for exploring tetanus toxin and tetanus toxoid structures.

    PubMed

    Bayart, Caroline; Peronin, Sébastien; Jean, Elisa; Paladino, Joseph; Talaga, Philippe; Borgne, Marc Le

    2017-06-01

    Aldehyde detoxification is a process used to convert a toxin into a toxoid for vaccine applications. In the case of tetanus toxin (TT), formaldehyde is used to obtain the tetanus toxoid (TTd), which is used either in the tetanus vaccine or as a carrier protein in conjugate vaccines. Several studies have already been conducted to better understand the exact mechanism of this detoxification. Those studies led to the identification of a number of formaldehyde-induced modifications in lab-scale TTd samples. To obtain greater insight into the changes induced by formaldehyde, we used three industrial TTd batches to identify repeatable modifications arising in the detoxification process. Our strategy was to combine seven analytical tools to map these changes. Mass spectrometry (MS), a colorimetric test, and amino acid analysis (AAA) were used to study modifications of amino acids. SDS-PAGE, asymmetric flow field-flow fractionation (AF4), fluorescence spectroscopy, and circular dichroism (CD) were used to study formaldehyde modifications of the whole protein structure. We identified 41 formaldehyde-induced modifications across the 1315-amino-acid primary sequence of TT. Of these, five modifications on lysine residues were repeatable across TTd batches. Changes in protein conformation were also observed using the SDS-PAGE, AF4, and CD techniques. Each analytical tool brought a piece of information regarding formaldehyde-induced modifications, and together these methods provided a comprehensive overview of the structural changes that occur with detoxification. These results could be the first step toward site-directed TT mutagenesis studies that may enable the production of a non-toxic equivalent protein without using formaldehyde. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Evaluating the Development of Science Research Skills in Work-Integrated Learning through the Use of Workplace Science Tools

    ERIC Educational Resources Information Center

    McCurdy, Susan M.; Zegwaard, Karsten E.; Dalgety, Jacinta

    2013-01-01

    Concept understanding, the development of analytical skills and a research mind set are explored through the use of academic tools common in a tertiary science education and relevant work-integrated learning (WIL) experiences. The use and development of the tools; laboratory book, technical report, and literature review are examined by way of…

  6. Research education: findings of a study of teaching-learning research using multiple analytical perspectives.

    PubMed

    Vandermause, Roxanne; Barbosa-Leiker, Celestina; Fritz, Roschelle

    2014-12-01

    This multimethod, qualitative study provides results for educators of nursing doctoral students to consider. Combining the expertise of an empirical analytical researcher (who uses statistical methods) and an interpretive phenomenological researcher (who uses hermeneutic methods), a course was designed that would place doctoral students in the midst of multiparadigmatic discussions while they learned fundamental research methods. Field notes and iterative analytical discussions led to patterns and themes that highlight the value of this innovative pedagogical application. Using content analysis and interpretive phenomenological approaches, and working together with one of the students, the authors analyzed data from field notes recorded in real time over the period the course was offered. This article describes the course and the study analysis, and offers the pedagogical experience as transformative. A link to a sample syllabus is included in the article. The results encourage educators of doctoral nursing students to focus educational practice on multiple methodological perspectives. Copyright 2014, SLACK Incorporated.

  7. A Multi-facetted Visual Analytics Tool for Exploratory Analysis of Human Brain and Function Datasets

    PubMed Central

    Angulo, Diego A.; Schneider, Cyril; Oliver, James H.; Charpak, Nathalie; Hernandez, Jose T.

    2016-01-01

    Brain research typically requires large amounts of data from different sources, and often of different nature. The use of different software tools adapted to the nature of each data source can make research work cumbersome and time consuming. It follows that data is not often used to its fullest potential thus limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-facetted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process of individuals or groups. BRAVIZ facilitates exploration of trends or relationships to gain an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data. PMID:27601990

  8. A Multi-facetted Visual Analytics Tool for Exploratory Analysis of Human Brain and Function Datasets.

    PubMed

    Angulo, Diego A; Schneider, Cyril; Oliver, James H; Charpak, Nathalie; Hernandez, Jose T

    2016-01-01

    Brain research typically requires large amounts of data from different sources, and often of different nature. The use of different software tools adapted to the nature of each data source can make research work cumbersome and time consuming. It follows that data is not often used to its fullest potential thus limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-facetted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process of individuals or groups. BRAVIZ facilitates exploration of trends or relationships to gain an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data.

  9. Streamlining Research by Using Existing Tools

    PubMed Central

    Greene, Sarah M.; Baldwin, Laura-Mae; Dolor, Rowena J.; Thompson, Ella; Neale, Anne Victoria

    2011-01-01

    Over the past two decades, the health research enterprise has matured rapidly, and many recognize an urgent need to translate pertinent research results into practice, to help improve the quality, accessibility, and affordability of U.S. health care. Streamlining research operations would speed translation, particularly for multi-site collaborations. However, the culture of research discourages reusing or adapting existing resources or study materials. Too often, researchers start studies and multi-site collaborations from scratch—reinventing the wheel. Our team developed a compendium of resources to address inefficiencies and researchers’ unmet needs and compiled them in a research toolkit website (www.ResearchToolkit.org). Through our work, we identified philosophical and operational issues related to disseminating the toolkit to the research community. We explore these issues here, with implications for the nation’s investment in biomedical research. PMID:21884513

  10. Drugs on the internet, part IV: Google's Ngram viewer analytic tool applied to drug literature.

    PubMed

    Montagne, Michael; Morgan, Melissa

    2013-04-01

    Google Inc.'s digitized book library can be searched by keywords and phrases over a five-century time frame. Application of the Ngram Viewer to the drug literature was assessed for its utility as a research tool. The results appear promising as a method for noting changes in the popularity of specific drugs over time, the historical epidemiology of drug use and misuse, and the adoption and regulation of drug technologies.

  11. Multiple Theoretical Lenses as an Analytical Strategy in Researching Group Discussions

    ERIC Educational Resources Information Center

    Berge, Maria; Ingerman, Åke

    2017-01-01

    Background: In science education today, there is an emerging focus on what is happening in situ, making use of an array of analytical traditions. Common practice is to use one specific analytical framing within a research project, but there are projects that make use of multiple analytical framings to further the understanding of the same data,…

  13. Moving research tools into practice: the successes and challenges in promoting uptake of classification tools.

    PubMed

    Cunningham, Barbara Jane; Hidecker, Mary Jo Cooley; Thomas-Stonell, Nancy; Rosenbaum, Peter

    2017-01-27

    In this paper, we present our experiences - both successes and challenges - in implementing evidence-based classification tools into clinical practice. We also make recommendations for others wanting to promote the uptake and application of new research-based assessment tools. We first describe classification systems and the benefits of using them in both research and practice. We then present a theoretical framework from Implementation Science to report strategies we have used to implement two research-based classification tools into practice. We also illustrate some of the challenges we have encountered by reporting results from an online survey investigating 58 speech-language pathologists' knowledge and use of the Communication Function Classification System (CFCS), a new tool to classify children's functional communication skills. We offer recommendations for researchers wanting to promote the uptake of new tools in clinical practice. Specifically, we identify structural, organizational, innovation, practitioner, and patient-related factors that we recommend researchers address in the design of implementation interventions. The roles and responsibilities of both researchers and clinicians in making implementation science a success are presented. Implications for rehabilitation: Promoting uptake of new and evidence-based tools into clinical practice is challenging. Implementation science can help researchers to close the knowledge-to-practice gap. Using concrete examples, we discuss our experiences in implementing evidence-based classification tools into practice within a theoretical framework. Recommendations are provided for researchers wanting to implement new tools in clinical practice. Implications for researchers and clinicians are presented.

  14. Taking turns across channels: Conversation-analytic tools in animal communication.

    PubMed

    Fröhlich, Marlen

    2017-05-10

    In the quest to bridge the gulf between the fields of linguistics and animal communication, interest has recently been drawn to turn-taking behavior in social interaction. Vocal turn-taking is the core form of language usage in humans and has been examined in numerous species of birds and primates. Recent studies on great apes have shown that they engage in a bodily form of turn-taking, gestural turn-taking, to achieve mutual communicative goals. However, most studies on turn-taking have neglected the fact that signals are predominantly perceived and produced in a multimodal format. Here, I propose that research on animal communication may benefit from a more holistic and dynamic approach: studying turn-taking using a multimodal, conversation-analytic framework. I will discuss recent comparative research that implemented this approach via a specific set of parameters. In sum, I argue that a conversation-analytic framework might help substantially to pinpoint the ways in which crucial components of language are embodied in the 'human interaction engine'. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Assessment of catchments' flooding potential: a physically-based analytical tool

    NASA Astrophysics Data System (ADS)

    Botter, G.; Basso, S.; Schirmer, M.

    2016-12-01

    The assessment of the flooding potential of river catchments is critical in many research and applied fields, ranging from river science and geomorphology to urban planning and the insurance industry. Predicting the magnitude and frequency of floods is key to preventing and mitigating the negative effects of high flows, and has therefore long been a focus of hydrologic research. Here, the recurrence intervals of seasonal flow maxima are estimated through a novel physically-based analytic approach, which links the extremal distribution of streamflows to the stochastic dynamics of daily discharge. An analytical expression of the seasonal flood-frequency curve is provided, whose parameters embody climate and landscape attributes of the contributing catchment and can be estimated from daily rainfall and streamflow data. Only one parameter, which expresses catchment saturation prior to rainfall events, needs to be calibrated on the observed maxima. The method has been tested in a set of catchments featuring heterogeneous daily flow regimes. The model is able to reproduce the characteristic shapes of flood-frequency curves emerging in erratic and persistent flow regimes and provides good estimates of seasonal flow maxima in different climatic regions. Performance remains stable when estimating the magnitude of events with return times longer than the available sample size. This makes the approach especially valuable for regions affected by data scarcity.
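
    The paper's closed-form flood-frequency expression is not reproduced in this record; as a minimal empirical counterpart, flow maxima and their recurrence intervals can be estimated from a daily discharge series using Weibull plotting positions (an illustrative Python sketch, not the authors' model; seasonal_flood_frequency is a hypothetical helper name):

        import numpy as np
        import pandas as pd

        def seasonal_flood_frequency(daily_q: pd.Series) -> pd.DataFrame:
            # daily_q: discharge indexed by a pandas DatetimeIndex.
            # Annual maxima stand in here for the paper's seasonal maxima.
            maxima = daily_q.groupby(daily_q.index.year).max().sort_values(ascending=False)
            n = len(maxima)
            ranks = np.arange(1, n + 1)
            # Weibull plotting position: empirical return period T = (n + 1) / rank.
            return pd.DataFrame({"max_flow": maxima.values,
                                 "return_period_yr": (n + 1) / ranks})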

  16. Studying Behaviors Among Neurosurgery Residents Using Web 2.0 Analytic Tools.

    PubMed

    Davidson, Benjamin; Alotaibi, Naif M; Guha, Daipayan; Amaral, Sandi; Kulkarni, Abhaya V; Lozano, Andres M

    2017-06-02

    Web 2.0 technologies (e.g., blogs, social networks, and wikis) are increasingly being used by medical schools and postgraduate training programs as tools for information dissemination. These technologies offer the unique opportunity to track metrics of user engagement and interaction. Here, we employ Web 2.0 tools to assess academic behaviors among neurosurgery residents. We performed a retrospective review of all educational lectures, part of the core Neurosurgery Residency curriculum at the University of Toronto, posted on our teaching website (www.TheBrainSchool.net). Our website was developed using publicly available Web 2.0 platforms. Lecture usage was assessed by the number of clicks, and associations were explored with lecturer academic position, timing of examinations, and lecture/subspecialty topic. The overall number of clicks on 77 lectures was 1079. Most of these clicks occurred during the in-training examination month (43%). Click numbers were significantly higher for lectures presented by faculty (mean = 18.6, standard deviation ± 4.1) than for those delivered by residents (mean = 8.4, standard deviation ± 2.1) (p = 0.031). Lectures covering topics in functional neurosurgery received the most clicks (47%), followed by pediatric neurosurgery (22%). This study demonstrates the value of Web 2.0 analytic tools in examining resident study behavior. Residents tend to "cram" by downloading lectures in the month of training examinations and display a preference for faculty-delivered lectures. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  17. Microfluidic tools for cell biological research

    PubMed Central

    Velve-Casquillas, Guilhem; Le Berre, Maël; Piel, Matthieu; Tran, Phong T.

    2010-01-01

    Summary Microfluidic technology is creating powerful tools for cell biologists to control the complete cellular microenvironment, leading to new questions and new discoveries. We review here the basic concepts and methodologies in designing microfluidic devices, and their diverse cell biological applications. PMID:21152269

  18. Sociometry: Tools for Research and Practice.

    ERIC Educational Resources Information Center

    Treadwell, Thomas W.; Kumar, V. K.; Stein, Steven A.; Prosnick, Kevin

    1997-01-01

    Reviews basic sociometric tools and their analysis, provides information on computer programs to analyze sociometric data, and briefly examines considerations in conducting sociometric investigations. Looks at the social atom (significant others), constructing sociometry questions, and offers an analysis of individual status and interactional…

  19. Tools for Ephemeral Gully Erosion Process Research

    USDA-ARS?s Scientific Manuscript database

    Techniques to quantify ephemeral gully erosion have been identified by the USDA Natural Resources Conservation Service (NRCS) as one of the gaps in current erosion assessment tools. One reason that may have contributed to this technology gap is the difficulty of quantifying changes in channel geometry to asses...

  20. Key statistical and analytical issues for evaluating treatment effects in periodontal research.

    PubMed

    Tu, Yu-Kang; Gilthorpe, Mark S

    2012-06-01

    Statistics is an indispensable tool for evaluating treatment effects in clinical research. Due to the complexities of periodontal disease progression and data collection, statistical analyses for periodontal research have been a great challenge for both clinicians and statisticians. The aim of this article is to provide an overview of several basic, but important, statistical issues related to the evaluation of treatment effects and to clarify some common statistical misconceptions. Some of these issues are general, concerning many disciplines, and some are unique to periodontal research. We first discuss several statistical concepts that have sometimes been overlooked or misunderstood by periodontal researchers. For instance, decisions about whether to use the t-test or analysis of covariance, or whether to use parametric tests such as the t-test or its non-parametric counterpart, the Mann-Whitney U-test, have perplexed many periodontal researchers. We also describe more advanced methodological issues that have sometimes been overlooked by researchers. For instance, the phenomenon of regression to the mean is a fundamental issue to be considered when evaluating treatment effects, and collinearity amongst covariates is a conundrum that must be resolved when explaining and predicting treatment effects. Quick and easy solutions to these methodological and analytical issues are not always available in the literature, and careful statistical thinking is paramount when conducting useful and meaningful research. © 2012 John Wiley & Sons A/S.
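
    The parametric versus non-parametric choice discussed in this record takes only a few lines to demonstrate in SciPy (synthetic data for illustration only, not from the review):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Hypothetical probing-depth reductions (mm) in two treatment arms.
        control = rng.normal(loc=0.8, scale=0.5, size=30)
        treated = rng.normal(loc=1.2, scale=0.5, size=30)

        # Parametric: two-sample t-test (assumes roughly normal data).
        t_stat, t_p = stats.ttest_ind(treated, control)

        # Non-parametric counterpart: Mann-Whitney U-test (rank-based).
        u_stat, u_p = stats.mannwhitneyu(treated, control, alternative="two-sided")

        print(f"t-test:       t={t_stat:.2f}, p={t_p:.4f}")
        print(f"Mann-Whitney: U={u_stat:.1f}, p={u_p:.4f}")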

  1. Applying decision-making tools to national e-waste recycling policy: an example of Analytic Hierarchy Process.

    PubMed

    Lin, Chun-Hsu; Wen, Lihchyi; Tsai, Yue-Mi

    2010-05-01

    As policy making is in essence a process of discussion, decision-making tools have in many cases been proposed to resolve differences of opinion among the different parties. In our project that sought to promote a country's performance in recycling, we used the Analytic Hierarchy Process (AHP) to evaluate the possibilities and determine the priority of adding new mandatory recycled waste, also referred to as Due Recycled Wastes, from candidate waste appliances. The evaluation process started with the collection of data based on telephone interviews and field investigations to understand the behavior of consumers as well as their overall opinions regarding the disposal of certain waste appliances. With the data serving as background information, the research team then implemented the Analytic Hierarchy Process using the information that formed an incomplete hierarchy structure in order to determine the priority for recycling. Since the number of objects to be evaluated exceeded the number that the AHP researchers had suggested, we reclassified the objects into four groups and added one more level of pair-wise comparisons, which substantially reduced the inconsistency in the judgment of the AHP participants. The project was found to serve as a flexible and achievable application of AHP to the environmental policy-making process. In addition, based on the project's outcomes as a whole, the research team drew conclusions regarding the government's need to take back 15 of the items evaluated, and suggested instruments that could be used or recycling regulations that could be changed in the future. Further analysis of the top three items recommended by the results of the evaluation for recycling, namely, Compact Disks, Cellular Phones and Computer Keyboards, was then conducted to clarify their concrete feasibility. After the trial period for recycling ordered by the Taiwan Environmental Protection Administration, only Computer
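
    At its core, AHP turns each pairwise comparison matrix into priority weights via its principal eigenvector, with a consistency ratio as a sanity check. A minimal NumPy sketch (the 3x3 criteria and judgments below are hypothetical, not the project's actual hierarchy):

        import numpy as np

        def ahp_priorities(pairwise: np.ndarray):
            # Priority weights and consistency ratio for one comparison matrix.
            n = pairwise.shape[0]
            eigvals, eigvecs = np.linalg.eig(pairwise)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()                           # principal eigenvector -> weights
            ci = (eigvals[k].real - n) / (n - 1)   # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random indices (subset)
            return w, ci / ri                      # CR < 0.1 is conventionally acceptable

        # Hypothetical comparison of three recycling criteria (cost, hazard, volume).
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        weights, cr = ahp_priorities(A)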

  2. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    SciTech Connect

    Bjoerklund, Anna

    2012-01-15

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with a focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. Research highlights: ▶ LCA was explored as an analytical tool in an SEA process for municipal energy planning. ▶ The process also integrated LCA with scenario planning and public participation. ▶ Benefits of using LCA were a systematic framework and a wider systems perspective. ▶ Integration of the tools required some methodological challenges to be solved. ▶ This proved an innovative approach to defining alternatives and the scope of assessment.

  3. The Child Diary as a Research Tool

    ERIC Educational Resources Information Center

    Lamsa, Tiina; Ronka, Anna; Poikonen, Pirjo-Liisa; Malinen, Kaisa

    2012-01-01

    The aim of this article is to introduce the use of the child diary as a method in daily diary research. The child diary, a structured booklet in which children's parents and day-care personnel (N = 54 children) reported their observations, is evaluated by describing the research process and detailing its structure. The participants reported the…

  4. Chromatography as an analytical tool for selected antibiotic classes: a reappraisal addressed to pharmacokinetic applications.

    PubMed

    Marzo, A; Dal Bo, L

    1998-07-03

    The first antibiotic discovered, penicillin, appeared on the market just after the Second World War. Intensive research in subsequent years led to the discovery and development of cephalosporins, aminoglycosides, tetracyclines and rifamycin. The chemotherapeutic quinolones and the more recently discovered fluoroquinolones have added promising new therapeutic weapons to fight the microbial challenge. The major role pharmacokinetics has played in developing these compounds should be highlighted. Plasma concentration-time profiles and the therapeutic activity evoked by these compounds allow the therapeutic window, doses and dose turnovers to be appropriately defined, as well as possible dose adjustments to be made in renal failure. The pharmacokinetics of antimicrobial agents were initially explored by using microbiological methods, but these lack specificity. The HPLC technique with UV, fluorometric, electrochemical and, in some cases, mass spectrometry detection has satisfactorily solved the problem of antimicrobial agent assay for pharmacokinetic, bioavailability and bioequivalence purposes alike. Indeed, in these studies, plasma concentrations of the given analyte must be followed up for a period > or = 3 times the half-life, which calls for specific, sensitive assays. In this review, the authors describe the analytical methods employed in the pharmacokinetics of antibiotics, including some chemotherapeutic agents which are used in medical practice as alternatives to antibiotics. The pharmacokinetic characteristics of each class of drugs are also briefly described, and some historical and chemical notes on the various classes are given.

  5. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  6. Database tools in genetic diseases research.

    PubMed

    Bianco, Anna Monica; Marcuzzi, Annalisa; Zanin, Valentina; Girardelli, Martina; Vuch, Josef; Crovella, Sergio

    2013-02-01

    Knowledge of the human genome is continually progressing: a large number of databases have been developed to make meaningful connections among worldwide scientific discoveries. This paper reviews bioinformatics resources and database tools specialized in disseminating information regarding genetic disorders. The databases described are useful for managing sample sequences, gene expression and post-transcriptional regulation. In relation to data sets available from genome-wide association studies, we describe databases that could be the starting point for developing studies in the field of complex diseases, particularly those in which the causal genes are difficult to identify.

  7. Equity Audit: A Teacher Leadership Tool for Nurturing Teacher Research

    ERIC Educational Resources Information Center

    View, Jenice L.; DeMulder, Elizabeth; Stribling, Stacia; Dodman, Stephanie; Ra, Sophia; Hall, Beth; Swalwell, Katy

    2016-01-01

    This is a three-part essay featuring six teacher educators and one classroom teacher researcher. Part one describes faculty efforts to build curriculum for teacher research, scaffold the research process, and analyze outcomes. Part two shares one teacher researcher's experience using an equity audit tool in several contexts: her teaching practice,…

  9. Data Integration Tool: From Permafrost Data Translation Research Tool to A Robust Research Application

    NASA Astrophysics Data System (ADS)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.

    2016-12-01

    The United States National Science Foundation-funded PermaData project, led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P), aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to significantly reduce the manual processing time needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage these data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps, or operations, called widgets. Each widget performs a specific operation, such as reading, multiplying by a constant, sorting, plotting, or writing data. DIT allows the user to select and order the widgets as desired to meet their specific needs. Originally it was written to capture a scientist's personal, iterative data manipulation and quality control process of visually and programmatically working through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led first to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open source frameworks for the workflow core and integration with a JavaScript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open source tool in GitHub, packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets
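
    The widget idea, each processing step a small composable operation that users chain in any order, can be sketched in a few lines (illustrative Python only; the real DIT is the open source GitHub application described above):

        from typing import Callable, Iterable, List

        Widget = Callable[[List[float]], List[float]]

        def multiply_by(c: float) -> Widget:
            # Widget factory: returns an operation that scales every value.
            return lambda data: [x * c for x in data]

        def sort_widget(data: List[float]) -> List[float]:
            return sorted(data)

        def run_pipeline(data: List[float], widgets: Iterable[Widget]) -> List[float]:
            # Each widget is one operation; the user orders them as needed.
            for widget in widgets:
                data = widget(data)
            return data

        result = run_pipeline([3.0, -1.5, 10.2], [multiply_by(10.0), sort_widget])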

  10. Scanning electron microscopy as an analytical tool for the study of calcified intrauterine contraceptive devices

    SciTech Connect

    Khan, S.R.; Wilkinson, E.J.

    1985-01-01

    Within the endometrial cavity, intrauterine contraceptive devices (IUDs) become encrusted with cellular, acellular, and fibrillar substances. Scanning electron microscopy was used to study the crust. Cellular material consisted mainly of blood cells and various types of bacteria. The fibrillar material appeared to be fibrin, which was omnipresent in the crust and formed a thin layer immediately over the IUD surface. X-ray microanalysis of the acellular component of the crust revealed the presence of calcium. No other major peaks were identified. Near the IUD surface, characteristic calcium phosphate crystals were present. Their microanalysis showed peaks for calcium and phosphorus. X-ray diffraction of the crust, however, showed it to contain only calcite. It is through the use of scanning electron microscopy that calcium phosphate has been detected in the IUD crust and a fibrillar layer has been visualized on the IUD surface. This study further demonstrates the effectiveness of SEM analytical techniques in the area of biomedical research.

  11. Effect-Based Screening Methods for Water Quality Characterization Will Augment Conventional Analyte-by-Analyte Chemical Methods in Research As Well As Regulatory Monitoring

    EPA Science Inventory

    Conventional approaches to water quality characterization can provide data on individual chemical components of each water sample. This analyte-by-analyte approach currently serves many useful research and compliance monitoring needs. However these approaches, which require a ...

  13. The role of the automation development group in analytical research and development at Dupont Merck.

    PubMed

    Lynch, J C; Green, J S; Hovsepian, P K; Reilly, K L; Short, J A

    1994-01-01

    Laboratory robotics has been firmly established in many non-QC laboratories as a valuable tool for automating pharmaceutical dosage form analysis. Often a single project or product line is used to justify an initial robot purchase, thus introducing robotics to the laboratory for the first time. However, to gain widespread acceptance within the laboratory and to justify further investment in robotics, existing robots must be used to develop analyses for existing manual methods as well as new projects beyond the scope of the original purchase justification. The Automation Development Group in Analytical Research and Development is a team of analysts primarily devoted to developing new methods and adapting existing methods for the robot. This team approach developed the expertise and synergy necessary to significantly expand the contribution of robotics to automation in the authors' laboratory.

  14. The role of the automation development group in analytical research and development at Dupont Merck

    PubMed Central

    Lynch, John C.; Green, Jonathan S.; Hovsepian, Paul K.; Reilly, Kathleen L.; Short, Joseph A.

    1994-01-01

    Laboratory robotics has been firmly established in many non-QC laboratories as a valuable tool for automating pharmaceutical dosage form analysis. Often a single project or product line is used to justify an initial robot purchase, thus introducing robotics to the laboratory for the first time. However, to gain widespread acceptance within the laboratory and to justify further investment in robotics, existing robots must be used to develop analyses for existing manual methods as well as new projects beyond the scope of the original purchase justification. The Automation Development Group in Analytical Research and Development is a team of analysts primarily devoted to developing new methods and adapting existing methods for the robot. This team approach developed the expertise and synergy necessary to significantly expand the contribution of robotics to automation in the authors' laboratory. PMID:18924999

  15. Nonnegative matrix factorization: an analytical and interpretive tool in computational biology.

    PubMed

    Devarajan, Karthik

    2008-07-25

    In the last decade, advances in high-throughput technologies such as DNA microarrays have made it possible to simultaneously measure the expression levels of tens of thousands of genes and proteins. This has resulted in large amounts of biological data requiring analysis and interpretation. Nonnegative matrix factorization (NMF) was introduced as an unsupervised, parts-based learning paradigm involving the decomposition of a nonnegative matrix V into two nonnegative matrices, W and H, via a multiplicative updates algorithm. In the context of a p×n gene expression matrix V consisting of observations on p genes from n samples, each column of W defines a metagene, and each column of H represents the metagene expression pattern of the corresponding sample. NMF has been primarily applied in an unsupervised setting in image and natural language processing. More recently, it has been successfully utilized in a variety of applications in computational biology. Examples include molecular pattern discovery, class comparison and prediction, cross-platform and cross-species analysis, functional characterization of genes and biomedical informatics. In this paper, we review this method as a data analytical and interpretive tool in computational biology with an emphasis on these applications.
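
    The multiplicative update rules referred to above are short enough to state directly; a minimal NumPy sketch of the Euclidean-distance variant (illustrative, with random initialization and a small constant to avoid division by zero):

        import numpy as np

        def nmf(V: np.ndarray, k: int, iters: int = 200, eps: float = 1e-9):
            # Factor V (p x n, nonnegative) into W (p x k) and H (k x n).
            rng = np.random.default_rng(0)
            p, n = V.shape
            W = rng.random((p, k))
            H = rng.random((k, n))
            for _ in range(iters):
                H *= (W.T @ V) / (W.T @ W @ H + eps)   # update metagene expression
                W *= (V @ H.T) / (W @ H @ H.T + eps)   # update metagenes
            return W, H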

  16. Nonnegative Matrix Factorization: An Analytical and Interpretive Tool in Computational Biology

    PubMed Central

    Devarajan, Karthik

    2008-01-01

    In the last decade, advances in high-throughput technologies such as DNA microarrays have made it possible to simultaneously measure the expression levels of tens of thousands of genes and proteins. This has resulted in large amounts of biological data requiring analysis and interpretation. Nonnegative matrix factorization (NMF) was introduced as an unsupervised, parts-based learning paradigm involving the decomposition of a nonnegative matrix V into two nonnegative matrices, W and H, via a multiplicative updates algorithm. In the context of a p×n gene expression matrix V consisting of observations on p genes from n samples, each column of W defines a metagene, and each column of H represents the metagene expression pattern of the corresponding sample. NMF has been primarily applied in an unsupervised setting in image and natural language processing. More recently, it has been successfully utilized in a variety of applications in computational biology. Examples include molecular pattern discovery, class comparison and prediction, cross-platform and cross-species analysis, functional characterization of genes and biomedical informatics. In this paper, we review this method as a data analytical and interpretive tool in computational biology with an emphasis on these applications. PMID:18654623

  17. Introducing diffusing wave spectroscopy as a process analytical tool for pharmaceutical emulsion manufacturing.

    PubMed

    Reufer, Mathias; Machado, Alexandra H E; Niederquell, Andreas; Bohnenblust, Katharina; Müller, Beat; Völker, Andreas Charles; Kuentz, Martin

    2014-12-01

    Emulsions are widely used for pharmaceutical, food, and cosmetic applications. To guarantee that their critical quality attributes meet specifications, it is desirable to monitor the emulsion manufacturing process. However, finding a suitable process analyzer has so far remained challenging. This article introduces diffusing wave spectroscopy (DWS) as an at-line technique to follow the manufacturing process of a model oil-in-water pharmaceutical emulsion containing xanthan gum. The DWS results were complemented with mechanical rheology, microscopy analysis, and stability tests. DWS is an advanced light scattering technique that assesses the microrheology and, in general, provides information on the dynamics and statics of dispersions. The microrheology results showed good agreement with those obtained by bulk rheology. Although no notable changes in the rheological behavior of the model emulsions were observed during homogenization, the intensity correlation function provided qualitative information on the evolution of the emulsion dynamics. These data, together with static measurements of the transport mean free path (l*), correlated very well with the changes in droplet size distribution occurring during emulsion homogenization. This study shows that DWS is a promising process analytical technology tool for the development and manufacturing of pharmaceutical emulsions.

  18. Development of rocket electrophoresis technique as an analytical tool in preformulation study of tetanus vaccine formulation.

    PubMed

    Ahire, V J; Sawant, K K

    2006-08-01

    The Rocket Electrophoresis (RE) technique relies on the difference in charges of the antigen and antibodies at the selected pH. The present study involves optimization of RE run conditions for Tetanus Toxoid (TT). Agarose gel (1% w/v, 20 ml, pH 8.6), anti-TT IgG at 1 IU/ml, a temperature of 4-8 degrees C and a run duration of 18 h were found to be optimum. The height of the rocket-shaped precipitate was proportional to TT concentration. The RE method was found to be linear in the concentration range of 2.5 to 30 Lf/mL. The method was validated and found to be accurate, precise, and reproducible when analyzed statistically using Student's t-test. RE was used as an analytical method for analyzing TT content in plain and marketed formulations as well as for the preformulation study of vaccine formulation, where formulation additives were tested for compatibility with TT. The optimized RE method has several advantages: it uses safe materials, is inexpensive, and is easy to perform. RE results are less prone to operator bias compared to the flocculation test and can be documented by taking photographs and scanned with a densitometer; RE can be easily standardized for the required antigen concentration by changing the antitoxin concentration. It can be used as a very effective tool for qualitative and quantitative analysis and in preformulation studies of antigens.
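
    Because rocket height is linear in antigen concentration over the validated 2.5-30 Lf/mL range, quantification reduces to a least-squares calibration line; a sketch with hypothetical calibration readings (the numbers below are made up for illustration):

        import numpy as np

        # Hypothetical rocket heights (mm) measured for TT standards (Lf/mL).
        conc = np.array([2.5, 5.0, 10.0, 20.0, 30.0])
        height = np.array([4.1, 7.9, 15.6, 31.0, 46.2])

        slope, intercept = np.polyfit(conc, height, 1)  # height = slope*conc + intercept

        def tt_concentration(h: float) -> float:
            # Invert the calibration line for an unknown sample's rocket height.
            return (h - intercept) / slope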

  19. Common plants as alternative analytical tools to monitor heavy metals in soil

    PubMed Central

    2012-01-01

    Background Herbaceous plants are common vegetal species generally exposed, for a limited period of time, to bioavailable environmental pollutants. Heavy metal contamination is the most common form of environmental pollution. Herbaceous plants have never been used as natural bioindicators of environmental pollution, in particular to monitor the amount of heavy metals in soil. In this study, we aimed at assessing the usefulness of three herbaceous plants (Plantago major L., Taraxacum officinale L. and Urtica dioica L.) and one leguminous species (Trifolium pratense L.) as alternative indicators to evaluate soil pollution by heavy metals. Results We employed Inductively Coupled Plasma Atomic Emission Spectroscopy (ICP-AES) to assess the concentration of selected heavy metals (Cu, Zn, Mn, Pb, Cr and Pd) in soil and plants, and we employed statistical analyses to describe the linear correlation between the accumulation of some heavy metals and the selected vegetal species. We found that the leaves of Taraxacum officinale L. and Trifolium pratense L. can accumulate Cu in a linearly dependent manner, with Urtica dioica L. representing the vegetal species that accumulated the highest fraction of Pb. Conclusions In this study we demonstrated that common plants can be used as an alternative analytical tool for monitoring selected heavy metals in soil. PMID:22594441

  20. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    PubMed

    Offroy, Marc; Duponchel, Ludovic

    2016-03-03

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and we do not necessarily know which coordinates are the interesting ones. Big data in our labs of biology, analytical chemistry or physical chemistry is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and valorize such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets considering different experimental conditions (with high noise level, with/without spectral preprocessing, with wavelength shift, with different spectral resolution, with missing data).

  1. Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.

    PubMed

    Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif

    2014-12-01

    The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low-vacuum SEM. It provides multiscale and multimodal analyses such as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy), as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in a SEM can be influenced by several SEM-related phenomena, this paper first presents a comparison of this new tool with a conventional micro-Raman spectrometer. Then, some possible artefacts are documented, which are due to the impact of electron beam-induced contamination or the cathodoluminescence contribution to the Raman spectra, especially with geological samples. These effects are easily overcome by changing or adapting the Raman spectrometer and the SEM settings and methodology. The adverse effect of cathodoluminescence is removed by using an SEM beam shutter during Raman acquisition. In contrast, this interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In the second part, this study highlights the interest and efficiency of the coupling in characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in geosciences. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Analytical Implications of Using Practice Theory in Workplace Information Literacy Research

    ERIC Educational Resources Information Center

    Moring, Camilla; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely, the aim of the paper is to discuss the translation of practice-theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace…

  3. Evaluation of angiotensin II receptor blockers for drug formulary using objective scoring analytical tool.

    PubMed

    Lim, Tsuey M; Ibrahim, Mohamed I

    2012-07-01

    Drug selection methods with scores have been developed and used worldwide for formulary purposes. These tools focus on the way in which products within the same therapeutic class are differentiated from each other. The Scoring Analytical Tool (SAT) is designed on the same score-based principle and is able to assist formulary committee members in evaluating drugs, whether for addition or deletion, in a more structured, consistent and reproducible manner. The objective was to develop an objective SAT to facilitate evaluation of drug selection for formulary listing purposes. A cross-sectional survey was carried out. The proposed SAT was developed to evaluate the drugs according to pre-set criteria and sub-criteria that were matched to the diseases concerned, and scores were then assigned based on their relative importance. The main criteria under consideration were safety, quality, cost and efficacy. All these were converted to a questionnaire format. Data and information were collected through self-administered questionnaires that were distributed to medical doctors and specialists from established public hospitals. A convenience sample of 167 doctors (specialists and non-specialists) was taken from various disciplines in the outpatient clinics, such as the Medical, Nephrology and Cardiology units, who prescribed ARBs to hypertensive patients. They were given a duration of 4 weeks to answer the questionnaires at their convenience. One-way ANOVA, Kruskal-Wallis and post hoc comparison tests were carried out at an alpha level of 0.05. Statistical analysis showed that the descending order of ARB preference was Telmisartan or Irbesartan or Losartan, Valsartan or Candesartan, Olmesartan and lastly Eprosartan. The most cost-saving ARB for hypertension in public hospitals was Irbesartan. SAT is a tool which can be used to reduce the number of drugs in the formulary while retaining the most therapeutically appropriate ones, to determine the most cost-saving drugs, and has the potential to complement the
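
    Score-based selection tools of this kind reduce, mechanically, to a weighted sum over criteria; a minimal sketch with hypothetical weights and panel scores (not the study's data):

        # Hypothetical weights for the four SAT criteria (sum to 1).
        weights = {"safety": 0.3, "quality": 0.2, "cost": 0.2, "efficacy": 0.3}

        # Hypothetical panel scores (0-10) for three candidate ARBs.
        scores = {
            "irbesartan":  {"safety": 8, "quality": 7, "cost": 9, "efficacy": 8},
            "telmisartan": {"safety": 8, "quality": 8, "cost": 6, "efficacy": 9},
            "eprosartan":  {"safety": 7, "quality": 6, "cost": 5, "efficacy": 6},
        }

        def sat_score(drug_scores: dict) -> float:
            # Weighted sum across criteria, as in score-based formulary tools.
            return sum(weights[c] * s for c, s in drug_scores.items())

        ranking = sorted(scores, key=lambda d: sat_score(scores[d]), reverse=True)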

  4. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be... used as part of an analysis, an assessment of the predictive capabilities of the fire models must...

  5. Basic Conceptual Systems (BCSs)--Tools for Analytic Coding, Thinking and Learning: A Concept Teaching Curriculum in Norway

    ERIC Educational Resources Information Center

    Hansen, Andreas

    2009-01-01

    The role of basic conceptual systems (for example, colour, shape, size, position, direction, number, pattern, etc.) as psychological tools for analytic coding, thinking, learning is emphasised, and a proposal for a teaching order of BCSs in kindergarten and primary school is introduced. The first part of this article explains briefly main aspects…

  6. Research-Based Communication Tool Kit

    ERIC Educational Resources Information Center

    Brown, Sherry; Campbell-Zopf, Mary; Hooper, Jeffrey; Marshall, David; McLaughlin, Beck

    2007-01-01

    Significant research over the last decade has built a strong case for the value of arts learning. Major summaries, including "Schools, Communities, and the Arts" (1995); "Champions of Change" (2000); "The Arts in Education: Evaluating the Evidence for a Causal Link" (2000); "Critical Links" (2002); and now "Critical Evidence: How the Arts Benefit…

  7. Visualization tools for comprehensive test ban treaty research

    SciTech Connect

    Edwards, T.L.; Harris, J.M.; Simons, R.W.

    1997-08-01

    This paper focuses on tools used in Data Visualization efforts at Sandia National Laboratories under the Department of Energy CTBT R&D program. These tools provide interactive techniques for the examination and interpretation of scientific data, and can be used for many types of CTBT research and development projects. We will discuss the benefits and drawbacks of using the tools to display and analyze CTBT scientific data. While the tools may be used for everyday applications, our discussion will focus on the use of these tools for visualization of data used in research and verification of new theories. Our examples focus on uses with seismic data, but the tools may also be used for other types of data sets. 5 refs., 6 figs., 1 tab.

  8. Nano-Scale Secondary Ion Mass Spectrometry - A new analytical tool in biogeochemistry and soil ecology

    SciTech Connect

    Herrmann, A M; Ritz, K; Nunan, N; Clode, P L; Pett-Ridge, J; Kilburn, M R; Murphy, D V; O'Donnell, A G; Stockdale, E A

    2006-10-18

    Soils are structurally heterogeneous across a wide range of spatio-temporal scales. Consequently, external environmental conditions do not have a uniform effect throughout the soil, resulting in a large diversity of micro-habitats. It has been suggested that soil function can be studied without explicit consideration of such fine detail, but recent research has indicated that the micro-scale distribution of organisms may be of importance for a mechanistic understanding of many soil functions. Due to a lack of techniques with adequate sensitivity for data collection at appropriate scales, the question 'How important are various soil processes acting at different scales for ecological function?' is challenging to answer. The nano-scale secondary ion mass spectrometer (NanoSIMS) represents the latest generation of ion microprobes, which link high-resolution microscopy with isotopic analysis. The main advantage of NanoSIMS over other secondary ion mass spectrometers is the ability to operate at high mass resolution whilst maintaining both excellent signal transmission and spatial resolution (≈50 nm). NanoSIMS has been used previously in studies focusing on presolar materials from meteorites, and in material science, biology, geology and mineralogy. Recently, the potential of NanoSIMS as a new tool in the study of biophysical interfaces in soils has been demonstrated. This paper describes the principles of NanoSIMS and discusses the potential of this tool to contribute to the field of biogeochemistry and soil ecology. Practical considerations (sample size and preparation, simultaneous collection of isotopes, mass resolution, isobaric interference and quantification of the isotopes of interest) are discussed. Adequate sample preparation, which avoids biases in the interpretation of NanoSIMS data due to artifacts, and the identification of regions of interest are the main concerns when using NanoSIMS as a new tool in biogeochemistry and soil ecology. Finally, we review the areas of

  9. Sea otter research methods and tools

    USGS Publications Warehouse

    Bodkin, James L.; Maldini, Daniela; Calkins, Donald; Atkinson, Shannon; Meehan, Rosa

    2004-01-01

    Sea otters possess physical characteristics and life history attributes that provide both opportunity and constraint to their study. Because of their relatively limited diving ability, they occur in nearshore marine habitats that are usually viewable from shore, allowing direct observation of most behaviors. Because sea otters live nearshore and forage on benthic invertebrates, foraging success and diet are easily measured. Because they rely almost exclusively on their pelage for insulation, which requires frequent grooming, successful application of external tags or instruments has been limited to attachments in the interdigital webbing of the hind flippers. Techniques to surgically implant instruments into the intraperitoneal cavity are well developed and routinely applied. Because they have relatively small home ranges and rest in predictable areas, they can be recaptured with some predictability using closed-circuit scuba diving technology. The purpose of this summary is to identify some of the approaches, methods, and tools currently employed in the study of sea otters, and to suggest potential avenues for applying advancing technologies.

  10. Patenting genome research tools and the law.

    PubMed

    Eisenberg, Rebecca

    2003-01-01

    Patenting genes encoding therapeutic proteins was relatively uncontroversial in the early days of biotechnology. Controversy arose in the era of high-throughput DNA sequencing, when gene patents started to look less like patents on drugs and more like patents on scientific information. Evolving scientific and business strategies for exploiting genomic information raised concerns that patents might slow subsequent research. The trend towards stricter enforcement of the utility and disclosure requirements by the patent offices should help clarify the current confusion.

  11. Experimental and Analytical Research on Fracture Processes in Rock

    SciTech Connect

    Herbert H. Einstein; Jay Miller; Bruno Silva

    2009-02-27

    Experimental studies on fracture propagation and coalescence were conducted which, together with previous tests by this group on gypsum and marble, provide information on fracturing. Specifically, different fracture geometries were tested, which together with the different material properties will provide the basis for analytical/numerical modeling. Initial steps on the models were taken, as were initial investigations into the effect of pressurized water on fracture coalescence.

  12. Acceleration in USA Sea-Level? New Insights Now Available Using Improved Analytical Tools and Methods

    NASA Astrophysics Data System (ADS)

    Watson, P. J.

    2016-12-01

    The detection of acceleration in mean sea level around the data-rich margins of the USA has been a keen endeavour of sea-level researchers since the seminal work of Bruce Douglas in 1992. Over the past decade, such investigations have taken on greater prominence given that mean sea level remains a key proxy by which to measure a changing climate system. The physics-based climate projection models forecast that the current global average rate of mean sea-level rise (≈3 mm/year) might climb to rates in the range of 10-20 mm/year by 2100. Most research in this area has centred on reconciling current rates of rise with the significant accelerations required to meet the forecast projections of climate models. Various studies conducted over the past decade have provided inconsistent results, in part due to both the small kinematic properties of the mean sea-level signal evident in historical time series data and the limited analytical techniques applied to date to measure these phenomena. The analysis presented is based on a recently developed analytical package titled 'msltrend', designed to augment climate change research by significantly enhancing estimates of trend, real-time velocity and acceleration in the relative mean sea-level signal derived from long annual average ocean water level time series. Key findings are that, at the 95% confidence level, there is no consistent or substantial evidence (yet) that recent rates of rise are higher or abnormal in the context of the network of lengthy historical records available for the USA, nor is there any evidence that geocentric rates of rise are above the global average. The analysis also points to clearer spatial and temporal patterns in measured mean sea level around mainland USA than previously available. It is likely that a further 20 years of data will distinguish whether recent increases east of Galveston and along the east coast are evidence of the onset of climate change induced acceleration.
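
    'msltrend' itself is an R package; as a simpler baseline of the kind long used in this literature, velocity and acceleration can be read off a centred quadratic fit to the annual means (an illustrative Python sketch, not the msltrend decomposition):

        import numpy as np

        def msl_trend_acceleration(years: np.ndarray, msl_mm: np.ndarray):
            # Classic Douglas-style baseline: msl = a*t**2 + b*t + c, t centred.
            t = years - years.mean()          # centring stabilises the fit
            a, b, c = np.polyfit(t, msl_mm, 2)
            return b, 2 * a                   # velocity (mm/yr), acceleration (mm/yr**2)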

  13. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    PubMed

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects on health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve the physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most widely employed nonstarch carbohydrates added to or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most employed analytical techniques for the characterization of those molecules. Mass spectrometry is also of great help, in particular matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and

  14. Simulation tools for robotics research and assessment

    NASA Astrophysics Data System (ADS)

    Fields, MaryAnne; Brewer, Ralph; Edge, Harris L.; Pusey, Jason L.; Weller, Ed; Patel, Dilip G.; DiBerardino, Charles A.

    2016-05-01

    The Robotics Collaborative Technology Alliance (RCTA) program focuses on four overlapping technology areas: Perception, Intelligence, Human-Robot Interaction (HRI), and Dexterous Manipulation and Unique Mobility (DMUM). In addition, the RCTA program has a requirement to assess progress of this research in standalone as well as integrated form. Since the research is evolving and the robotic platforms with unique mobility and dexterous manipulation are in the early development stage and very expensive, an alternate approach is needed for efficient assessment. Simulation of robotic systems, platforms, sensors, and algorithms, is an attractive alternative to expensive field-based testing. Simulation can provide insight during development and debugging unavailable by many other means. This paper explores the maturity of robotic simulation systems for applications to real-world problems in robotic systems research. Open source (such as Gazebo and Moby), commercial (Simulink, Actin, LMS), government (ANVEL/VANE), and the RCTA-developed RIVET simulation environments are examined with respect to their application in the robotic research domains of Perception, Intelligence, HRI, and DMUM. Tradeoffs for applications to representative problems from each domain are presented, along with known deficiencies and disadvantages. In particular, no single robotic simulation environment adequately covers the needs of the robotic researcher in all of the domains. Simulation for DMUM poses unique constraints on the development of physics-based computational models of the robot, the environment and objects within the environment, and the interactions between them. Most current robot simulations focus on quasi-static systems, but dynamic robotic motion places an increased emphasis on the accuracy of the computational models. In order to understand the interaction of dynamic multi-body systems, such as limbed robots, with the environment, it may be necessary to build component

  15. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  17. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining a connection to, and communicating the uncertainty in, the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach for implementation of data reduction and topology generation
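
    The core idea, pairing each interpolated cell with an uncertainty measure driven by sample density and variance, can be sketched with SciPy's 2-D binned statistics (illustrative only; the authors' implementation is the Hadoop/MapReduce application described above):

        import numpy as np
        from scipy import stats

        def variable_grid_summary(x, y, z, bins=10):
            # Per-cell mean, sample count and standard deviation for scattered
            # samples: cells with few samples or high variance carry more
            # uncertainty and, in a VGM-style map, would be drawn coarser.
            mean, xe, ye, _ = stats.binned_statistic_2d(x, y, z, "mean", bins=bins)
            count, _, _, _ = stats.binned_statistic_2d(x, y, z, "count", bins=bins)
            std, _, _, _ = stats.binned_statistic_2d(x, y, z, "std", bins=bins)
            return mean, count, std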

  18. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    PubMed

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

    Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim of identifying how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on an attempt to develop these areas further and provides a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration-related textural characterisation.
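
    The texture measurements the record refers to are areal surface parameters computed from confocal height maps. A minimal sketch of standard ISO 25178-style height parameters, on hypothetical data, might look like this:

    ```python
    import numpy as np

    # Hypothetical height map (nm) such as a laser scanning confocal
    # microscope might produce for a worn lithic surface patch.
    rng = np.random.default_rng(1)
    z = rng.normal(0.0, 50.0, size=(256, 256))

    def areal_texture(z):
        """ISO 25178-style height parameters on a levelled surface patch."""
        zc = z - z.mean()                  # remove the mean plane (levelling)
        sa = np.abs(zc).mean()             # Sa: arithmetic mean height
        sq = np.sqrt((zc ** 2).mean())     # Sq: root-mean-square height
        ssk = (zc ** 3).mean() / sq ** 3   # Ssk: skewness of heights
        return sa, sq, ssk

    sa, sq, ssk = areal_texture(z)
    print(f"Sa={sa:.1f} nm  Sq={sq:.1f} nm  Ssk={ssk:+.3f}")
    ```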

  19. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    PubMed

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-08-03

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.
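
    As a concrete instance of the pretreatment and data analysis stages the tutorial covers, a minimal chemometric pipeline (autoscaling followed by PCA via singular value decomposition, on hypothetical data) can be sketched as:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.normal(size=(30, 8))  # hypothetical: 30 samples x 8 analytical variables

    # Pretreatment: autoscaling (mean-centre, unit variance per variable),
    # one of the standard steps such tutorials cover.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

    # PCA via singular value decomposition.
    U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
    scores = U * s                        # sample coordinates on the PCs
    explained = s ** 2 / np.sum(s ** 2)   # variance fraction per component

    print("explained variance:", np.round(explained[:3], 3))
    print("first sample scores:", np.round(scores[0, :2], 3))
    ```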

  1. Aligning Web-Based Tools to the Research Process Cycle: A Resource for Collaborative Research Projects

    ERIC Educational Resources Information Center

    Price, Geoffrey P.; Wright, Vivian H.

    2012-01-01

    Using John Creswell's Research Process Cycle as a framework, this article describes various web-based collaborative technologies useful for enhancing the organization and efficiency of educational research. Visualization tools (Cacoo) assist researchers in identifying a research problem. Resource storage tools (Delicious, Mendeley, EasyBib)…

  2. The Critical Research Evaluation Tool (CRET): A Teaching and Learning Tool to Evaluate Research for Cultural Competence.

    PubMed

    Love, Katie L

    The aim of this study was to present the Critical Research Evaluation Tool (CRET), which teaches evaluation of researchers' worldviews, applicability to multicultural populations, and ethics surrounding potential harms to communities. To provide the best cultural care, nurses need to understand how historical, social, and political experiences impact health and also influence research. Students using the CRET reported receiving a strong foundation in research fundamentals, gaining a better understanding of critical frameworks in research, and learning more about themselves by reflecting on their own privileges and biases. The CRET provides nursing students and nursing faculty with a tool for examining diversity and ultimately decreasing health disparity.

  3. Reconceptualizing vulnerability: deconstruction and reconstruction as a postmodern feminist analytical research method.

    PubMed

    Glass, Nel; Davis, Kierrynn

    2004-01-01

    Nursing research informed by postmodern feminist perspectives has prompted many debates in recent times. While this is so, nurse researchers who have been tempted to break new ground have had few examples of appropriate analytical methods for a research design informed by the above perspectives. This article presents a deconstructive/reconstructive secondary analysis of a postmodern feminist ethnography in order to provide an analytical exemplar. In doing so, previous notions of vulnerability as a negative state have been challenged and reconstructed.

  4. Innovations in scholarly communication - global survey on research tool usage

    PubMed Central

    Kramer, Bianca; Bosman, Jeroen

    2016-01-01

    Many new websites and online tools have come into existence to support scholarly communication in all phases of the research workflow. To what extent researchers are using these and more traditional tools has been largely unknown. This 2015-2016 survey aimed to fill that gap. Its results may help decision making by stakeholders supporting researchers and may also help researchers wishing to reflect on their own online workflows. In addition, information on tools usage can inform studies of changing research workflows. The online survey employed an open, non-probability sample. A largely self-selected group of 20663 researchers, librarians, editors, publishers and other groups involved in research took the survey, which was available in seven languages. The survey was open from May 10, 2015 to February 10, 2016. It captured information on tool usage for 17 research activities, stance towards open access and open science, and expectations of the most important development in scholarly communication. Respondents’ demographics included research roles, country of affiliation, research discipline and year of first publication. PMID:27429740

  5. Innovations in scholarly communication - global survey on research tool usage.

    PubMed

    Kramer, Bianca; Bosman, Jeroen

    2016-01-01

    Many new websites and online tools have come into existence to support scholarly communication in all phases of the research workflow. To what extent researchers are using these and more traditional tools has been largely unknown. This 2015-2016 survey aimed to fill that gap. Its results may help decision making by stakeholders supporting researchers and may also help researchers wishing to reflect on their own online workflows. In addition, information on tools usage can inform studies of changing research workflows. The online survey employed an open, non-probability sample. A largely self-selected group of 20663 researchers, librarians, editors, publishers and other groups involved in research took the survey, which was available in seven languages. The survey was open from May 10, 2015 to February 10, 2016. It captured information on tool usage for 17 research activities, stance towards open access and open science, and expectations of the most important development in scholarly communication. Respondents' demographics included research roles, country of affiliation, research discipline and year of first publication.

  6. Hair analysis, a novel tool in forensic and biomedical sciences: new chromatographic and electrophoretic/electrokinetic analytical strategies.

    PubMed

    Tagliaro, F; Smith, F P; De Battisti, Z; Manetto, G; Marigo, M

    1997-02-07

    Hair analysis for abused drugs is recognized as a powerful tool to investigate exposure of subjects to these substances. In fact, drugs permeate the hair matrix at the root level and above. Evidence of their presence remains incorporated into the hair stalk for the entire life of this structure. Most abusive drugs (e.g. opiates, cocaine, amphetamines, cannabinoids etc.) and several therapeutic drugs (e.g. antibiotics, theophylline, beta 2-agonists, etc.) have been demonstrated to be detectable in the hair of chronic users. Hence, hair analysis has been proposed to investigate drug abuses for epidemiological, clinical, administrative and forensic purposes, such as in questions of drug-related fatalities and revocation of driving licences, alleged drug addiction or drug abstinence in criminal or civil cases and for the follow-up of detoxication treatments. However, analytical and interpretative problems still remain and these limit the acceptance of this methodology, especially when the results from hair analysis represent a single piece of evidence and can not be supported by concurrent data. The present paper presents an updated review (with 102 references) of the modern techniques for hair analysis, including screening methods (e.g. immunoassays) and more sophisticated methodologies adopted for results confirmation and/or for research purposes, with special emphasis on gas chromatography-mass spectrometry, liquid chromatography and capillary electrophoresis.

  7. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    ERIC Educational Resources Information Center

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  9. The use of metacognitive tools in a multidimensional research program

    NASA Astrophysics Data System (ADS)

    Iuli, Richard John

    Metacognition may be thought of as "cognition about cognition", or "thinking about thinking." A number of strategies and tools have been developed to help individuals understand the nature of knowledge, and to enhance their "thinking about thinking." Two metacognitive tools, concept maps and Gowin's Vee, were first developed for use in educational research. Subsequently, they were used successfully to help learners "learn how to learn." The success of metacognitive tools in educational settings suggests that they may help scientists understand the nature of knowledge production and organization, thereby facilitating their research activities and enhancing their understanding of the events and objects they study. In September 1993 I began an ethnographic, naturalistic study of the United States Department of Agriculture - Agricultural Research Service - Rhizobotany Project at Cornell University in Ithaca, NY. I spent the next two and one-half years as a participant observer with the Project. The focus of my research was to examine the application of metacognitive tools to an academic research setting. The knowledge claims that emerged from my research were: (1) Individual researchers tended to have narrow views of the Rhizobotany Project that centered on their individual areas of research; (2) The researchers worked in "conceptual isolation", failing to see the connections and interrelatedness of their own work with the work of the others; (3) For those researchers who constructed concept maps and Vee diagrams, these heuristics helped them to build a deeper conceptual understanding of their own work; and (4) Half of the members of the research team did not find concept mapping and Vee diagramming useful. Their reluctance to use these tools was interpreted as an indication of epistemological confusion. The prevalence of conceptual isolation and epistemological confusion among members of the Rhizobotany Project parallels the results of previous studies that have

  10. Identifying and Tracing Persistent Identifiers of Research Resources : Automation, Metrics and Analytics

    NASA Astrophysics Data System (ADS)

    Maull, K. E.; Hart, D.; Mayernik, M. S.

    2015-12-01

    Formal and informal citations and acknowledgements for research infrastructures, such as data collections, software packages, and facilities, are an increasingly important form of attribution in scholarly literature. While such citations provide the appropriate links, even if informally, to their origins, they are often made inconsistently, which makes them hard to analyze. While significant progress has been made in the past few years in the development of recommendations, policies, and procedures for creating and promoting citable identifiers, progress has been mixed in tracking how data sets and other digital infrastructures have actually been identified and cited in the literature. Understanding the full extent and value of research infrastructures through the lens of scholarly literature requires significant resources and thus, we argue, must rely on automated approaches that mine and track persistent identifiers to scientific resources. Such automated approaches, however, face a number of unique challenges, from the inconsistent and informal referencing practices of authors, to unavailable, embargoed or hard-to-obtain full-text resources for text analytics, to inconsistent and capricious impact metrics. This presentation will discuss work to develop and evaluate tools for automating the tracing of research resource identification and referencing in the research literature via persistent citable identifiers. Despite the impediments, automated processes are of considerable importance in enabling these traceability efforts to scale, as the number of identifiers being created for unique scientific resources continues to grow rapidly. Such efforts, if successful, should improve the ability to answer meaningful questions about research resources as they continue to grow as a target of advanced analyses in research metrics.
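
    The automated tracing described here depends on matching persistent identifiers in text. A toy sketch of that step, using a DOI pattern close to the one Crossref recommends (treated as an approximation here, not the project's actual pipeline):

    ```python
    import re
    from collections import Counter

    # Regex close to the pattern Crossref recommends for modern DOIs;
    # treat it as an approximation, not a complete DOI grammar.
    DOI_RE = re.compile(r'\b10\.\d{4,9}/[-._;()/:A-Za-z0-9]+')

    docs = [  # hypothetical full-text snippets
        "Data are archived as doi:10.5065/D6WD3XH5 at the archive.",
        "We used the dataset (https://doi.org/10.5065/D6WD3XH5) and 10.5281/zenodo.12345.",
    ]

    counts = Counter()
    for text in docs:
        # Strip trailing punctuation that the character class over-captures.
        counts.update(m.rstrip('.,;)') for m in DOI_RE.findall(text))

    for doi, n in counts.most_common():
        print(f"{n:2d}  {doi}")
    ```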

  11. Dataset-Driven Research to Support Learning and Knowledge Analytics

    ERIC Educational Resources Information Center

    Verbert, Katrien; Manouselis, Nikos; Drachsler, Hendrik; Duval, Erik

    2012-01-01

    In various research areas, the availability of open datasets is considered as key for research and application purposes. These datasets are used as benchmarks to develop new algorithms and to compare them to other algorithms in given settings. Finding such available datasets for experimentation can be a challenging task in technology enhanced…

  13. Practitioner-Oriented Research as a Tool for Professional Development

    ERIC Educational Resources Information Center

    Johansson, Inge; Sandberg, Anette; Vuorinen, Tuula

    2007-01-01

    The aim of this study was to analyse how a model for practitioner-oriented research can be used as a tool for professional development in the preschool. The focus of interest is the type of knowledge that is formed when researchers and preschool staff cooperate on local projects, and what this new knowledge means for the images of professional…

  14. Building a Better Bibliography: Computer-Aided Research Tools.

    ERIC Educational Resources Information Center

    Bloomfield, Elizabeth

    1989-01-01

    Describes a project at the University of Guelph (Ontario) that combined both bibliographical and archival references in one large machine readable database to facilitate local history research. The description covers research tool creation, planning activities, system design, the database management system used, material selection, record…

  15. Focus Group Research: A Tool for the Student Affairs Professional.

    ERIC Educational Resources Information Center

    Jacobi, Maryann

    1991-01-01

    Explores limits of quantitative research methods and introduces qualitative approach, focus groups, as alternative information-collection tool for student personnel administrators. Presents two research projects where focus groups were used. Maintains that focus group approach has several advantages, including cost effectiveness, emphasis on…

  16. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    PubMed

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  17. The WWW Cabinet of Curiosities: A Serendipitous Research Tool

    ERIC Educational Resources Information Center

    Arnold, Josie

    2012-01-01

    This paper proposes that the WWW is able to be fruitfully understood as a research tool when we utilise the metaphor of the cabinet of curiosities, the wunderkammer. It unpeels some of the research attributes of the metaphor as it reveals the multiplicity of connectivity on the web that provides serendipitous interactions between unexpected…

  18. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    NASA Astrophysics Data System (ADS)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, has as its goal the determination of the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the generating geometrical error as a basis of the total error. The generation modelling allows highlighting of the potential errors of the generating tool, in order to correct its profile before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new method, namely the method of "relative generating trajectories". The analytical foundations are presented, together with some applications to known models of rack-gear type tools used on Maag teething machines.
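
    The relative generating trajectories method itself is developed in CATIA; purely as an illustration, the family of trajectories and its inner envelope can also be computed numerically. In the sketch below the pitch radius, the straight test profile, and the sign conventions for the rolling motion are all assumptions, not values from the paper.

    ```python
    import numpy as np

    R = 30.0  # assumed pitch circle radius of the blank (mm)

    # Hypothetical measured rack flank: a straight segment at 20 deg pressure
    # angle, given as (xi, eta) coordinates relative to the rack pitch line.
    eta = np.linspace(-3.0, 3.0, 61)
    xi = eta * np.tan(np.radians(20.0))

    phis = np.linspace(-0.4, 0.4, 201)  # rolling parameter (gear rotation, rad)
    fam_x, fam_y = [], []
    for phi in phis:
        # Rack translates by R*phi while the blank rotates by phi (assumed
        # convention); express the rack points in the blank's rotating frame.
        xf, yf = xi - R * phi, R + eta
        c, s = np.cos(phi), np.sin(phi)
        fam_x.append(c * xf + s * yf)
        fam_y.append(-s * xf + c * yf)

    x = np.concatenate(fam_x); y = np.concatenate(fam_y)
    r, ang = np.hypot(x, y), np.arctan2(y, x)

    # Approximate the generated flank as the inner envelope of the family:
    # the minimum radius reached within each narrow angular bin.
    bins = np.linspace(ang.min(), ang.max(), 80)
    idx = np.digitize(ang, bins)
    for b in range(1, 6):
        sel = idx == b
        if sel.any():
            print(f"angle bin {b}: envelope radius = {r[sel].min():.3f} mm")
    ```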

  19. Error budgets for quality management--practical tools for planning and assuring the analytical quality of laboratory testing processes.

    PubMed

    Westgard, J O

    1996-01-01

    Analytical quality is often assumed, rather than being assured or guaranteed. Given that it is still essential that laboratories produce reliable test results, managers must continue to improve their skills in analytical quality management. This paper shows managers how to use error budgets and charts of operating specifications ("OPSpecs" charts) to select appropriate control rules and numbers of control measurements, taking into account the analytical or clinical quality required for a test and the imprecision and inaccuracy observed for a method. With currently available tools and a little practice, quality control (QC) procedures can be selected quickly and easily, in just 1 minute or less. Future technology is expected to automate the QC selection process and provide dynamic quality control.
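
    A common arithmetic form of such an error budget is the sigma metric, which combines allowable total error, bias, and imprecision. The sketch below uses that form together with rule-of-thumb QC bands; the thresholds are conventions often associated with this approach, not values taken from the paper.

    ```python
    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """Sigma = (allowable total error - |bias|) / CV, all in percent."""
        return (tea_pct - abs(bias_pct)) / cv_pct

    def suggest_qc(sigma):
        # Rule-of-thumb bands (an assumption, not taken from the paper):
        if sigma >= 6.0:
            return "1-3s with N=2 is usually sufficient"
        if sigma >= 4.0:
            return "multirule (e.g. 1-3s/2-2s/R-4s) with N=2-4"
        return "maximum QC (multirule, N>=4) and method improvement"

    # Hypothetical assay: TEa 10%, bias 2%, CV 1.5%.
    s = sigma_metric(10.0, 2.0, 1.5)
    print(f"sigma = {s:.2f}: {suggest_qc(s)}")
    ```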

  20. Spec Tool; an online education and research resource

    NASA Astrophysics Data System (ADS)

    Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.

    2016-06-01

    Education and public outreach (EPO) activities related to remote sensing, space, planetary and geo-physics sciences have been developed widely in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geo-scientific and technologic disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. Because suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better understanding of theoretical spectroscopy and Imaging Spectroscopy data in a 'hands-on' activity. This tool is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. The tool enables visualization of spectral signatures from the USGS spectral library and of additional spectra collected in the EPIF, such as those of dunes in southern Israel and from Turkmenistan. For researchers and educators, the tool allows loading collected samples locally for further analysis.
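
    Of the algorithms listed, spectral angle mapping is simple enough to sketch directly: the similarity between a pixel spectrum and each reference is the angle between them treated as vectors. The reference spectra below are hypothetical, not USGS library entries.

    ```python
    import numpy as np

    def spectral_angle(s, r):
        """Angle (radians) between spectra s and r; smaller = more similar."""
        cosang = np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r))
        return np.arccos(np.clip(cosang, -1.0, 1.0))

    # Hypothetical reflectance spectra over a handful of bands.
    refs = {
        "dune sand": np.array([0.30, 0.38, 0.45, 0.52, 0.55]),
        "vegetation": np.array([0.05, 0.08, 0.05, 0.45, 0.50]),
    }
    pixel = np.array([0.28, 0.36, 0.44, 0.50, 0.54])

    best = min(refs, key=lambda k: spectral_angle(pixel, refs[k]))
    for name, ref in refs.items():
        print(f"{name:11s} angle = {spectral_angle(pixel, ref):.4f} rad")
    print("best match:", best)
    ```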

  1. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    PubMed Central

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from “data-to-knowledge-to-innovation,” a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®—the human gene database; the MalaCards—the human diseases database; and the PathCards—the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®—the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene–tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell “cards” in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics

  2. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards--the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics
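
    GeneAnalytics' scoring algorithms are proprietary; the generic statistic most gene set enrichment tools build on is the hypergeometric tail probability, sketched below with hypothetical set sizes.

    ```python
    from scipy.stats import hypergeom

    def enrichment_p(universe, set_size, query_size, overlap):
        """P(X >= overlap) when drawing query_size genes from a universe
        containing set_size genes annotated to the pathway."""
        return hypergeom.sf(overlap - 1, universe, set_size, query_size)

    # Hypothetical numbers: 20000-gene universe, 150-gene pathway,
    # 300-gene differential expression signature, 12 genes in common.
    p = enrichment_p(20000, 150, 300, 12)
    print(f"hypergeometric enrichment p = {p:.3e}")
    ```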

  3. Analytics for Cyber Network Defense

    SciTech Connect

    Plantenga, Todd.; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  4. Finding collaborators: toward interactive discovery tools for research network systems.

    PubMed

    Borromeo, Charles D; Schleyer, Titus K; Becich, Michael J; Hochheiser, Harry

    2014-11-04

    Research networking systems hold great promise for helping biomedical scientists identify collaborators with the expertise needed to build interdisciplinary teams. Although efforts to date have focused primarily on collecting and aggregating information, less attention has been paid to the design of end-user tools for using these collections to identify collaborators. To be effective, collaborator search tools must provide researchers with easy access to information relevant to their collaboration needs. The aim was to study user requirements and preferences for research networking system collaborator search tools and to design and evaluate a functional prototype. Paper prototypes exploring possible interface designs were presented to 18 participants in semistructured interviews aimed at eliciting collaborator search needs. Interview data were coded and analyzed to identify recurrent themes and related software requirements. Analysis results and elements from paper prototypes were used to design a Web-based prototype using the D3 JavaScript library and VIVO data. Preliminary usability studies asked 20 participants to use the tool and to provide feedback through semistructured interviews and completion of the System Usability Scale (SUS). Initial interviews identified consensus regarding several novel requirements for collaborator search tools, including chronological display of publication and research funding information, the need for conjunctive keyword searches, and tools for tracking candidate collaborators. Participant responses were positive (SUS score: mean 76.4%, SD 13.9). Opportunities for improving the interface design were identified. Interactive, timeline-based displays that support comparison of researcher productivity in funding and publication have the potential to effectively support searching for collaborators. Further refinement and longitudinal studies may be needed to better understand the implications of collaborator search tools for researcher

  5. Finding Collaborators: Toward Interactive Discovery Tools for Research Network Systems

    PubMed Central

    Schleyer, Titus K; Becich, Michael J; Hochheiser, Harry

    2014-01-01

    Background Research networking systems hold great promise for helping biomedical scientists identify collaborators with the expertise needed to build interdisciplinary teams. Although efforts to date have focused primarily on collecting and aggregating information, less attention has been paid to the design of end-user tools for using these collections to identify collaborators. To be effective, collaborator search tools must provide researchers with easy access to information relevant to their collaboration needs. Objective The aim was to study user requirements and preferences for research networking system collaborator search tools and to design and evaluate a functional prototype. Methods Paper prototypes exploring possible interface designs were presented to 18 participants in semistructured interviews aimed at eliciting collaborator search needs. Interview data were coded and analyzed to identify recurrent themes and related software requirements. Analysis results and elements from paper prototypes were used to design a Web-based prototype using the D3 JavaScript library and VIVO data. Preliminary usability studies asked 20 participants to use the tool and to provide feedback through semistructured interviews and completion of the System Usability Scale (SUS). Results Initial interviews identified consensus regarding several novel requirements for collaborator search tools, including chronological display of publication and research funding information, the need for conjunctive keyword searches, and tools for tracking candidate collaborators. Participant responses were positive (SUS score: mean 76.4%, SD 13.9). Opportunities for improving the interface design were identified. Conclusions Interactive, timeline-based displays that support comparison of researcher productivity in funding and publication have the potential to effectively support searching for collaborators. Further refinement and longitudinal studies may be needed to better understand the

  6. A clinical research analytics toolkit for cohort study.

    PubMed

    Yu, Yiqin; Zhu, Yu; Sun, Xingzhi; Tao, Ying; Zhang, Shuo; Xu, Linhao; Pan, Yue

    2012-01-01

    This paper presents a clinical informatics toolkit that can assist physicians to conduct cohort studies effectively and efficiently. The toolkit has three key features: 1) support of procedures defined in epidemiology, 2) recommendation of statistical methods in data analysis, and 3) automatic generation of research reports. On one hand, our system can help physicians control research quality by leveraging the integrated knowledge of epidemiology and medical statistics; on the other hand, it can improve productivity by reducing the complexities for physicians during their cohort studies.

  7. Single Subject Research: A Synthesis of Analytic Methods

    ERIC Educational Resources Information Center

    Alresheed, Fahad; Hott, Brittany L.; Bano, Carmen

    2013-01-01

    Historically, the synthesis of single subject design has employed visual inspection to yield significance of results. However, current research is supporting different techniques that will facilitate the interpretation of these intervention outcomes. These methods can provide more reliable data than employing visual inspection in isolation. This…
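
    Among the non-visual synthesis techniques alluded to above, non-overlap indices are widely used for single-subject data. A minimal sketch of two of them, PND and NAP, on hypothetical AB-phase data:

    ```python
    def pnd(baseline, treatment):
        """Percentage of treatment points exceeding the best baseline point."""
        ceiling = max(baseline)
        return 100.0 * sum(t > ceiling for t in treatment) / len(treatment)

    def nap(baseline, treatment):
        """Non-overlap of All Pairs: P(treatment > baseline), ties count 0.5."""
        pairs = [(b, t) for b in baseline for t in treatment]
        score = sum(1.0 if t > b else 0.5 if t == b else 0.0 for b, t in pairs)
        return score / len(pairs)

    # Hypothetical AB design data (baseline phase A, intervention phase B).
    A = [2, 3, 2, 4, 3]
    B = [5, 6, 4, 7, 8, 6]

    print(f"PND = {pnd(A, B):.1f}%  NAP = {nap(A, B):.2f}")
    ```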

  8. The Global War on Terrorism: Analytical Support, Tools and Metrics of Assessment. MORS Workshop

    DTIC Science & Technology

    2005-08-11

    Metrics of Assessment (Working Group 3): The accompanying Excel workbook contains two worksheets. The first is a Tools versus Questions worksheet and the...emphasis on transnational actors...have similar missions with respect to cri-...describing the success or failure to support academics, US government...sheet tools, GIS; Microsoft Project show great promise... • Encourage MORS Sponsors to contact the various agencies to find out what tools and

  9. Visualization and analytics tools for infectious disease epidemiology: a systematic review.

    PubMed

    Carroll, Lauren N; Au, Alan P; Detwiler, Landon Todd; Fu, Tsung-Chieh; Painter, Ian S; Abernethy, Neil F

    2014-10-01

    A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) identify public health user needs and preferences for infectious disease information visualization tools; (2) identify existing infectious disease information visualization tools and characterize their architecture and features; (3) identify commonalities among approaches applied to different data types; and (4) describe tool usability evaluation efforts and barriers to the adoption of such tools. We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool

  10. Analytical and scale model research aimed at improved hangglider design

    NASA Technical Reports Server (NTRS)

    Kroo, I.; Chang, L. S.

    1979-01-01

    Research consisted of a theoretical analysis which attempted to predict aerodynamic characteristics using lifting surface theory and finite-element structural analysis, as well as an experimental investigation using 1/5-scale elastically similar models in the NASA Ames 2m x 3m (7' x 10') wind tunnel. Experimental data were compared with theoretical results in the development of a computer program which may be used in the design and evaluation of ultralight gliders.

  11. Complex source beam: A tool to describe highly focused vector beams analytically

    SciTech Connect

    Orlov, S.; Peschel, U.

    2010-12-15

    The scalar-complex-source model is used to develop an accurate description of highly focused radially, azimuthally, linearly, and circularly polarized monochromatic vector beams. We investigate the power and full beam widths at half maximum of rigorous Maxwell equation solutions. The analytical expressions are employed to compare the vector complex source beams with the real beams produced by various high-numerical-aperture (NA) focusing systems. We find a parameter set for which the spatial extents of the analytical beams are the same as those of experimentally realized ones. We ensure the same shape of the considered beams by investigating an overlap of the complex source beams with high-NA beams. We demonstrate that the analytical expressions are good approximations for realistic highly focused beams.
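
    The scalar complex-source construction places a point source at an imaginary axial position z = ib, so the spherical wave exp(ikR)/R becomes a beam. A minimal numerical sketch (wavelength and confocal parameter are arbitrary choices, not values from the paper) evaluates the waist profile and compares its width with the paraxial Gaussian estimate:

    ```python
    import numpy as np

    lam = 1.0e-6                 # wavelength (m), hypothetical
    k = 2 * np.pi / lam
    b = 10e-6                    # imaginary source displacement = Rayleigh range

    x = np.linspace(-6e-6, 6e-6, 4001)
    z = 0.0

    # Scalar complex-source point: u = exp(ikR)/R, R = sqrt(x^2 + (z - ib)^2);
    # pick the square-root branch with Im(R) <= 0 so the field decays off-axis.
    R = np.sqrt(x ** 2 + (z - 1j * b) ** 2 + 0j)
    R = np.where(R.imag > 0, -R, R)
    u = np.exp(1j * k * R) / R

    I = np.abs(u) ** 2
    I /= I.max()

    fwhm = x[I >= 0.5][-1] - x[I >= 0.5][0]
    print(f"numerical FWHM   : {fwhm * 1e6:.3f} um")
    print(f"Gaussian estimate: {2 * np.sqrt(b * np.log(2) / k) * 1e6:.3f} um")
    ```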

  12. The efficacy of violence prediction: a meta-analytic comparison of nine risk assessment tools.

    PubMed

    Yang, Min; Wong, Stephen C P; Coid, Jeremy

    2010-09-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their predictive efficacies for violence. The effect sizes were extracted from 28 original reports published between 1999 and 2008, which assessed the predictive accuracy of more than one tool. We used a within-subject design to improve statistical power and multilevel regression models to disentangle random effects of variation between studies and tools and to adjust for study features. All 9 tools and their subscales predicted violence at about the same moderate level of predictive efficacy with the exception of Psychopathy Checklist--Revised (PCL-R) Factor 1, which predicted violence only at chance level among men. Approximately 25% of the total variance was due to differences between tools, whereas approximately 85% of heterogeneity between studies was explained by methodological features (age, length of follow-up, different types of violent outcome, sex, and sex-related interactions). Sex-differentiated efficacy was found for a small number of the tools. If the intention is only to predict future violence, then the 9 tools are essentially interchangeable; the selection of which tool to use in practice should depend on what other functions the tool can perform rather than on its efficacy in predicting violence. The moderate level of predictive accuracy of these tools suggests that they should not be used solely for some criminal justice decision making that requires a very high level of accuracy such as preventive detention.

  13. Sustainability Considerations for Health Research and Analytic Data Infrastructures

    PubMed Central

    Wilcox, Adam; Randhawa, Gurvaneet; Embi, Peter; Cao, Hui; Kuperman, Gilad J.

    2014-01-01

    Introduction: The United States has made recent large investments in creating data infrastructures to support the important goals of patient-centered outcomes research (PCOR) and comparative effectiveness research (CER), with still more investment planned. These initial investments, while critical to the creation of the infrastructures, are not expected to sustain them much beyond the initial development. To provide the maximum benefit, the infrastructures need to be sustained through innovative financing models while providing value to PCOR and CER researchers. Sustainability Factors: Based on our experience with creating flexible sustainability strategies (i.e., strategies that are adaptive to the different characteristics and opportunities of a resource or infrastructure), we define specific factors that are important considerations in developing a sustainability strategy. These factors include assets, expansion, complexity, and stakeholders. Each factor is described, with examples of how it is applied. These factors are dimensions of variation in different resources, to which a sustainability strategy should adapt. Summary Observations: We also identify specific important considerations for maintaining an infrastructure, so that the long-term intended benefits can be realized. These observations are presented as lessons learned, to be applied to other sustainability efforts. We define the lessons learned, relating them to the defined sustainability factors as interactions between factors. Conclusion and Next Steps: Using perspectives and experiences from a diverse group of experts, we define broad characteristics of sustainability strategies and important observations, which can vary for different projects. Other descriptions of adaptive, flexible, and successful models of collaboration between stakeholders and data infrastructures can expand this framework by identifying other factors for sustainability, and give more concrete directions on how sustainability

  14. Sustainability considerations for health research and analytic data infrastructures.

    PubMed

    Wilcox, Adam; Randhawa, Gurvaneet; Embi, Peter; Cao, Hui; Kuperman, Gilad J

    2014-01-01

    The United States has made recent large investments in creating data infrastructures to support the important goals of patient-centered outcomes research (PCOR) and comparative effectiveness research (CER), with still more investment planned. These initial investments, while critical to the creation of the infrastructures, are not expected to sustain them much beyond the initial development. To provide the maximum benefit, the infrastructures need to be sustained through innovative financing models while providing value to PCOR and CER researchers. Based on our experience with creating flexible sustainability strategies (i.e., strategies that are adaptive to the different characteristics and opportunities of a resource or infrastructure), we define specific factors that are important considerations in developing a sustainability strategy. These factors include assets, expansion, complexity, and stakeholders. Each factor is described, with examples of how it is applied. These factors are dimensions of variation in different resources, to which a sustainability strategy should adapt. We also identify specific important considerations for maintaining an infrastructure, so that the long-term intended benefits can be realized. These observations are presented as lessons learned, to be applied to other sustainability efforts. We define the lessons learned, relating them to the defined sustainability factors as interactions between factors. Using perspectives and experiences from a diverse group of experts, we define broad characteristics of sustainability strategies and important observations, which can vary for different projects. Other descriptions of adaptive, flexible, and successful models of collaboration between stakeholders and data infrastructures can expand this framework by identifying other factors for sustainability, and give more concrete directions on how sustainability can be best achieved.

  15. An Analytic Tool to Investigate the Effect of Binder on the Sensitivity of HMX-Based Plastic Bonded Explosives in the Skid Test

    SciTech Connect

    Hayden, D. W.

    2004-11-01

    This project will develop an analytical tool to calculate the performance of HMX-based PBXs in the skid test. The skid test is used as a means to measure sensitivity for large charges in handling situations. Each series of skid tests requires dozens of drops of large billets. It is proposed that the reaction (or lack of one) of PBXs in the skid test is governed by the mechanical properties of the binder. If true, one might be able to develop an analytical tool to estimate skid-test behavior for new PBX formulations. Others over the past 50 years have tried to develop similar models. This project will research and summarize the works of others and couple the work of three of them into an analytical tool that can be run on a PC to calculate the drop height of HMX-based PBXs. Detonation due to dropping a billet is argued to be a dynamic thermal event. To avoid detonation, the heat created due to friction at impact must be conducted into the charge or the target faster than the chemical kinetics can create additional energy. The methodology will involve numerically solving the Frank-Kamenetskii equation in one dimension. The analytical problem needs to be bounded in terms of how much heat is introduced to the billet and for how long. Assuming an inelastic collision with no rebound, the billet will be in contact with the target for a short duration determined by the equations of motion. For the purposes of the calculations, it will be assumed that if a detonation is to occur, it will transpire within that time. The surface temperature will be raised according to the friction created, using the equations of motion of dropping the billet on a rigid surface. The study will connect the works of Charles Anderson, Alan Randolph, Larry Hatler, Alfonse Popolato, and Charles Mader into a single PC-based analytic tool. Anderson's equations of motion will be used to calculate the temperature rise upon impact, the time this temperature is maintained (contact time) will be obtained from the work of
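
    The proposed methodology, numerically solving the reactive heat conduction (Frank-Kamenetskii) problem rho*c*dT/dt = kappa*d2T/dx2 + rho*Q*A*exp(-E/(R*T)) in one dimension, can be sketched with an explicit finite-difference scheme. All material parameters below are order-of-magnitude placeholders, not the values the project would take from the cited works.

    ```python
    import numpy as np

    # Hypothetical, order-of-magnitude parameters for an HMX-based PBX;
    # real values must come from the literature cited in the record.
    rho, c, kappa = 1800.0, 1000.0, 0.4        # kg/m^3, J/(kg K), W/(m K)
    Q, A, E, Rg = 2.0e6, 5.0e19, 2.2e5, 8.314  # J/kg, 1/s, J/mol, J/(mol K)

    n, L = 200, 2.0e-3                         # grid nodes, slab depth (m)
    dx = L / (n - 1)
    alpha = kappa / (rho * c)
    dt = 0.4 * dx ** 2 / alpha                 # explicit stability limit

    T = np.full(n, 300.0)
    T[0] = 700.0          # frictional hot surface held fixed (assumed)

    t, t_end = 0.0, 5.0e-3
    while t < t_end:
        lap = np.zeros_like(T)
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx ** 2
        # Arrhenius self-heating source term, K/s.
        source = (Q * A / c) * np.exp(-E / (Rg * T))
        T[1:-1] += dt * (alpha * lap[1:-1] + source[1:-1])
        T[0], T[-1] = 700.0, 300.0             # fixed-temperature boundaries
        t += dt
        if T.max() > 2000.0:
            print(f"thermal runaway (ignition) at t = {t*1e3:.3f} ms")
            break
    else:
        print(f"no ignition within {t_end*1e3:.1f} ms; T_max = {T.max():.0f} K")
    ```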

  16. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    ERIC Educational Resources Information Center

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…

  17. GenomicTools: a computational platform for developing high-throughput analytics in genomics.

    PubMed

    Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo

    2012-01-15

    Recent advances in sequencing technology have resulted in the dramatic increase of sequencing data, which, in turn, requires efficient management of computational resources, such as computing time, memory requirements as well as prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
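
    The region operations GenomicTools provides are implemented in C++; the flavor of one such operation, intersecting two sorted interval sets with a linear sweep, can be sketched in Python:

    ```python
    def intersect(a, b):
        """Intersect two lists of (chrom, start, end) half-open intervals.
        Both inputs must be sorted by (chrom, start)."""
        out, i, j = [], 0, 0
        while i < len(a) and j < len(b):
            ca, sa, ea = a[i]
            cb, sb, eb = b[j]
            if ca != cb:
                if ca < cb: i += 1
                else: j += 1
                continue
            s, e = max(sa, sb), min(ea, eb)
            if s < e:
                out.append((ca, s, e))
            # advance whichever interval ends first
            if ea <= eb: i += 1
            else: j += 1
        return out

    peaks = [("chr1", 100, 200), ("chr1", 500, 650), ("chr2", 40, 90)]
    genes = [("chr1", 150, 600), ("chr2", 10, 50)]
    print(intersect(peaks, genes))
    # [('chr1', 150, 200), ('chr1', 500, 600), ('chr2', 40, 50)]
    ```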

  18. The Blackbird Whistling or Just After? Vygotsky's Tool and Sign as an Analytic for Writing

    ERIC Educational Resources Information Center

    Imbrenda, Jon-Philip

    2016-01-01

    Based on Vygotsky's theory of the interplay of the tool and sign functions of language, this study presents a textual analysis of a corpus of student-authored texts to illuminate aspects of development evidenced through the dialectical tension of tool and sign. Data were drawn from a series of reflective memos I authored during a seminar for new…

  1. A web service based tool to plan atmospheric research flights

    NASA Astrophysics Data System (ADS)

    Rautenhaus, M.; Bauer, G.; Dörnbrack, A.

    2011-09-01

    We present a web service based tool for the planning of atmospheric research flights. The tool provides online access to horizontal maps and vertical cross-sections of numerical weather prediction data and in particular allows the interactive design of a flight route in direct relation to the predictions. It thereby fills a crucial gap in the set of currently available tools for using data from numerical atmospheric models for research flight planning. A distinct feature of the tool is its lightweight, web service based architecture, requiring only commodity hardware and a basic Internet connection for deployment. Access to visualisations of prediction data is achieved by using an extended version of the Open Geospatial Consortium Web Map Service (WMS) standard, a technology that has gained increased attention in meteorology in recent years. With the WMS approach, we avoid the transfer of large forecast model output datasets while enabling on-demand generated visualisations of the predictions at campaign sites with limited Internet bandwidth. Usage of the Web Map Service standard also enables access to third-party sources of georeferenced data. We have implemented the software using the open-source programming language Python. In the present article, we describe the architecture of the tool. As an example application, we discuss a case study research flight planned for the scenario of the 2010 Eyjafjalla volcano eruption. Usage and implementation details are provided as Supplement.

  2. A web service based tool to plan atmospheric research flights

    NASA Astrophysics Data System (ADS)

    Rautenhaus, M.; Bauer, G.; Dörnbrack, A.

    2012-01-01

    We present a web service based tool for the planning of atmospheric research flights. The tool provides online access to horizontal maps and vertical cross-sections of numerical weather prediction data and in particular allows the interactive design of a flight route in direct relation to the predictions. It thereby fills a crucial gap in the set of currently available tools for using data from numerical atmospheric models for research flight planning. A distinct feature of the tool is its lightweight, web service based architecture, requiring only commodity hardware and a basic Internet connection for deployment. Access to visualisations of prediction data is achieved by using an extended version of the Open Geospatial Consortium Web Map Service (WMS) standard, a technology that has gained increased attention in meteorology in recent years. With the WMS approach, we avoid the transfer of large forecast model output datasets while enabling on-demand generated visualisations of the predictions at campaign sites with limited Internet bandwidth. Usage of the Web Map Service standard also enables access to third-party sources of georeferenced data. We have implemented the software using the open-source programming language Python. In the present article, we describe the architecture of the tool. As an example application, we discuss a case study research flight planned for the scenario of the 2010 Eyjafjalla volcano eruption. Usage and implementation details are provided as Supplement.
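
    The kind of on-demand GetMap request such a tool issues can be sketched with plain WMS query parameters; the endpoint and layer name below are hypothetical, while the parameters themselves are defined by the WMS 1.1.1 standard.

    ```python
    import requests

    # Hypothetical WMS endpoint and layer; any OGC-compliant server works.
    WMS_URL = "https://example.org/forecast/wms"

    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": "ecmwf_cloud_cover",   # assumed layer name
        "STYLES": "",
        "SRS": "EPSG:4326",
        "BBOX": "-30,40,20,70",          # lon/lat box (minx,miny,maxx,maxy)
        "WIDTH": "800", "HEIGHT": "480",
        "FORMAT": "image/png",
        "TRANSPARENT": "TRUE",
    }

    resp = requests.get(WMS_URL, params=params, timeout=30)
    resp.raise_for_status()
    with open("cloud_cover.png", "wb") as fh:
        fh.write(resp.content)  # horizontal map section for flight planning
    print("saved", len(resp.content), "bytes")
    ```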

  3. Research for research: tools for knowledge discovery and visualization.

    PubMed Central

    Van Mulligen, Erik M.; Van Der Eijk, Christiaan; Kors, Jan A.; Schijvenaars, Bob J. A.; Mons, Barend

    2002-01-01

    This paper describes a method to construct, from a set of documents, a spatial representation that can be used for information retrieval and knowledge discovery. The proposed method has been implemented in a prototype system and allows the researcher to browse, interactively and in real time, a network of relationships obtained from a set of full-text articles. These relationships are combined with the potential relationships between concepts as defined in the UMLS semantic network. The browser allows the user to select a seed term and find all related concepts, to find a path between concepts (hypothesis testing), and to retrieve the references to documents or database entries that support the relationship between concepts. PMID:12463942
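
    The path-between-concepts operation (hypothesis testing) is easy to sketch on a toy concept graph; the graph below is invented for illustration, not UMLS data.

    ```python
    import networkx as nx

    # Toy co-occurrence graph; edges carry the documents supporting them.
    g = nx.Graph()
    g.add_edge("magnesium", "vasodilation", docs=["PMID:111"])
    g.add_edge("vasodilation", "migraine", docs=["PMID:222", "PMID:333"])
    g.add_edge("magnesium", "calcium channels", docs=["PMID:444"])
    g.add_edge("calcium channels", "migraine", docs=["PMID:555"])

    # Hypothesis testing: is there a chain linking two seed concepts?
    path = nx.shortest_path(g, "magnesium", "migraine")
    print(" -> ".join(path))
    for a, b in zip(path, path[1:]):
        print(f"  {a} -- {b}: supported by {g[a][b]['docs']}")
    ```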

  4. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    PubMed Central

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  5. Managing complex research datasets using electronic tools: a meta-analysis exemplar.

    PubMed

    Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L

    2013-06-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.

  6. Tracking and Visualizing Student Effort: Evolution of a Practical Analytics Tool for Staff and Student Engagement

    ERIC Educational Resources Information Center

    Nagy, Robin

    2016-01-01

    There is an urgent need for our educational system to shift assessment regimes from a narrow, high-stakes focus on grades, to more holistic definitions that value the qualities that lifelong learners will need. The challenge for learning analytics in this context is to deliver actionable assessments of these hard-to-quantify qualities, valued by…

  7. Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools

    DTIC Science & Technology

    2014-01-14

    in database and data warehousing, data mining and machine learning, risk analysis and optimization, as well as applied analytics. Practitioners ... analyzing historical time series data to provide insights regarding future decisions. ... Data mining, which involves mining transactional databases ...

  8. Informatics tools to improve clinical research study implementation.

    PubMed

    Brandt, Cynthia A; Argraves, Stephanie; Money, Roy; Ananth, Gowri; Trocky, Nina M; Nadkarni, Prakash M

    2006-04-01

    There are numerous potential sources of problems when performing complex clinical research trials. These issues are compounded when studies are multi-site and multiple personnel from different sites are responsible for varying actions from case report form design to primary data collection and data entry. We describe an approach that emphasizes the use of a variety of informatics tools that can facilitate study coordination, training, data checks and early identification and correction of faulty procedures and data problems. The paper focuses on informatics tools that can help in case report form design, procedures and training and data management. Informatics tools can be used to facilitate study coordination and implementation of clinical research trials.

  9. A Tool for Mapping Research Skills in Undergraduate Curricula

    ERIC Educational Resources Information Center

    Fraser, Gillian A.; Crook, Anne C.; Park, Julian R.

    2007-01-01

    There has been considerable interest recently in the teaching of skills to undergraduate students. However, existing methods for collating data on how much, where and when students are taught and assessed skills have often been shown to be time-consuming and ineffective. Here, we outline an electronic research skills audit tool that has been…

  10. Measurement and Research Tools. Symposium 37. [AHRD Conference, 2001].

    ERIC Educational Resources Information Center

    2001

    This symposium on measurement and research tools consists of three presentations. "An Examination of the Multiple Intelligences Developmental Assessment Scales (MIDAS)" (Albert Wiswell et al.) explores MIDAS's psychometric saliency. Findings indicate this instrument represents an incomplete attempt to develop a valid assessment of…

  11. Research Tools and Materials | NCI Technology Transfer Center | TTC

    Cancer.gov

    Research Tools can be found in TTC's Available Technologies and in scientific publications. They are freely available to non-profits and universities through a Material Transfer Agreement (or other appropriate mechanism), and available via licensing to companies.

  12. Software Tools | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer from large-scale proteogenomic datasets and to advance them toward precision medicine. Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.

  13. Analyzing Online Teacher Networks: Cyber Networks Require Cyber Research Tools

    ERIC Educational Resources Information Center

    Schlager, Mark S.; Farooq, Umer; Fusco, Judith; Schank, Patricia; Dwyer, Nathan

    2009-01-01

    The authors argue that conceptual and methodological limitations in existing research approaches severely hamper theory building and empirical exploration of teacher learning and collaboration through cyber-enabled networks. They conclude that new frameworks, tools, and techniques are needed to understand and maximize the benefits of teacher…

  14. Evaluating the Performance of Calculus Classes Using Operational Research Tools.

    ERIC Educational Resources Information Center

    Soares de Mello, Joao Carlos C. B.; Lins, Marcos P. E.; Soares de Mello, Maria Helena C.; Gomes, Eliane G.

    2002-01-01

    Compares the efficiency of calculus classes and evaluates two kinds of classes: traditional and others that use computational methods in teaching. Applies quantitative evaluation methods using two operational research tools, multicriteria decision aid methods (mainly using the MACBETH approach) and data envelopment analysis. (Author/YDS)

  15. Facilitating Metacognitive Talk: A Research and Learning Tool

    ERIC Educational Resources Information Center

    Wall, Kate; Higgins, Steve

    2006-01-01

    This paper describes a research tool which aims to gather data about pupils' views of learning and teaching, with a particular focus on their thinking about their learning (metacognition). The approach has proved to be an adaptable and effective technique to examine different learning contexts from the pupils' perspective, while also acting as an…

  16. Narrative Inquiry: Research Tool and Medium for Professional Development.

    ERIC Educational Resources Information Center

    Conle, Carola

    2000-01-01

    Describes the development of narrative inquiry, highlighting one institutional setting, and discussing how narrative inquiry moved from being a research tool to a vehicle for curriculum within both graduate and preservice teacher development. After discussing theoretical resources for narrative inquiry, the paper examines criteria and terms…

  20. The DISTANCE model for collaborative research: distributing analytic effort using scrambled data sets.

    PubMed

    Moffet, Howard H; Warton, E Margaret; Parker, Melissa M; Liu, Jennifer Y; Lyles, Courtney R; Karter, Andrew J

    Data sharing is encouraged to fulfill the ethical responsibility to transform research data into public health knowledge, but it carries risks of improper disclosure and potential harm from release of individually identifiable data. The study objective was to develop and implement a novel method for scientific collaboration and data sharing which distributes the analytic burden while protecting patient privacy. A procedure was developed wherein an investigator who is external to an analytic coordinating center (ACC) can conduct original research following a protocol governed by a Publications and Presentations (P&P) Committee. The collaborating investigator submits a study proposal and, if approved, develops the analytic specifications using existing data dictionaries and templates. An original data set is prepared according to the specifications, and the external investigator is provided with a complete but de-identified and shuffled data set which retains all key data fields but obfuscates individually identifiable data and patterns; this "scrambled data set" provides a "sandbox" for the external investigator to develop and test analytic code. The analytic code is then run against the original data at the ACC to generate output, which the external investigator uses to prepare a manuscript for journal submission. The method has been successfully used with collaborators to produce many published papers and conference reports. By distributing the analytic burden, this method can facilitate collaboration and expand analytic capacity, resulting in more science for less money.
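
    The core "scrambling" idea can be sketched in a few lines, assuming a simple tabular data set: each column is permuted independently, so marginal distributions and data types survive for code development while cross-field patterns that could identify an individual are destroyed. The field names and values below are invented for illustration, not the DISTANCE schema.

      import numpy as np
      import pandas as pd

      def scramble(df: pd.DataFrame, seed: int = 0) -> pd.DataFrame:
          """Shuffle each column independently of the others."""
          rng = np.random.default_rng(seed)
          out = df.copy()
          for col in out.columns:
              out[col] = rng.permutation(out[col].to_numpy())
          return out

      # Hypothetical patient table for illustration only.
      original = pd.DataFrame({
          "age":    [34, 61, 47, 29],
          "hba1c":  [7.1, 8.4, 6.9, 7.8],
          "smoker": [0, 1, 0, 1],
      })
      sandbox = scramble(original)
      # Analytic code is developed and debugged against `sandbox`;
      # the coordinating center then runs it against `original`.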

  1. Adsorptive micro-extraction techniques--novel analytical tools for trace levels of polar solutes in aqueous media.

    PubMed

    Neng, N R; Silva, A R M; Nogueira, J M F

    2010-11-19

    A novel enrichment technique, adsorptive μ-extraction (AμE), is proposed for trace analysis of polar solutes in aqueous media. The preparation, stability tests and development of the analytical devices using two geometrical configurations, i.e. bar adsorptive μ-extraction (BAμE) and multi-spheres adsorptive μ-extraction (MSAμE), are fully discussed. Of the several sorbent materials tested, activated carbons and polystyrene-divinylbenzene phases demonstrated the best stability and robustness and proved the most suitable for analytical purposes. Both BAμE and MSAμE devices showed remarkable performance for the determination of trace levels of polar solutes and metabolites (e.g. pesticides, disinfection by-products, drugs of abuse and pharmaceuticals) in water matrices and biological fluids. Compared with stir bar sorptive extraction based on a polydimethylsiloxane phase, the AμE techniques attain great effectiveness and overcome the limitations of the latter enrichment approach for the more polar solutes. Furthermore, convenient sensitivity and selectivity are reached with AμE techniques, since the great advantage of this new analytical technology is the possibility of choosing the sorbent best suited to each particular type of application. The proposed enrichment techniques are cost-effective, easy to prepare and work up, and robust, and they constitute a remarkable analytical tool for trace analysis of priority solutes in areas of recognized importance such as the environmental, forensic and other related life sciences.

  2. Analytic model for academic research productivity having factors, interactions and implications

    PubMed Central

    2011-01-01

    Financial support is dear in academia and will tighten further. How can the research mission be accomplished within new restraints? A model is presented for evaluating source components of academic research productivity. It comprises six factors: funding; investigator quality; efficiency of the research institution; the research mix of novelty, incremental advancement, and confirmatory studies; analytic accuracy; and passion. Their interactions produce output and patterned influences between factors. Strategies for optimizing output are enabled. PMID:22130145

  3. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: an anticipated analytical tool for food safety.

    PubMed

    Hervás, Miriam; López, Miguel Angel; Escarpa, Alberto

    2009-10-27

    In this work, an electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme is based on a direct competitive format in which antibody-coated magnetic beads are employed as the immobilisation support and horseradish peroxidase (HRP) is used as the enzymatic label. Amperometric detection is achieved through the addition of hydrogen peroxide substrate with hydroquinone as mediator. The analytical performance of the electrochemical immunoassay has been evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 µg L⁻¹ and an EC₅₀ of 0.079 µg L⁻¹ were obtained, demonstrating the assay's suitability for detecting the mycotoxin zearalenone. In addition, excellent accuracy was obtained, with recovery yields ranging between 95% and 108%. These analytical features show the proposed electrochemical immunoassay to be a powerful and timely screening tool for food safety.
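
    The EC₅₀ reported above is conventionally estimated from a four-parameter logistic fit to the competitive-immunoassay calibration curve; the abstract does not specify the authors' fitting procedure, so the sketch below shows that standard approach with entirely synthetic calibration points.

      import numpy as np
      from scipy.optimize import curve_fit

      def four_pl(c, top, bottom, ec50, slope):
          """Four-parameter logistic: in a competitive format the signal
          falls from `top` toward `bottom` as analyte concentration rises."""
          return bottom + (top - bottom) / (1.0 + (c / ec50) ** slope)

      # Synthetic zearalenone standards (ug/L) and normalized signals.
      conc   = np.array([0.001, 0.01, 0.05, 0.1, 0.5, 1.0])
      signal = np.array([1.00, 0.95, 0.70, 0.45, 0.15, 0.08])

      popt, _ = curve_fit(four_pl, conc, signal, p0=[1.0, 0.05, 0.08, 1.0])
      print("EC50 estimate: %.3f ug/L" % popt[2])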

  4. Redundancy in neutron activation analysis: A valuable tool in assuring analytical quality

    SciTech Connect

    Greenberg, R.R.

    1996-12-31

    Neutron activation analysis (NAA) has become widely used and is extremely valuable for the certification of standard reference materials (SRMs) at the National Institute of Standards and Technology (NIST), for a number of reasons. First, NAA has essentially no significant sources of error in common with the other analytical techniques used at NIST to measure inorganic concentrations. This is important because most certified elemental concentrations are derived from data determined by two (and occasionally more) independent analytical techniques. Two or more techniques are used for SRM certification because, although each technique has previously been evaluated and shown to be accurate, unexpected problems can arise, especially when analyzing new matrices. Another reason for the use of NAA for SRM certification is the potential of this technique for accuracy. SRM measurements with estimated accuracies of 1 to 2% (at essentially 95% confidence intervals) are routinely made at NIST using NAA.
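
    The certification logic described above, combining results from independent techniques, can be illustrated with an inverse-variance weighted mean; NIST's actual certification statistics are more elaborate, and the values below are hypothetical.

      import numpy as np

      def combine(values, uncertainties):
          """Inverse-variance weighted mean of independent measurements
          and the standard uncertainty of the combined value."""
          v = np.asarray(values, dtype=float)
          u = np.asarray(uncertainties, dtype=float)
          w = 1.0 / u**2
          mean = np.sum(w * v) / np.sum(w)
          sigma = np.sqrt(1.0 / np.sum(w))
          return mean, sigma

      # Hypothetical NAA and ICP-MS results for one element (mg/kg).
      mean, sigma = combine([10.2, 10.5], [0.15, 0.20])
      print(f"combined value: {mean:.2f} +/- {sigma:.2f} mg/kg")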

  5. DNA-only cascade: a universal tool for signal amplification, enhancing the detection of target analytes.

    PubMed

    Bone, Simon M; Hasick, Nicole J; Lima, Nicole E; Erskine, Simon M; Mokany, Elisa; Todd, Alison V

    2014-09-16

    Diagnostic tests performed in the field or at the site of patient care would benefit from using a combination of inexpensive, stable chemical reagents and simple instrumentation. Here, we have developed a universal "DNA-only Cascade" (DoC) to quantitatively detect target analytes with increased speed. The DoC utilizes quasi-circular structures consisting of temporarily inactivated deoxyribozymes (DNAzymes). The catalytic activity of the DNAzymes is restored in a universal manner in response to a broad range of environmental and biological targets. The present study demonstrates DNAzyme activation in the presence of metal ions (Pb²⁺), small molecules (deoxyadenosine triphosphate) and nucleic acids homologous to genes from meningitis-causing bacteria. Furthermore, DoC efficiently discriminates nucleic acid targets differing by a single nucleotide. When detection of analytes is orchestrated by functional nucleic acids, the inclusion of DoC reagents substantially decreases time to detection and allows analyte quantification. The detection of nucleic acids using DoC was further characterized for its capability to be multiplexed and to retain functionality following long-term exposure to ambient temperatures and in a background of complex medium (human serum).

  6. Benchmarking biology research organizations using a new, dedicated tool.

    PubMed

    van Harten, Willem H; van Bokhorst, Leonard; van Luenen, Henri G A M

    2010-02-01

    International competition forces fundamental research organizations to assess their relative performance. We present a benchmark tool for scientific research organizations in which, contrary to existing models, the group leader is placed in a central position within the organization. We used it in a pilot benchmark study involving six research institutions. Our study shows that data collection and data comparison based on this new tool can be achieved. It proved possible to compare relative performance and organizational characteristics and to generate suggestions for improvement for most participants. However, strict definitions of the parameters used for the benchmark and a thorough insight into the organization of each of the benchmark partners are required to produce comparable data and draw firm conclusions.

  7. Echocardiography as a Research and Clinical Tool in Veterinary Medicine

    PubMed Central

    Allen, D. G.

    1982-01-01

    Echocardiography is the accepted term for the study of cardiac ultrasound. Although a relatively new tool for the study of the heart in man it has already found wide acceptance in the area of cardiac research and in the study of clinical cardiac disease. Animals had often been used in the early experiments with cardiac ultrasound, but only recently has echocardiography been used as a research and clinical tool in veterinary medicine. In this report echocardiography is used in the research of anesthetic effects on ventricular function and clinically in the diagnosis of congestive cardiomyopathy in a cat, ventricular septal defect in a calf, and pericardial effusion in a dog. Echocardiography is now an important adjunct to the field of veterinary cardiology. PMID:17422196

  8. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based Graduate-Level Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Toh, Chee-Seng

    2007-01-01

    A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.

  10. Technical phosphoproteomic and bioinformatic tools useful in cancer research

    PubMed Central

    2011-01-01

    Reversible protein phosphorylation is one of the most important forms of cellular regulation. Thus, phosphoproteomic analysis of protein phosphorylation in cells is a powerful tool to evaluate cell functional status. The importance of protein kinase-regulated signal transduction pathways in human cancer has led to the development of drugs that inhibit protein kinases at the apex or intermediary levels of these pathways. Phosphoproteomic analysis of these signalling pathways will provide important insights into their operation and connectivity, facilitating identification of the best targets for cancer therapies. Enrichment of phosphorylated proteins or peptides from tissue or bodily-fluid samples is required. The application of technologies such as phosphopeptide enrichment and mass spectrometry (MS), coupled to bioinformatics tools, is crucial for the identification and quantification of protein phosphorylation sites and for advancing such clinically relevant research. A combination of different phosphopeptide enrichments, quantitative techniques and bioinformatic tools is necessary to obtain reliable phospho-regulation data and sound structural analysis in protein studies. The most current and useful proteomics and bioinformatics techniques are explained with research examples. Our aim in this article is to aid cancer research by detailing these proteomics and bioinformatics tools. PMID:21967744

  11. BALLView: a tool for research and education in molecular modeling.

    PubMed

    Moll, Andreas; Hildebrandt, Andreas; Lenhof, Hans-Peter; Kohlbacher, Oliver

    2006-02-01

    We present BALLView, a molecular viewer and modeling tool. It combines state-of-the-art visualization capabilities with powerful modeling functionality including implementations of force field methods and continuum electrostatics models. BALLView is a versatile and extensible tool for research in structural bioinformatics and molecular modeling. Furthermore, the convenient and intuitive graphical user interface offers novice users direct access to the full functionality, rendering it ideal for teaching. Through an interface to the object-oriented scripting language Python it is easily extensible.

  12. Infrared Spectroscopy as a Versatile Analytical Tool for the Quantitative Determination of Antioxidants in Agricultural Products, Foods and Plants

    PubMed Central

    Cozzolino, Daniel

    2015-01-01

    Spectroscopic methods provide very useful qualitative and quantitative information about the biochemistry and chemistry of antioxidants. Near infrared (NIR) and mid infrared (MIR) spectroscopy are powerful, fast, accurate and non-destructive analytical tools that can replace traditional chemical analysis. In recent years, several reports in the literature have demonstrated the usefulness of these methods in the analysis of antioxidants in different organic matrices. This article reviews recent applications of infrared (NIR and MIR) spectroscopy in the analysis of antioxidant compounds in a wide range of samples such as agricultural products, foods and plants. PMID:26783838

  13. Infrared Spectroscopy as a Versatile Analytical Tool for the Quantitative Determination of Antioxidants in Agricultural Products, Foods and Plants.

    PubMed

    Cozzolino, Daniel

    2015-07-02

    Spectroscopic methods provide very useful qualitative and quantitative information about the biochemistry and chemistry of antioxidants. Near infrared (NIR) and mid infrared (MIR) spectroscopy are powerful, fast, accurate and non-destructive analytical tools that can replace traditional chemical analysis. In recent years, several reports in the literature have demonstrated the usefulness of these methods in the analysis of antioxidants in different organic matrices. This article reviews recent applications of infrared (NIR and MIR) spectroscopy in the analysis of antioxidant compounds in a wide range of samples such as agricultural products, foods and plants.

  14. The use of Permeation Liquid Membrane (PLM) as an analytical tool for trace metal speciation studies in natural waters

    NASA Astrophysics Data System (ADS)

    Parthasarathy, N.; Pelletier, M.; Buffle, J.

    2003-05-01

    Permeation liquid membrane (PLM), based on liquid-liquid extraction principles, is an emerging analytical tool for making in situ trace metal speciation measurements. A PLM comprising a didecyl-1,10-diaza-crown ether and lauric acid in phenylhexane/toluene has been developed for measuring free metal ion concentrations (e.g. Cu, Pb, Cd and Zn) under natural-water conditions. The capability of the PLM for speciation studies has been demonstrated using synthetic and natural ligands. Applications to the in situ preconcentration of trace metals in diverse waters using a specially designed hollow-fibre PLM are reported.

  15. Analytical tool for risk assessment of landscape and urban planning: Spatial development impact assessment

    NASA Astrophysics Data System (ADS)

    Rehak, David; Senovsky, Michail; Balog, Karol; Dvorak, Jiri

    2011-06-01

    This article covers the issue of preventive protection of population, technical infrastructure, and the environment against adverse impacts of careless spatial development. In the first section, we describe the relationship between sustainable development and spatial development. This discussion is followed by a review of the current state of spatial development security, primarily at a national level in the Czech Republic. The remainder of the paper features our original contribution which is a tool for risk assessment in landscape and urban planning, the Spatial Development Impact Assessment (SDIA) tool. We briefly review the most significant semi-quantitative methods of risk analysis that were used as a starting point in implementing the tool, and we discuss several of SDIA's salient features, namely, the assessment process algorithm, the catalogue of hazard and asset groups, and the spatial development impact matrix.

  16. The use of analytical surface tools in the fundamental study of wear. [atomic nature of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    Various techniques and surface tools available for the study of the atomic nature of the wear of materials are reviewed. These include chemical etching, X-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear-surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  17. The use of analytical surface tools in the fundamental study of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    This paper reviews the various techniques and surface tools available for the study of the atomic nature of the wear of materials. These include chemical etching, X-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear-surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  18. Petasites hybridus: a tool for interdisciplinary research in phytotherapy.

    PubMed

    Debrunner, B; Meier, B

    1998-02-01

    The 3rd Petasites gathering took place in Romanshorn, Switzerland on March 29, 1996 and gave 16 European scientists the opportunity to present their latest discoveries to interested researchers working in different scientific disciplines such as pharmacognosy, botany, chemistry, pharmacology, medicine and clinical pharmacy. The newest findings on Petasites hybridus as a significant plant drug showed very promising therapeutic utility. Great progress has been made in chemical analytical methods and the determination of pharmacological activities. Substantial advances have also occurred in the production of bioassay procedures and plant materials, particularly utilizing cell- and tissue-culture techniques.

  19. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    PubMed

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance to QbD implementation. To show a possible solution, this work proposes a rapid process development method that uses direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic processing of Ginkgo biloba L. as an example. Breakthrough curves were rapidly determined at-line by DART-MS. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and by HPLC. Based on the PAT tool, the impacts of process parameters on adsorption capacity were rapidly identified; adsorption capacity decreased as the flow rate increased. This work shows the feasibility and advantages of integrating PAT into QbD implementation for rapid process development.

  20. INTRODUCTION TO ATTILA (ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS): AN ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, wat...

  2. Microfluidics as a tool for C. elegans research.

    PubMed

    San-Miguel, Adriana; Lu, Hang

    2013-09-24

    Microfluidics has emerged as a set of powerful tools that have greatly advanced some areas of biological research, including research using C. elegans. The use of microfluidics has enabled many experiments that are otherwise impossible with conventional methods. Today there are many examples that demonstrate the main advantages of using microfluidics for C. elegans research, achieving precise environmental conditions and facilitating worm handling. Examples range from behavioral analysis under precise chemical or odor stimulation, locomotion studies in well-defined structural surroundings, and even long-term culture on chip. Moreover, microfluidics has enabled coupling worm handling and imaging thus facilitating genetic screens, optogenetic studies, and laser ablation experiments. In this article, we review some of the applications of microfluidics for C. elegans research and provide guides for the design, fabrication, and use of microfluidic devices for C. elegans research studies.

  3. ICL-Based OF-CEAS: A Sensitive Tool for Analytical Chemistry.

    PubMed

    Manfred, Katherine M; Hunter, Katharine M; Ciaffoni, Luca; Ritchie, Grant A D

    2017-01-03

    Optical-feedback cavity-enhanced absorption spectroscopy (OF-CEAS) using mid-infrared interband cascade lasers (ICLs) is a sensitive technique for trace gas sensing. The setup of a V-shaped optical cavity operating with a 3.29 μm cw ICL is detailed, and a quantitative characterization of the injection efficiency, locking stability, mode matching, and detection sensitivity is presented. The experimental data are supported by a model showing how optical feedback affects the laser frequency as it is scanned across several longitudinal modes of the optical cavity. The model predicts that feedback enhancement effects under strongly absorbing conditions can cause underestimates of the measured absorption, and these predictions are verified experimentally. The technique is then applied to the detection of nitrous oxide as an exemplar of its utility for analytical gas-phase spectroscopy. The analytical performance of the spectrometer, expressed as a noise-equivalent absorption coefficient, was estimated as 4.9 × 10⁻⁹ cm⁻¹ Hz⁻¹/², which compares well with recently reported values.

  4. Electrochemical treatment of olive mill wastewater: treatment extent and effluent phenolic compounds monitoring using some uncommon analytical tools.

    PubMed

    Belaid, Chokri; Khadraoui, Moncef; Mseddii, Salma; Kallel, Monem; Elleuch, Boubaker; Fauvarque, Jean Frangois

    2013-01-01

    Problems related to industrial effluents can be divided into two parts: (1) their toxicity, associated with a chemical content which should be removed before discharging the wastewater into the receiving media; and (2) the difficulty of characterising and monitoring the pollution, caused by the complexity of these matrices. This investigation deals with both aspects: an electrochemical treatment method for olive mill wastewater (OMW) on platinized expanded-titanium electrodes in a modified Grignard reactor for toxicity removal, and the exploration of some specific analytical tools to monitor the elimination of phenolic compounds from the effluent. The results showed that electrochemical oxidation is able to remove or mitigate the OMW pollution: 87% of the OMW colour was removed and all aromatic compounds disappeared from the solution by anodic oxidation. Moreover, the chemical oxygen demand (COD) and the total organic carbon (TOC) were reduced by 55%. UV-visible spectrophotometry, gas chromatography/mass spectrometry, cyclic voltammetry and 13C nuclear magnetic resonance (NMR) showed that the treatment efficiently eliminates phenolic compounds from OMW. It was concluded that electrochemical oxidation in a modified Grignard reactor is a promising process for the destruction of all phenolic compounds present in OMW. Among the monitoring tools applied, cyclic voltammetry and 13C NMR are introduced here for the first time to follow the progress of OMW treatment, and they gave close insight into the disappearance of polyphenols.

  5. Can the analyte-triggered asymmetric autocatalytic Soai reaction serve as a universal analytical tool for measuring enantiopurity and assigning absolute configuration?

    PubMed

    Welch, Christopher J; Zawatzky, Kerstin; Makarov, Alexey A; Fujiwara, Satoshi; Matsumoto, Arimasa; Soai, Kenso

    2016-12-20

    An investigation is reported on the use of the autocatalytic enantioselective Soai reaction, known to be influenced by the presence of a wide variety of chiral materials, as a generic tool for measuring the enantiopurity and absolute configuration of any substance. Good generality for the reaction across a small group of test analytes was observed, consistent with literature reports suggesting a diversity of compound types that can influence the stereochemical outcome of this reaction. Some trends in the absolute sense of stereochemical enrichment were noted, suggesting the possible utility of the approach for assigning absolute configuration to unknown compounds, by analogy to closely related species with known outcomes. Considerable variation was observed in the triggering strength of different enantiopure materials, an undesirable characteristic when dealing with mixtures containing minor impurities with strong triggering strength in the presence of major components with weak triggering strength. A strong tendency of the reaction toward an 'all or none' type of behavior makes the reaction most sensitive for detecting enantioenrichment close to zero. Consequently, the ability to discern modest from excellent enantioselectivity was relatively poor. While these properties limit the ability to obtain precise enantiopurity measurements in a simple single addition experiment, prospects may exist for more complex experimental setups that may potentially offer improved performance.

  6. [Study monitoring: a useful tool for quality health research].

    PubMed

    Arias Valencia, Samuel Andrés; Hernández Pinzón, Giovanna

    2009-05-01

    As well as protecting the rights of participants, a study's ethics must encompass the quality of its execution. As such, international standards have been established for studies involving human subjects. The objective of this review is to evaluate the usefulness of the Guide to Good Clinical Practice and of "study monitoring" as tools for producing quality research. The Guide provides scientific, ethical and quality standards for designing, conducting, recording, and reporting studies involving human subjects. By implementing specific processes and procedures, study monitoring seeks to ensure that research is followed and evaluated from inception through execution and closure, thus producing studies with high quality standards.

  7. ELISA and GC-MS as Teaching Tools in the Undergraduate Environmental Analytical Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Wilson, Ruth I.; Mathers, Dan T.; Mabury, Scott A.; Jorgensen, Greg M.

    2000-12-01

    An undergraduate experiment for the analysis of potential water pollutants is described. Students are exposed to two complementary techniques, ELISA and GC-MS, for the analysis of a water sample containing atrazine, desethylatrazine, and simazine. Atrazine was chosen as the target analyte because of its wide usage in North America and its utility for students to predict environmental degradation products. The water sample is concentrated using solid-phase extraction for GC-MS, or diluted and analyzed using a competitive ELISA test kit for atrazine. The nature of the water sample is such that students generally find that ELISA gives an artificially high value for the concentration of atrazine. Students gain an appreciation for problems associated with measuring pollutants in the aqueous environment: sensitivity, accuracy, precision, and ease of analysis. This undergraduate laboratory provides an opportunity for students to learn several new analysis and sample preparation techniques and to critically evaluate these methods in terms of when they are most useful.

  8. Tools for Large-Scale Data Analytic Examination of Relational and Epistemic Networks in Engineering Education

    ERIC Educational Resources Information Center

    Madhavan, Krishna; Johri, Aditya; Xian, Hanjun; Wang, G. Alan; Liu, Xiaomo

    2014-01-01

    The proliferation of digital information technologies and related infrastructure has given rise to novel ways of capturing, storing and analyzing data. In this paper, we describe the research and development of an information system called Interactive Knowledge Networks for Engineering Education Research (iKNEER). This system utilizes a framework…

  9. FOSS Tools for Research Infrastructures - A Success Story?

    NASA Astrophysics Data System (ADS)

    Stender, V.; Schroeder, M.; Wächter, J.

    2015-12-01

    Established initiatives and mandated organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. The basic idea behind these infrastructures is the provision of services supporting scientists to search, visualize and access data, to collaborate and exchange information, as well as to publish data and other results. Especially the management of research data is gaining more and more importance. In geosciences these developments have to be merged with the enhanced data management approaches of Spatial Data Infrastructures (SDI). The Centre for GeoInformation Technology (CeGIT) at the GFZ German Research Centre for Geosciences has the objective of establishing concepts and standards of SDIs as an integral part of research infrastructure architectures. In different projects, solutions to manage research data for land- and water management or environmental monitoring have been developed based on a framework consisting of Free and Open Source Software (FOSS) components. The framework provides basic components supporting the import and storage of data, discovery and visualization as well as data documentation (metadata). In our contribution, we present our data management solutions developed in three projects, Central Asian Water (CAWa), Sustainable Management of River Oases (SuMaRiO) and Terrestrial Environmental Observatories (TERENO), where FOSS components build the backbone of the data management platform. The multiple use and validation of tools helped to establish a standardized architectural blueprint serving as a contribution to research infrastructures. We examine the question of whether FOSS tools are really a sustainable choice and whether the increased maintenance effort is justified. Finally, this should help answer the question of whether the use of FOSS for research infrastructures is a success story.

  10. Analytical aerodynamic model of a high alpha research vehicle wind-tunnel model

    NASA Technical Reports Server (NTRS)

    Cao, Jichang; Garrett, Frederick, Jr.; Hoffman, Eric; Stalford, Harold

    1990-01-01

    A 6 DOF analytical aerodynamic model of a high alpha research vehicle is derived. The derivation is based on wind-tunnel model data valid in the altitude-Mach flight envelope centered at 15,000 ft altitude and 0.6 Mach number, with a Mach range between 0.3 and 0.9. The analytical models of the aerodynamic coefficients are nonlinear functions of alpha with all control variables and other states fixed. Interpolation is required between the parameterized nonlinear functions. The lift and pitching moment coefficients have unsteady flow parts due to the time rate of change of angle of attack (alpha dot). The analytical models are plotted and compared with their corresponding wind-tunnel data. Piloted simulated maneuvers of the wind-tunnel model are used to evaluate the analytical model. The maneuvers considered are pitch-ups, 360 degree loaded and unloaded rolls, turn reversals, split S's, and level turns. The evaluation finds that (1) the analytical model is a good representation at Mach 0.6, (2) the longitudinal part is good for the Mach range 0.3 to 0.9, and (3) the lateral part is good for Mach numbers between 0.6 and 0.9. The computer simulations show that the storage requirement of the analytical model is about one tenth that of the wind-tunnel model and that it runs twice as fast.
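
    The interpolation step described above can be sketched as follows, assuming hypothetical CL(alpha) fits tabulated at three Mach breakpoints; the functional forms and coefficients are invented for illustration and are not the wind-tunnel fits.

      import numpy as np

      # Invented CL(alpha) parameterizations at three Mach breakpoints.
      CL_TABLE = {
          0.3: lambda a: 2 * np.pi * a * (1.0 - 0.30 * a),
          0.6: lambda a: 2 * np.pi * a * (1.0 - 0.35 * a),
          0.9: lambda a: 2 * np.pi * a * (1.0 - 0.45 * a),
      }

      def lift_coefficient(alpha_rad, mach):
          """Linear interpolation in Mach (within 0.3-0.9) between the
          two bracketing parameterized alpha-functions."""
          machs = sorted(CL_TABLE)
          lo = max(m for m in machs if m <= mach)
          hi = min(m for m in machs if m >= mach)
          if lo == hi:
              return CL_TABLE[lo](alpha_rad)
          t = (mach - lo) / (hi - lo)
          return (1 - t) * CL_TABLE[lo](alpha_rad) + t * CL_TABLE[hi](alpha_rad)

      print(lift_coefficient(np.radians(10.0), 0.45))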

  11. A web service based tool to plan atmospheric research flights

    NASA Astrophysics Data System (ADS)

    Rautenhaus, M.; Dörnbrack, A.

    2012-04-01

    We present a web service based tool for the planning of atmospheric research flights. The tool, which we call the "Mission Support System" (MSS), provides online access to horizontal maps and vertical cross-sections of numerical weather prediction data and in particular allows the interactive design of a flight route in direct relation to the predictions. It thereby fills a crucial gap in the set of currently available tools for using data from numerical atmospheric models for research flight planning. A distinct feature of the tool is its lightweight, web service based architecture, requiring only commodity hardware and a basic Internet connection for deployment. Access to visualisations of prediction data is achieved by using an extended version of the Open Geospatial Consortium Web Map Service (WMS) standard. With the WMS approach, we avoid the transfer of large forecast model output datasets while enabling on-demand generated visualisations of the predictions at campaign sites with limited Internet bandwidth. Usage of the Web Map Service standard also enables access to third-party sources of georeferenced data. The MSS is focused on the primary needs of mission scientists responsible for planning a research flight, addressing in particular the following requirements: (1) interactive exploration of available atmospheric forecasts, (2) interactive flight planning in relation to these forecasts, (3) computation of expected flight performance to assess the technical feasibility (in terms of total distance and vertical profile) of a flight, (4) no transfer of large forecast data files to the campaign site to allow deployment at remote locations and (5) low demand on hardware resources. We have implemented the software using the open-source programming language Python.
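
    The WMS approach described above can be sketched with a single GetMap request: the keys below are standard OGC WMS 1.1.1 parameters, so only a rendered image travels over the campaign site's limited link rather than the raw forecast files. The endpoint and layer name are hypothetical.

      import requests

      WMS_URL = "http://forecast.example.org/mss/wms"  # hypothetical endpoint

      params = {
          "SERVICE": "WMS",
          "VERSION": "1.1.1",
          "REQUEST": "GetMap",
          "LAYERS": "ecmwf.horizontal_wind",  # hypothetical layer name
          "STYLES": "",
          "SRS": "EPSG:4326",
          "BBOX": "-30,40,20,70",             # lon_min,lat_min,lon_max,lat_max
          "WIDTH": 800,
          "HEIGHT": 600,
          "FORMAT": "image/png",
      }

      response = requests.get(WMS_URL, params=params, timeout=60)
      response.raise_for_status()
      with open("forecast_map.png", "wb") as f:
          f.write(response.content)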

  12. The Mosquito Online Advanced Analytic Service: a case study for school research projects in Thailand.

    PubMed

    Wongkoon, Siriwan; Jaroensutasinee, Mullica; Jaroensutasinee, Krisanadej

    2013-07-04

    The Mosquito Online Advanced Analytic Service (MOAAS) provides an essential tool for querying, analyzing, and visualizing patterns of mosquito larval distribution in Thailand. The MOAAS was developed using Structured Query Language (SQL) technology as a web-based tool for data entry and data access, webMathematica technology for data analysis and visualization, and Google Earth and Google Maps for Geographic Information System (GIS) visualization. Fifteen selected schools in Thailand provided test data for MOAAS. Users entered data through the web service and carried out analysis and visualization with webMathematica, producing bar charts, mosquito larval indices, and three-dimensional (3D) bar charts overlaid on Google Earth and Google Maps. The 3D bar charts of the number of mosquito larvae were displayed along with spatial information. The mosquito larvae information may be useful for dengue control efforts and for health service communities planning operational activities.
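
    The larval indices mentioned above are, in standard dengue surveillance practice, the Stegomyia indices; assuming these are what MOAAS reports, a minimal sketch with invented survey numbers follows.

      def larval_indices(houses_inspected, houses_positive,
                         containers_inspected, containers_positive):
          """House, Container and Breteau indices used in dengue surveillance."""
          house_index = 100.0 * houses_positive / houses_inspected
          container_index = 100.0 * containers_positive / containers_inspected
          breteau_index = 100.0 * containers_positive / houses_inspected
          return house_index, container_index, breteau_index

      # Invented school-survey numbers, not actual MOAAS data.
      hi, ci, bi = larval_indices(120, 18, 540, 31)
      print(f"HI={hi:.1f}%  CI={ci:.1f}%  BI={bi:.1f}")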

  13. Artificial neural networks in biology and chemistry: the evolution of a new analytical tool.

    PubMed

    Cartwright, Hugh M

    2008-01-01

    Once regarded as an eccentric and unpromising algorithm for the analysis of scientific data, the neural network has been developed in the last decade into a powerful computational tool. Its use now spans all areas of science, from the physical sciences and engineering to the life sciences and allied subjects. Applications range from the assessment of epidemiological data and the deconvolution of spectra to highly practical devices such as the electronic nose. This introductory chapter considers briefly the growth in the use of neural networks and provides some general background in preparation for the more detailed chapters that follow.

  14. Mineotaur: a tool for high-content microscopy screen sharing and visual analytics.

    PubMed

    Antal, Bálint; Chessel, Anatole; Carazo Salas, Rafael E

    2015-12-17

    High-throughput/high-content microscopy-based screens are powerful tools for functional genomics, yielding intracellular information down to the level of single cells for thousands of genotypic conditions. However, accessing their data requires specialized knowledge, and most often that data is no longer analyzed after initial publication. We describe Mineotaur (http://www.mineotaur.org), an open-source, downloadable web application that allows easy online sharing and interactive visualisation of large screen datasets, facilitating their dissemination and further analysis, and enhancing their impact.

  15. Web-based analytical tools for the exploration of spatial data

    NASA Astrophysics Data System (ADS)

    Anselin, Luc; Kim, Yong Wook; Syabri, Ibnu

    This paper deals with the extension of internet-based geographic information systems with functionality for exploratory spatial data analysis (ESDA). The specific focus is on methods to identify and visualize outliers in maps of rates or proportions. Three sets of methods are included: extreme value maps, smoothed rate maps and the Moran scatterplot. The implementation is carried out by means of a collection of Java classes that extend the Geotools open-source mapping software toolkit. The web-based spatial analysis tools are illustrated with applications to the study of homicide rates and cancer rates in U.S. counties.
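
    As a sketch of the third method: a Moran scatterplot plots the standardized variable against its spatial lag under a row-standardized weights matrix, and the slope of a least-squares line through the points equals Moran's I. The toy data below are invented.

      import numpy as np

      def moran_scatterplot_data(y, W):
          """Return the scatterplot axes (z, spatial lag of z) and Moran's I."""
          z = (y - y.mean()) / y.std()
          Wr = W / W.sum(axis=1, keepdims=True)  # row-standardize weights
          lag = Wr @ z
          morans_i = (z @ lag) / (z @ z)         # equals the regression slope
          return z, lag, morans_i

      # Four areas on a line, rook contiguity.
      y = np.array([10.0, 12.0, 20.0, 22.0])
      W = np.array([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)
      z, lag, I = moran_scatterplot_data(y, W)
      print("Moran's I:", round(I, 3))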

  16. Island Explorations: Discovering Effects of Environmental Research-Based Lab Activities on Analytical Chemistry Students

    ERIC Educational Resources Information Center

    Tomasik, Janice Hall; LeCaptain, Dale; Murphy, Sarah; Martin, Mary; Knight, Rachel M.; Harke, Maureen A.; Burke, Ryan; Beck, Kara; Acevedo-Polakovich, I. David

    2014-01-01

    Motivating students in analytical chemistry can be challenging, in part because of the complexity and breadth of topics involved. Some methods that help encourage students and convey real-world relevancy of the material include incorporating environmental issues, research-based lab experiments, and service learning projects. In this paper, we…

  18. The Effects of Incentives on Workplace Performance: A Meta-Analytic Review of Research Studies

    ERIC Educational Resources Information Center

    Condly, Steven J.; Clark, Richard E.; Stolovitch, Harold D.

    2003-01-01

    A meta-analytic review of all adequately designed field and laboratory research on the use of incentives to motivate performance is reported. Of approximately 600 studies, 45 qualified. The overall average effect of all incentive programs in all work settings and on all work tasks was a 22% gain in performance. Team-directed incentives had a…

  19. The Nature and Effects of Transformational School Leadership: A Meta-Analytic Review of Unpublished Research

    ERIC Educational Resources Information Center

    Leithwood, Kenneth; Sun, Jingping

    2012-01-01

    Background: Using meta-analytic review techniques, this study synthesized the results of 79 unpublished studies about the nature of transformational school leadership (TSL) and its impact on the school organization, teachers, and students. This corpus of research associates TSL with 11 specific leadership practices. These practices, as a whole,…

  1. Integrative Mixed Methods Data Analytic Strategies in Research on School Success in Challenging Circumstances

    ERIC Educational Resources Information Center

    Jang, Eunice E.; McDougall, Douglas E.; Pollon, Dawn; Herbert, Monique; Russell, Pia

    2008-01-01

    There are both conceptual and practical challenges in dealing with data from mixed methods research studies. There is a need for discussion about various integrative strategies for mixed methods data analyses. This article illustrates integrative analytic strategies for a mixed methods study focusing on improving urban schools facing challenging…

  2. Sharing Data and Analytical Resources Securely in a Biomedical Research Grid Environment

    PubMed Central

    Langella, Stephen; Hastings, Shannon; Oster, Scott; Pan, Tony; Sharma, Ashish; Permar, Justin; Ervin, David; Cambazoglu, B. Barla; Kurc, Tahsin; Saltz, Joel

    2008-01-01

    Objectives To develop a security infrastructure to support controlled and secure access to data and analytical resources in a biomedical research Grid environment, while facilitating resource sharing among collaborators. Design A Grid security infrastructure, called Grid Authentication and Authorization with Reliably Distributed Services (GAARDS), is developed as a key architecture component of the NCI-funded cancer Biomedical Informatics Grid (caBIG™). GAARDS is designed to support, in a distributed environment: 1) efficient provisioning and federation of user identities and credentials; 2) group-based access control, with which resource providers can enforce policies based on community-accepted groups and local groups; and 3) management of a trust fabric so that policies can be enforced based on required levels of assurance. Measurements GAARDS is implemented as a suite of Grid services and administrative tools. It provides three core services: Dorian for management and federation of user identities, Grid Trust Service for maintaining and provisioning a federated trust fabric within the Grid environment, and Grid Grouper for enforcing authorization policies based on both local and Grid-level groups. Results The GAARDS infrastructure is available as a stand-alone system and as a component of the caGrid infrastructure. More information about GAARDS can be accessed at http://www.cagrid.org. Conclusions GAARDS provides a comprehensive system to address the security challenges of environments in which resources may be located at different sites, requests to access the resources may cross institutional boundaries, and user credentials are created, managed, and revoked dynamically in a decentralized manner. PMID:18308979
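
    The group-based authorization described above can be sketched minimally, with hypothetical group and resource names; the real GAARDS services add federated identities (Dorian) and a managed trust fabric on top of a check like this.

      # Group membership, e.g. as maintained by a service like Grid Grouper.
      GROUPS = {
          "cabig:image-analysts": {"alice", "bob"},
          "local:admins": {"carol"},
      }

      # Resource policy: which groups may read each resource (hypothetical).
      POLICY = {
          "/datasets/tumor-images": {"cabig:image-analysts", "local:admins"},
      }

      def authorized(user: str, resource: str) -> bool:
          """True if the user belongs to any group the policy allows."""
          allowed = POLICY.get(resource, set())
          return any(user in GROUPS.get(g, set()) for g in allowed)

      assert authorized("alice", "/datasets/tumor-images")
      assert not authorized("mallory", "/datasets/tumor-images")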

  3. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for the homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. Processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. The results showed the suitability of ultrasound velocimetry for monitoring the homogenization process. The obtained data were fitted using a novel heuristic model, making it possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess the homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for the development and production of pharmaceutical suspensions.
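
    Extracting a characteristic homogenization time can be sketched by fitting a generic exponential approach to a plateau; the paper's own heuristic model is not specified in the abstract, so the functional form and the data below are assumptions for illustration.

      import numpy as np
      from scipy.optimize import curve_fit

      def sound_speed(t, v_inf, dv, tau):
          """Exponential relaxation toward a plateau; tau is the
          characteristic homogenization time (assumed form)."""
          return v_inf + dv * np.exp(-t / tau)

      # Synthetic ultrasound-velocity readings during homogenization.
      t = np.linspace(0, 60, 13)  # minutes
      rng = np.random.default_rng(1)
      v = 1510 + 12 * np.exp(-t / 14) + rng.normal(0, 0.3, t.size)

      popt, _ = curve_fit(sound_speed, t, v, p0=[1500.0, 10.0, 10.0])
      print("characteristic time: %.1f min" % popt[2])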

  4. SlicerAstro: A 3-D interactive visual analytics tool for HI data

    NASA Astrophysics Data System (ADS)

    Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Fillion-Robin, J. C.; Yu, L.

    2017-04-01

    SKA precursors are capable of detecting hundreds of galaxies in HI in a single 12 h pointing. In deeper surveys one will probe more easily faint HI structures, typically located in the vicinity of galaxies, such as tails, filaments, and extraplanar gas. The importance of interactive visualization in data exploration has been demonstrated by the wide use of tools (e.g. Karma, Casaviewer, VISIONS) that help users to receive immediate feedback when manipulating the data. We have developed SlicerAstro, a 3-D interactive viewer with new analysis capabilities, based on traditional 2-D input/output hardware. These capabilities enhance the data inspection, allowing faster analysis of complex sources than with traditional tools. SlicerAstro is an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing. We demonstrate the capabilities of the current stable binary release of SlicerAstro, which offers the following features: (i) handling of FITS files and astronomical coordinate systems; (ii) coupled 2-D/3-D visualization; (iii) interactive filtering; (iv) interactive 3-D masking; (v) and interactive 3-D modeling. In addition, SlicerAstro has been designed with a strong, stable and modular C++ core, and its classes are also accessible via Python scripting, allowing great flexibility for user-customized visualization and analysis tasks.

  5. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    NASA Astrophysics Data System (ADS)

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-09-01

    Experimental research has become complex and thus a challenge to science education. Typically, only very few students can be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independently of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with the procedures and concepts of a modern research field. Evaluating the learning outcomes with semi-structured interviews in a pre/post design, we find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX). This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  6. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    PubMed Central

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Typically, only very few students can be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independently of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with the procedures and concepts of a modern research field. Evaluating the learning outcomes with semi-structured interviews in a pre/post design, we find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX). This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  7. Simulated Interactive Research Experiments as Educational Tools for Advanced Science.

    PubMed

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M; Hopf, Martin; Arndt, Markus

    2015-09-15

    Experimental research has become complex and thus a challenge to science education. Typically, only very few students can be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independently of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with the procedures and concepts of a modern research field. Evaluating the learning outcomes with semi-structured interviews in a pre/post design, we find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX). This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  8. [EpiInfo as a research and teaching tool in epidemiology and statistics: strengths and weaknesses].

    PubMed

    Mannocci, Alice; Bontempi, Claudio; Giraldi, Guglielmo; Chiaradia, Giacomina; de Waure, Chiara; Sferrazza, Antonella; Ricciardi, Walter; Boccia, Antonio; La Torre, Giuseppe

    2012-01-01

    EpiInfo is free software developed in 1988 by the Centers for Disease Control and Prevention (CDC) in Atlanta to facilitate field epidemiological investigations and statistical analysis. The aim of this study was to assess whether the software represents, in the Italian biomedical field, an effective analytical research tool and a practical and simple teaching tool for epidemiology and biostatistics. A questionnaire consisting of 20 multiple-choice and open questions was administered to 300 healthcare workers, including doctors, biologists, nurses, medical students and interns, at the end of a CME course in epidemiology and biostatistics. Sixty-four percent of participants were aged between 26 and 45 years, 52% were women and 73% were unmarried. Results show that women are more likely than men to use EpiInfo in their research activities (p = 0.023), as are individuals aged 26-45 years compared with the older and younger age groups (p = 0.023) and unmarried participants compared with married ones (p = 0.010). Thirty-one percent of respondents consider EpiInfo to be more than adequate for the analysis of their research data and 52% consider it sufficiently so. The inclusion of an EpiInfo course in statistics and epidemiology modules facilitates the understanding of theoretical concepts and allows researchers to perform some clinical/epidemiological research activities more easily.
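
    The associations reported above (e.g., p = 0.023 for sex and EpiInfo use) are standard contingency-table tests of the kind EpiInfo itself performs. A minimal sketch, with counts invented for illustration rather than taken from the study:

    ```python
    # Sketch: chi-square test of association between sex and EpiInfo use,
    # the kind of analysis reported in the abstract. The counts are
    # invented for illustration and do not come from the study.
    from scipy.stats import chi2_contingency

    #              uses EpiInfo   does not
    table = [[70, 86],            # women
             [50, 94]]            # men

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
    ```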

  9. Recognizing ancient papyri by a combination of spectroscopic, diffractional and chromatographic analytical tools

    PubMed Central

    Łojewska, J.; Rabin, I.; Pawcenis, D.; Bagniuk, J.; Aksamit-Koperska, M. A.; Sitarz, M.; Missori, M.; Krutzsch, M.

    2017-01-01

    Ancient papyri are a written heritage of a culture that flourished more than 3000 years ago in Egypt. One of the most significant collections in the world is housed in the Egyptian Museum and Papyrus Collection in Berlin, from where the samples for our investigation come. The papyrologists, curators and conservators of such collections search intensely for the analytical detail that would allow ancient papyri to be distinguished from modern fabrications, in order to detect possible forgeries, assess the state of papyrus deterioration, and improve the design of storage conditions and conservation methods. This has become the aim of our investigation. The samples were studied by a number of methods, including spectroscopic (FTIR, fluorescence (FS), Raman), diffractional (XRD) and chromatographic (size exclusion chromatography, SEC) techniques, selected in order to determine degradation parameters: overall oxidation of the lignocellulosic material, degree of polymerization and crystallinity of cellulose. The results were correlated with those obtained from carefully selected model samples, including modern papyri and paper of different composition aged at elevated temperature in humid air. The methods were classified in the order SEC > FS > FTIR > XRD, based on their effectiveness in discriminating the state of papyri degradation. However, the most trustworthy evaluation of the age of papyri samples should rely on several methods. PMID:28382971

  10. Tools for the Quantitative Analysis of Sedimentation Boundaries Detected by Fluorescence Optical Analytical Ultracentrifugation

    PubMed Central

    Zhao, Huaying; Casillas, Ernesto; Shroff, Hari; Patterson, George H.; Schuck, Peter

    2013-01-01

    Fluorescence optical detection in sedimentation velocity analytical ultracentrifugation allows the study of macromolecules at nanomolar concentrations and below. This has significant promise, for example, for the study of systems of high-affinity protein interactions. Here we describe adaptations of the direct boundary modeling analysis approach implemented in the software SEDFIT that were developed to accommodate unique characteristics of the confocal fluorescence detection system. These include spatial gradients of signal intensity due to scanner movements out of the plane of rotation, temporal intensity drifts due to instability of the laser and fluorophores, and masking of the finite excitation and detection cone by the sample holder. In an extensive series of experiments with enhanced green fluorescent protein ranging from low nanomolar to low micromolar concentrations, we show that the experimental data provide sufficient information to determine the parameters required for first-order approximation of the impact of these effects on the recorded data. Systematic deviations of fluorescence optical sedimentation velocity data analyzed using conventional sedimentation models developed for absorbance and interference optics are largely removed after these adaptations, resulting in excellent fits that highlight the high precision of fluorescence sedimentation velocity data, thus allowing a more detailed quantitative interpretation of the signal boundaries that is otherwise not possible for this system. PMID:24204779
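
    As an illustration of the kind of first-order corrections described, the sketch below applies an assumed linear spatial-gradient term and an exponential temporal-drift term to a raw signal; both functional forms and all parameter values are invented for illustration and are not SEDFIT's actual implementation.

    ```python
    # Sketch of first-order corrections for fluorescence-detected
    # sedimentation data: a linear intensity gradient across radius and a
    # slow exponential drift in time. Forms and parameters are illustrative
    # assumptions, not SEDFIT's actual model.
    import numpy as np

    def correct_signal(signal, r, t, beta=0.05, r0=6.5, k_drift=1e-4):
        spatial = 1.0 + beta * (r - r0)      # assumed gradient over radius r (cm)
        temporal = np.exp(-k_drift * t)      # assumed drift over time t (s)
        return signal / (spatial * temporal)

    r = np.linspace(6.0, 7.2, 5)
    raw = np.array([0.98, 1.01, 1.05, 1.08, 1.12])
    print(correct_signal(raw, r, t=3600.0))
    ```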

  11. Molecularly imprinted polymers: an analytical tool for the determination of benzimidazole compounds in water samples.

    PubMed

    Cacho, Carmen; Turiel, Esther; Pérez-Conde, Concepción

    2009-05-15

    Molecularly imprinted polymers (MIPs) for benzimidazole compounds have been synthesized by precipitation polymerization using thiabendazole (TBZ) as template, methacrylic acid as functional monomer, ethyleneglycol dimethacrylate (EDMA) and divinylbenzene (DVB) as cross-linkers and a mixture of acetonitrile and toluene as porogen. The experiments carried out by molecularly imprinted solid phase extraction (MISPE) in cartridges demonstrated the imprint effect in both imprinted polymers. MIP-DVB enabled a much higher breakthrough volume than MIP-EDMA, and thus was selected for further experiments. The ability of this MIP for the selective recognition of other benzimidazole compounds (albendazole, benomyl, carbendazim, fenbendazole, flubendazole and fuberidazole) was evaluated. The obtained results revealed the high selectivity of the imprinted polymer towards all the selected benzimidazole compounds. An off-line analytical methodology based on a MISPE procedure has been developed for the determination of benzimidazole compounds in tap, river and well water samples at concentration levels below the legislated maximum concentration levels (MCLs) with quantitative recoveries. Additionally, an on-line preconcentration procedure based on the use of a molecularly imprinted polymer as selective stationary phase in HPLC is proposed as a fast screening method for the evaluation of the presence of benzimidazole compounds in water samples.

  12. Magnetic optical sensor particles: a flexible analytical tool for microfluidic devices.

    PubMed

    Ungerböck, Birgit; Fellinger, Siegfried; Sulzer, Philipp; Abel, Tobias; Mayr, Torsten

    2014-05-21

    In this study we evaluate magnetic optical sensor particles (MOSePs) with incorporated sensing functionalities regarding their applicability in microfluidic devices. MOSePs can be separated from the surrounding solution to form in situ sensor spots within microfluidic channels, while read-out is accomplished outside the chip. These magnetic sensor spots exhibit the benefits of sensor layers (high brightness and convenient usage) combined with the advantages of dispersed sensor particles (ease of integration). The accumulation characteristics of MOSePs with different diameters were investigated, as well as the in situ sensor spot stability at varying flow rates. Magnetic sensor spots were stable at flow rates specific to microfluidic applications. Furthermore, MOSePs were optimized for fiber optic and imaging read-out systems, and different referencing schemes were critically discussed using the example of oxygen sensors. While the fiber optic sensing system delivered precise and accurate results for measurement in microfluidic channels, limitations due to analyte consumption were found for microscopic oxygen imaging. A compensation strategy is provided, which utilizes simple pre-conditioning by exposure to light. Finally, new application possibilities enabled by the use of MOSePs were addressed: they can be used for microscopic oxygen imaging in any chip with optically transparent covers, can serve as flexible sensor spots to monitor enzymatic activity, or can be applied to form fixed sensor spots inside microfluidic structures that would be inaccessible to the integration of sensor layers.
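
    For context, luminescence-based oxygen sensors of this kind are commonly read out through the Stern-Volmer relation linking quenching to oxygen concentration; this is textbook background rather than an equation quoted from the paper.

    ```latex
    % Stern-Volmer relation commonly used for luminescence-based oxygen
    % sensing: I_0 and tau_0 are the intensity and lifetime in the absence
    % of oxygen; K_SV is the quenching constant.
    \[
      \frac{I_0}{I} = \frac{\tau_0}{\tau} = 1 + K_{SV}\,[\mathrm{O_2}]
    \]
    ```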

  13. Analytical tools employed to determine pharmaceutical compounds in wastewaters after application of advanced oxidation processes.

    PubMed

    Afonso-Olivares, Cristina; Montesdeoca-Esponda, Sarah; Sosa-Ferrera, Zoraida; Santana-Rodríguez, José Juan

    2016-12-01

    Today, the presence of contaminants in the environment is a topic of interest for society in general and for the scientific community in particular. A very large amount of different chemical substances reaches the environment after passing through wastewater treatment plants without being eliminated. This is due to the inefficiency of conventional removal processes and the lack of government regulations. The list of compounds entering treatment plants is gradually becoming longer and more varied because most of these compounds come from pharmaceuticals, hormones or personal care products, which are increasingly used by modern society. As a result of this increase in compound variety, to address these emerging pollutants, the development of new and more efficient removal technologies is needed. Different advanced oxidation processes (AOPs), especially photochemical AOPs, have been proposed as supplements to traditional treatments for the elimination of pollutants, showing significant advantages over the use of conventional methods alone. This work aims to review the analytical methodologies employed for the analysis of pharmaceutical compounds from wastewater in studies in which advanced oxidation processes are applied. Due to the low concentrations of these substances in wastewater, mass spectrometry detectors are usually chosen to meet the low detection limits and identification power required. Specifically, time-of-flight detectors are required to analyse the by-products.

  14. Recognizing ancient papyri by a combination of spectroscopic, diffractional and chromatographic analytical tools.

    PubMed

    Łojewska, J; Rabin, I; Pawcenis, D; Bagniuk, J; Aksamit-Koperska, M A; Sitarz, M; Missori, M; Krutzsch, M

    2017-04-06

    Ancient papyri are a written heritage of a culture that flourished more than 3000 years ago in Egypt. One of the most significant collections in the world is housed in the Egyptian Museum and Papyrus Collection in Berlin, from where the samples for our investigation come. The papyrologists, curators and conservators of such collections search intensely for the analytical detail that would allow ancient papyri to be distinguished from modern fabrications, in order to detect possible forgeries, assess the state of papyrus deterioration, and improve the design of storage conditions and conservation methods. This has become the aim of our investigation. The samples were studied by a number of methods, including spectroscopic (FTIR, fluorescence (FS), Raman), diffractional (XRD) and chromatographic (size exclusion chromatography, SEC) techniques, selected in order to determine degradation parameters: overall oxidation of the lignocellulosic material, degree of polymerization and crystallinity of cellulose. The results were correlated with those obtained from carefully selected model samples, including modern papyri and paper of different composition aged at elevated temperature in humid air. The methods were classified in the order SEC > FS > FTIR > XRD, based on their effectiveness in discriminating the state of papyri degradation. However, the most trustworthy evaluation of the age of papyri samples should rely on several methods.

  15. Recognizing ancient papyri by a combination of spectroscopic, diffractional and chromatographic analytical tools

    NASA Astrophysics Data System (ADS)

    Łojewska, J.; Rabin, I.; Pawcenis, D.; Bagniuk, J.; Aksamit-Koperska, M. A.; Sitarz, M.; Missori, M.; Krutzsch, M.

    2017-04-01

    Ancient papyri are a written heritage of a culture that flourished more than 3000 years ago in Egypt. One of the most significant collections in the world is housed in the Egyptian Museum and Papyrus Collection in Berlin, from where the samples for our investigation come. The papyrologists, curators and conservators of such collections search intensely for the analytical detail that would allow ancient papyri to be distinguished from modern fabrications, in order to detect possible forgeries, assess the state of papyrus deterioration, and improve the design of storage conditions and conservation methods. This has become the aim of our investigation. The samples were studied by a number of methods, including spectroscopic (FTIR, fluorescence (FS), Raman), diffractional (XRD) and chromatographic (size exclusion chromatography, SEC) techniques, selected in order to determine degradation parameters: overall oxidation of the lignocellulosic material, degree of polymerization and crystallinity of cellulose. The results were correlated with those obtained from carefully selected model samples, including modern papyri and paper of different composition aged at elevated temperature in humid air. The methods were classified in the order SEC > FS > FTIR > XRD, based on their effectiveness in discriminating the state of papyri degradation. However, the most trustworthy evaluation of the age of papyri samples should rely on several methods.

  16. Comparison of analytical tools and biological assays for detection of paralytic shellfish poisoning toxins.

    PubMed

    Humpage, A R; Magalhaes, V F; Froscio, S M

    2010-07-01

    The paralytic shellfish poisoning toxins (PSTs) were, as their name suggests, discovered as a result of human poisoning after consumption of contaminated shellfish. More recently, however, the same toxins have been found to be produced by freshwater cyanobacteria. These organisms have worldwide distribution and are common in our sources of drinking water, thus presenting another route of potential human exposure. However, the regulatory limits for PSTs in drinking water are considerably lower than in shellfish. This has increased the need to find alternatives to the mouse bioassay, which, apart from being ethically questionable, does not have a limit of detection capable of detecting the PSTs in water at the regulated concentrations. Additionally, the number of naturally occurring PSTs has grown substantially since saxitoxin was first characterised, markedly increasing the analytical challenge of this group of compounds. This paper summarises the development of chromatographic, toxicity, and molecular sensor binding methodologies for detection of the PSTs in shellfish, cyanobacteria, and water contaminated by these toxins. It then summarises the advantages and disadvantages of their use for particular applications. Finally it recommends some future requirements that will contribute to their improvement for these applications.

  17. Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning

    ERIC Educational Resources Information Center

    Kelly, Nick; Thompson, Kate; Yeoman, Pippa

    2015-01-01

    This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…

  18. Ethics: the risk management tool in clinical research.

    PubMed

    Wadlund, Jill; Platt, Leslie A

    2002-01-01

    Scientific discovery and knowledge expansion in the post-genome era hold great promise for new medical technologies and cellular-based therapies with multiple applications that will save and enhance lives. While human beings have long hoped to unlock the mysteries of the molecular basis of life, our society is now on the verge of doing so. But new scientific and technological breakthroughs often come with some risks attached. Research--especially clinical trials and research involving human participants--must be conducted in accordance with the highest ethical and scientific principles. Yet, as the number and complexity of clinical trials increase, so do pressures for new revenue sources and shorter product development cycles, which could have an adverse impact on patient safety. This article explores the use of risk management tools in clinical research.

  19. Puzzle test: A tool for non-analytical clinical reasoning assessment

    PubMed Central

    Monajemi, Alireza; Yaghmaei, Minoo

    2016-01-01

    Most contemporary clinical reasoning tests typically assess non-automatic thinking. Therefore, a test is needed to measure automatic reasoning or pattern recognition, which has been largely neglected in clinical reasoning tests. The Puzzle Test (PT) is designed to assess automatic clinical reasoning in routine situations. This test was first introduced in 2009 by Monajemi et al in the Olympiad for Medical Sciences Students. PT is an item format that has gained acceptance in medical education, but no detailed guidelines exist for this test’s format, construction and scoring. In this article, a format is described and the steps to prepare and administer valid and reliable PTs are presented. PT examines a specific clinical reasoning task: pattern recognition. PT does not replace other clinical reasoning assessment tools. However, it complements them in strategies for assessing comprehensive clinical reasoning. PMID:28210603

  20. Puzzle test: A tool for non-analytical clinical reasoning assessment.

    PubMed

    Monajemi, Alireza; Yaghmaei, Minoo

    2016-01-01

    Most contemporary clinical reasoning tests typically assess non-automatic thinking. Therefore, a test is needed to measure automatic reasoning or pattern recognition, which has been largely neglected in clinical reasoning tests. The Puzzle Test (PT) is designed to assess automatic clinical reasoning in routine situations. This test was first introduced in 2009 by Monajemi et al in the Olympiad for Medical Sciences Students. PT is an item format that has gained acceptance in medical education, but no detailed guidelines exist for this test's format, construction and scoring. In this article, a format is described and the steps to prepare and administer valid and reliable PTs are presented. PT examines a specific clinical reasoning task: pattern recognition. PT does not replace other clinical reasoning assessment tools. However, it complements them in strategies for assessing comprehensive clinical reasoning.

  1. Scientific research tools as an aid to Antarctic logistics

    NASA Astrophysics Data System (ADS)

    Dinn, Michael; Rose, Mike; Smith, Andrew; Fleming, Andrew; Garrod, Simon

    2013-04-01

    Logistics have always been a vital part of polar exploration and research. The more efficient those logistics can be made, the greater the likelihood that research programmes will be delivered on time, safely and to maximum scientific effectiveness. Over the last decade, the potential for symbiosis between logistics and some of the scientific research methods themselves has increased remarkably; suites of scientific tools can help to optimise logistic efforts, thereby enhancing the effectiveness of further scientific activity. We present one recent example of input to logistics from scientific activities, in support of the NERC iSTAR Programme, a major ice sheet research effort in West Antarctica. We used data output from a number of research tools, spanning a range of techniques and international agencies, to support the deployment of a tractor-traverse system into a remote area of mainland Antarctica. The tractor system was deployed from RRS Ernest Shackleton onto the Abbot Ice Shelf and then driven inland to the research area on Pine Island Glacier. Data from NASA ICEBRIDGE were used to determine the ice-front freeboard and surface gradients for the traverse route off the ice shelf and onwards into the continent. Quickbird high-resolution satellite imagery provided clear images of the route track and some insight into snow surface roughness. Polarview satellite data gave sea ice information for the Amundsen Sea, both multi-annual historical characteristics and real-time information during deployment. Likewise, meteorological data contributed historical information and was used during deployment. Finally, during the tractors' inland journey, ground-based high-frequency radar was used to determine a safe, crevasse-free route.

  2. The Research Tools of the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Hanisch, Robert J.; Berriman, G. B.; Lazio, T. J.; Project, VAO

    2013-01-01

    Astronomy is being transformed by the vast quantities of data, models, and simulations that are becoming available to astronomers at an ever-accelerating rate. The U.S. Virtual Astronomical Observatory (VAO) has been funded to provide an operational facility that is intended to be a resource for discovery and access of data, and to provide science services that use these data. Over the course of the past year, the VAO has been developing and releasing for community use five science tools: 1) "Iris", for dynamically building and analyzing spectral energy distributions; 2) a web-based data discovery tool that allows astronomers to identify and retrieve catalog, image, and spectral data on sources of interest; 3) a scalable cross-comparison service that allows astronomers to conduct pair-wise positional matches between very large catalogs stored remotely as well as between remote and local catalogs; 4) time series tools that allow astronomers to compute periodograms of the public data held at the NASA Star and Exoplanet Database (NStED) and the Harvard Time Series Center; and 5) a VO-aware release of the Image Reduction and Analysis Facility (IRAF) that provides transparent access to VO-available data collections and is SAMP-enabled, so that IRAF users can easily use tools such as Aladin and Topcat in conjunction with IRAF tasks. Additional VAO services will be built to make it easy for researchers to provide access to their data in VO-compliant ways, to build VO-enabled custom applications in Python, and to respond generally to the growing size and complexity of astronomy data. Acknowledgements: The Virtual Astronomical Observatory (VAO) is managed by the VAO, LLC, a non-profit company established as a partnership of the Associated Universities, Inc. and the Association of Universities for Research in Astronomy, Inc. The VAO is sponsored by the National Science Foundation and the National Aeronautics and Space Administration.
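
    The periodogram service (tool 4) computes the kind of quantity one can reproduce locally with astropy's Lomb-Scargle implementation; the irregularly sampled time series below is synthetic.

    ```python
    # Sketch: a Lomb-Scargle periodogram computed locally, analogous to the
    # VAO time-series service. The data are synthetic and irregularly sampled.
    import numpy as np
    from astropy.timeseries import LombScargle

    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0, 100, 200))                     # days, irregular
    y = np.sin(2 * np.pi * t / 7.3) + 0.3 * rng.normal(size=t.size)

    frequency, power = LombScargle(t, y).autopower()
    print(f"best period: {1 / frequency[np.argmax(power)]:.2f} days")
    ```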

  3. New Tools for New Research in Psychiatry: A Scalable and Customizable Platform to Empower Data Driven Smartphone Research

    PubMed Central

    Torous, John; Kiang, Mathew V; Lorme, Jeanette

    2016-01-01

    Background A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-quality data. Objective Our aim is not the creation of yet another app per se but rather the establishment of a platform to collect research-quality smartphone raw sensor and usage pattern data. Our ultimate goal is to develop statistical, mathematical, and computational methodology to enable us and others to extract biomedical and clinical insights from smartphone data. Methods We report on the development and early testing of Beiwe, a research platform featuring a study portal, smartphone app, database, and data modeling and analysis tools designed and developed specifically for transparent, customizable, and reproducible biomedical research use, in particular for the study of psychiatric and neurological disorders. We also outline a proposed study using the platform for patients with schizophrenia. Results We demonstrate the passive data capabilities of the Beiwe platform and early results of its analytical capabilities. Conclusions Smartphone sensors and phone usage patterns, when coupled with appropriate statistical learning tools, are able to capture various social and behavioral manifestations of illnesses, in naturalistic settings, as lived and experienced by patients. The ubiquity of smartphones makes this type of moment-by-moment quantification of disease phenotypes highly scalable and, when integrated within a transparent research platform, presents tremendous opportunities for research, discovery, and patient health. PMID:27150677

  4. New Tools for New Research in Psychiatry: A Scalable and Customizable Platform to Empower Data Driven Smartphone Research.

    PubMed

    Torous, John; Kiang, Mathew V; Lorme, Jeanette; Onnela, Jukka-Pekka

    2016-05-05

    A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-quality data. Our aim is not the creation of yet another app per se but rather the establishment of a platform to collect research-quality smartphone raw sensor and usage pattern data. Our ultimate goal is to develop statistical, mathematical, and computational methodology to enable us and others to extract biomedical and clinical insights from smartphone data. We report on the development and early testing of Beiwe, a research platform featuring a study portal, smartphone app, database, and data modeling and analysis tools designed and developed specifically for transparent, customizable, and reproducible biomedical research use, in particular for the study of psychiatric and neurological disorders. We also outline a proposed study using the platform for patients with schizophrenia. We demonstrate the passive data capabilities of the Beiwe platform and early results of its analytical capabilities. Smartphone sensors and phone usage patterns, when coupled with appropriate statistical learning tools, are able to capture various social and behavioral manifestations of illnesses, in naturalistic settings, as lived and experienced by patients. The ubiquity of smartphones makes this type of moment-by-moment quantification of disease phenotypes highly scalable and, when integrated within a transparent research platform, presents tremendous opportunities for research, discovery, and patient health.

  5. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    NASA Astrophysics Data System (ADS)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ2 with respect to α , and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
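
    For orientation, the quantity minimized in maximum-entropy analytic continuation, and the entropy weight α discussed in point (2), take the standard form below, with D(ω) a default model; this is the textbook formulation rather than an equation quoted from the paper.

    ```latex
    % Standard maximum-entropy objective for analytic continuation:
    % minimize Q over spectra A(w) at a given entropy weight alpha,
    % where D(w) is the default model.
    \[
      Q[A] = \frac{\chi^2[A]}{2} - \alpha S[A], \qquad
      S[A] = \int d\omega \left[ A(\omega) - D(\omega)
             - A(\omega)\ln\frac{A(\omega)}{D(\omega)} \right]
    \]
    ```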

  6. Big Data & Learning Analytics: A Potential Way to Optimize eLearning Technological Tools

    ERIC Educational Resources Information Center

    García, Olga Arranz; Secades, Vidal Alonso

    2013-01-01

    In the information age, one of the most influential institutions is education. The recent emergence of MOOCS [Massively Open Online Courses] is a sample of the new expectations that are offered to university students. Basing decisions on data and evidence seems obvious, and indeed, research indicates that data-driven decision-making improves…

  7. Critical Race Theory and Interest Convergence as Analytic Tools in Teacher Education Policies and Practices

    ERIC Educational Resources Information Center

    Milner, H. Richard, IV

    2008-01-01

    In "The Report of the AERA Panel on Research and Teacher Education," Cochran-Smith and Zeichner's (2005) review of studies in the field of teacher education revealed that many studies lacked theoretical and conceptual grounding. The author argues that Derrick Bell's (1980) interest convergence, a principle of critical race theory, can be used as…

  8. Designing and implementing full immersion simulation as a research tool.

    PubMed

    Munroe, Belinda; Buckley, Thomas; Curtis, Kate; Morris, Richard

    2016-05-01

    Simulation is a valuable research tool used to evaluate the clinical performance of devices, people and systems. The simulated setting may address concerns unique to complex clinical environments such as the Emergency Department, which make the conduct of research challenging. There is limited evidence available to inform the development of simulated clinical scenarios for the purpose of evaluating practice in research studies, with the majority of literature focused on designing simulated clinical scenarios for education and training. Distinct differences exist in scenario design when implemented in education compared with use in clinical research studies. Simulated scenarios used to assess practice in clinical research must not comprise any purposeful or planned teaching and must be developed with a high degree of validity and reliability. A new scenario design template was devised to develop two standardised simulated clinical scenarios for the evaluation of a new assessment framework for emergency nurses. The scenario development and validation processes undertaken are described and provide an evidence-informed guide to scenario development for future clinical research studies.

  9. Game analytics for game user research, part 1: a workshop review and case study.

    PubMed

    El-Nasr, Magy Seif; Desurvire, Heather; Aghabeigi, Bardia; Drachen, Anders

    2013-01-01

    The emerging field of game user research (GUR) investigates interaction between players and games and the surrounding context of play. Game user researchers have explored methods from, for example, human-computer interaction, psychology, interaction design, media studies, and the social sciences. They've extended and modified these methods for different types of digital games, such as social games, casual games, and serious games. This article focuses on quantitative analytics of in-game behavioral user data and its emergent use by the GUR community. The article outlines open problems emerging from several GUR workshops. In addition, a case study of a current collaboration between researchers and a game company demonstrates game analytics' use and benefits.

  10. Development of proteasome inhibitors as research tools and cancer drugs

    PubMed Central

    2012-01-01

    The proteasome is the primary site for protein degradation in mammalian cells, and proteasome inhibitors have been invaluable tools in clarifying its cellular functions. The anticancer agent bortezomib inhibits the major peptidase sites in the proteasome’s 20S core particle. It is a “blockbuster drug” that has led to dramatic improvements in the treatment of multiple myeloma, a cancer of plasma cells. The development of proteasome inhibitors illustrates the unpredictability, frustrations, and potential rewards of drug development but also emphasizes the dependence of medical advances on basic biological research. PMID:23148232

  11. Tissue fluid pressures - From basic research tools to clinical applications

    NASA Technical Reports Server (NTRS)

    Hargens, Alan R.; Akeson, Wayne H.; Mubarak, Scott J.; Owen, Charles A.; Gershuni, David H.

    1989-01-01

    This paper describes clinical applications of two basic research tools developed and refined in the past 20 years: the wick catheter (for measuring tissue fluid pressure) and the colloid osmometer (for measuring osmotic pressure). Applications of the osmometer include estimations of the reduced osmotic pressure of sickle-cell hemoglobin with deoxygenation, and of reduced swelling pressure of human nucleus pulposus with hydration or upon action of certain enzymes. Clinical uses of the wick-catheter technique include an improvement of diagnosis and treatment of acute and chronic compartment syndromes, the elucidation of the tissue pressure thresholds for neuromuscular dysfunction, and the development of a better tourniquet for orthopedics.

  12. Vaccinia Virus: A Tool for Research and Vaccine Development

    NASA Astrophysics Data System (ADS)

    Moss, Bernard

    1991-06-01

    Vaccinia virus is no longer needed for smallpox immunization, but now serves as a useful vector for expressing genes within the cytoplasm of eukaryotic cells. As a research tool, recombinant vaccinia viruses are used to synthesize biologically active proteins and analyze structure-function relations, determine the targets of humoral- and cell-mediated immunity, and investigate the immune responses needed for protection against specific infectious diseases. When more data on safety and efficacy are available, recombinant vaccinia and related poxviruses may be candidates for live vaccines and for cancer immunotherapy.

  13. Terahertz pulsed imaging, a novel process analytical tool to investigate the coating characteristics of push-pull osmotic systems.

    PubMed

    Malaterre, Vincent; Pedersen, Maireadh; Ogorka, Joerg; Gurny, Robert; Loggia, Nicoletta; Taday, Philip F

    2010-01-01

    The aim of this study was to investigate coating characteristics of push-pull osmotic systems (PPOS) using three-dimensional terahertz pulsed imaging (3D-TPI) and to detect physical alterations potentially impacting the drug release. The terahertz time-domain reflection signal was used to obtain information on both the spatial distribution of the coating thickness and the internal physical mapping of the coating. The results showed that (i) the thickness distribution of the PPOS coating can be non-destructively analysed using 3D-TPI and (ii) internal physical alterations impacting the drug release kinetics were detectable using the terahertz time-domain signal. Based on the results, the potential benefits of implementing 3D-TPI as a quality-control analytical tool were discussed.
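
    The coating-thickness information in the time-domain reflection signal comes from the delay between echoes at the coating interfaces; the standard relation, stated here as background rather than taken from the paper, is:

    ```latex
    % Coating thickness d from the time delay between terahertz pulses
    % reflected at the top and bottom interfaces of the coating.
    % c: speed of light in vacuum; Delta t: echo time delay;
    % n: refractive index of the coating at terahertz frequencies.
    \[
      d = \frac{c\,\Delta t}{2\,n}
    \]
    ```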

  14. Development of a fast analytical tool to identify oil spillages employing infrared spectral indexes and pattern recognition techniques.

    PubMed

    Fresco-Rivera, P; Fernández-Varela, R; Gómez-Carracedo, M P; Ramírez-Villalobos, F; Prada, D; Muniategui, S; Andrade, J M

    2007-11-30

    A fast analytical tool based on attenuated total reflectance mid-IR spectrometry is presented to evaluate the origin of spilled hydrocarbons and to monitor their fate in the environment. Ten spectral band ratios are employed in univariate and multivariate studies (principal components analysis, cluster analysis, density functions - potential curves - and Kohonen self-organizing maps). Two indexes monitor typical photooxidation processes, five are related to aromatic characteristics and three study aliphatic and branched chains. The case study considered here comprises 45 samples taken on beaches (from 2002 to 2005) after the Prestige tanker accident off the Galician coast and 104 samples corresponding to weathering studies deployed for the Prestige's fuel, four typical crude oils and a fuel oil. The univariate studies yield insightful views of the gross chemical evolution, whereas the multivariate studies allow simple and straightforward assessment of whether unknown samples match the Prestige's fuel. Besides, a good differentiation of the weathering patterns of light and heavy products is obtained.
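
    The multivariate part of such a workflow, PCA on a samples-by-band-ratio matrix, can be sketched as follows; the ratio values are invented and the autoscaling preprocessing is an assumption rather than the authors' stated choice.

    ```python
    # Sketch: PCA on a samples-by-band-ratio matrix, the kind of analysis
    # used to compare spilled products with the Prestige fuel. The data
    # and the autoscaling choice are illustrative assumptions.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # rows: samples; columns: 10 IR band ratios (invented values)
    X = np.abs(np.random.default_rng(1).normal(1.0, 0.2, size=(45, 10)))

    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
    print(scores[:3])   # PC1/PC2 coordinates used to inspect sample groupings
    ```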

  15. Exploring positioning as an analytical tool for understanding becoming mathematics teachers' identities

    NASA Astrophysics Data System (ADS)

    Skog, Kicki; Andersson, Annica

    2015-03-01

    The aim of this article is to explore how a sociopolitical analysis can contribute to a deeper understanding of critical aspects of becoming primary mathematics teachers' identities during teacher education. The question we ask is the following: How may power relations in university settings affect becoming mathematics teachers' subject positioning? We elaborate on the elusive and interrelated concepts of identity, positioning and power, seen as dynamic and changeable. As these concepts represent three interconnected parts of the research analysis in an on-going larger project, data from different sources will be used in this illustration. In this paper, we clarify the theoretical stance, ground the concepts historically and strive to connect them to research analysis. In this way, we show that power relations and subject positioning in social settings are critical aspects and need to be taken seriously into account if we aim at understanding becoming teachers' identities.

  16. Evaluation of Heat Flux Measurement as a New Process Analytical Technology Monitoring Tool in Freeze Drying.

    PubMed

    Vollrath, Ilona; Pauli, Victoria; Friess, Wolfgang; Freitag, Angelika; Hawe, Andrea; Winter, Gerhard

    2017-01-04

    This study investigates the suitability of heat flux measurement as a new technique for monitoring product temperature and critical end points during freeze drying. The heat flux sensor is tightly mounted on the shelf and measures non-invasively (no contact with the product) the heat transferred from shelf to vial. Heat flux data were compared to comparative pressure measurement, thermocouple readings, and Karl Fischer titration as current state-of-the-art monitoring techniques. The whole freeze drying process, including freezing (both by ramp freezing and controlled nucleation) and primary and secondary drying, was considered. We found that direct measurement of the transferred heat enables more insight into the thermodynamics of the freezing process. Furthermore, a vial heat transfer coefficient can be calculated from heat flux data, which ultimately provides a non-invasive method to monitor product temperature throughout primary drying. The end point of primary drying determined by heat flux measurements was in accordance with the one defined by thermocouples. During secondary drying, heat flux measurements could not indicate the progress of drying in the way that monitoring of the residual moisture content can. In conclusion, heat flux measurements are a promising new non-invasive tool for lyophilization process monitoring and development, using energy transfer as a control parameter.
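
    The non-invasive product-temperature estimate rests on the vial heat transfer coefficient; a common formulation, given here as a plausible reading of the method rather than the authors' exact equations, is:

    ```latex
    % Vial heat transfer coefficient K_v from the measured heat flow Q-dot
    % into a vial of outer cross-section A_v, and the resulting
    % non-invasive product temperature estimate during primary drying.
    % A common textbook formulation, not necessarily the authors' exact one.
    \[
      K_v = \frac{\dot{Q}}{A_v \left( T_{\mathrm{shelf}} - T_{\mathrm{product}} \right)}
      \quad\Longrightarrow\quad
      T_{\mathrm{product}} \approx T_{\mathrm{shelf}} - \frac{\dot{Q}}{K_v\,A_v}
    \]
    ```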

  17. HRMAS NMR spectroscopy combined with chemometrics as an alternative analytical tool to control cigarette authenticity.

    PubMed

    Shintu, Laetitia; Caldarelli, Stefano; Campredon, Mylène

    2013-11-01

    In this paper, we present for the first time the use of high-resolution magic angle spinning nuclear magnetic resonance (HRMAS NMR) spectroscopy combined with chemometrics as an alternative tool for the characterization of tobacco products from different commercial international brands as well as for the identification of counterfeits. Although cigarette filling is a very complex chemical mixture, we were able to discriminate between dark, bright, and additive-free cigarette blends belonging to six different filter-cigarette brands, commercially available, using an approach for which no extraction procedure is required. Second, we focused our study on a specific worldwide-distributed brand for which established counterfeits were available. We discriminated those from their genuine counterparts with 100% accuracy using unsupervised multivariate statistical analysis. The counterfeits that we analyzed showed a higher amount of nicotine and solanesol and a lower content of sugars, all endogenous tobacco leaf metabolites. This preliminary study demonstrates the great potential of HRMAS NMR spectroscopy to help in controlling cigarette authenticity.

  18. Demonstration of FBRM as process analytical technology tool for dewatering processes via CST correlation.

    PubMed

    Cobbledick, Jeffrey; Nguyen, Alexander; Latulippe, David R

    2014-07-01

    The current challenges associated with the design and operation of net-energy positive wastewater treatment plants demand sophisticated approaches for the monitoring of polymer-induced flocculation. In anaerobic digestion (AD) processes, the dewaterability of the sludge is typically assessed from off-line lab-bench tests - the capillary suction time (CST) test is one of the most common. Focused beam reflectance measurement (FBRM) is a promising technique for real-time monitoring of critical performance attributes in large-scale processes and is ideally suited for dewatering applications. The flocculation performance of twenty-four cationic polymers, which spanned a range of polymer size and charge properties, was measured using both the FBRM and CST tests. Analysis of the data revealed a monotonically decreasing trend: the samples that had the highest percent removal of particles less than 50 microns in size, as determined by FBRM, had the lowest CST values. A subset of the best-performing polymers was used to evaluate the effects of dosage amount and digestate sources on dewatering performance. The results from this work show that FBRM is a powerful tool that can be used for optimization and on-line monitoring of dewatering processes.
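
    A monotonic relationship of the kind reported is naturally quantified with a rank correlation; the sketch below uses invented numbers, not the study's data.

    ```python
    # Sketch: quantify the monotonic FBRM-vs-CST relationship with a
    # Spearman rank correlation. Values are invented for illustration.
    from scipy.stats import spearmanr

    percent_removed_lt50um = [82, 75, 68, 60, 55, 41, 33]   # FBRM metric
    cst_seconds            = [18, 22, 27, 35, 39, 55, 70]   # CST readings

    rho, p = spearmanr(percent_removed_lt50um, cst_seconds)
    print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")  # strongly negative
    ```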

  19. Visualising the past: potential applications of Geospatial tools to paleoclimate research

    NASA Astrophysics Data System (ADS)

    Cook, A.; Turney, C. S.

    2012-12-01

    Recent advances in geospatial data acquisition, analysis and web-based data sharing offer new possibilities for understanding and visualising past modes of change. The availability, accessibility and cost-effectiveness of data is better than ever. Researchers can access remotely sensed data including terrain models; use secondary data from large consolidated repositories; make more accurate field measurements and combine data from disparate sources to form a single asset. An increase in the quantity and consistency of data is coupled with subtle yet significant improvements to the way in which geospatial systems manage data interoperability, topological and textual integrity, resulting in more stable analytical and modelling environments. Essentially, researchers now have greater control and more confidence in analytical tools and outputs. Web-based data sharing is growing rapidly, enabling researchers to publish and consume data directly into their spatial systems through OGC-compliant Web Map Services (WMS), Web Feature Services (WFS) and Web Coverage Services (WCS). This has been implemented at institutional, organisational and project scale around the globe. Some institutions have gone one step further and established Spatial Data Infrastructures (SDI) based on Federated Data Structures where the participating data owners retain control over who has access to what. It is important that advances in knowledge are transferred to audiences outside the scientific community in a way that is interesting and meaningful. The visualisation of paleodata through multi-media offers significant opportunities to highlight the parallels and distinctions between past climate dynamics and the challenges of today and tomorrow. Here we present an assessment of key innovations that demonstrate how Geospatial tools can be applied to palaeo-research and used to communicate the results to a diverse array of audiences in the digital age.
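
    The OGC services mentioned (WMS, WFS, WCS) can be consumed directly from a scripting environment; a minimal sketch with OWSLib follows, in which the endpoint URL and layer name are placeholders rather than a real service.

    ```python
    # Sketch: consume an OGC Web Map Service from Python with OWSLib, the
    # kind of web-based data sharing described above. The endpoint URL and
    # layer name are placeholders, not a real service.
    from owslib.wms import WebMapService

    wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
    img = wms.getmap(layers=["paleo:ice_extent"], styles=[""],
                     srs="EPSG:4326", bbox=(-180, -90, 180, 90),
                     size=(800, 400), format="image/png")
    with open("ice_extent.png", "wb") as f:
        f.write(img.read())
    ```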

  20. Integrating research tools to support the management of social-ecological systems under climate change

    USGS Publications Warehouse

    Miller, Brian W.; Morisette, Jeffrey T.

    2014-01-01

    Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.

  1. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay.

    PubMed

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). Given the relevance of the results, the assay is expected to be of practical applicability.
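
    The core of the RGB analysis described, extracting per-channel values from a photographed test strip, can be sketched with Pillow and NumPy; the file name and crop region are illustrative placeholders.

    ```python
    # Sketch: mean R, G, B values of a photographed indoxylacetate test
    # strip, the kind of readout used in the smartphone BChE assay.
    # The file name and crop region are illustrative placeholders.
    import numpy as np
    from PIL import Image

    img = np.asarray(Image.open("strip_photo.jpg").convert("RGB"), dtype=float)
    spot = img[100:200, 100:200]             # crop the reaction zone (assumed)
    r, g, b = spot.reshape(-1, 3).mean(axis=0)
    print(f"mean channels  R: {r:.1f}  G: {g:.1f}  B: {b:.1f}")
    ```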

  2. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay

    PubMed Central

    Pohanka, Miroslav

    2015-01-01

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman’s assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone’s integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman’s assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). Given the relevance of the results, the assay is expected to be of practical applicability. PMID:26110404

  3. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory, since they allow sensitive measurements of pollutants to be carried out by means of easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or even none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful for both inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  4. Using circular questions as a tool in qualitative research.

    PubMed

    Evans, Nicola; Whitcombe, Steve

    2016-01-01

    Circular questions are used within systemic family therapy as a tool to generate multiple explanations and stories from a family situation and as a means to stimulate the curiosity of the therapist while avoiding the temptation to seek one definitive explanation. The aim was to consider the potential for using this approach in qualitative research, with researchers using carefully crafted questions to invite respondents to provide information about the meanings behind a phenomenon or to consider how relationships between people contribute to it. Drawing on examples from a study of children's mental health services, this paper discusses the application of the technique of circular questioning from systemic family therapy to qualitative research. The use of circular questions is a technique that qualitative researchers could employ in the field when conducting interviews with individuals or groups, or when engaged in participant observation, as a means to obtain rich sources of data. Circular questioning can help to promote curiosity in the researcher and invite responses that illuminate relational issues between participants in a study.

  5. Development of an analytical tool to study power quality of AC power systems for large spacecraft

    NASA Technical Reports Server (NTRS)

    Kraft, L. Alan; Kankam, M. David

    1991-01-01

    A harmonic power flow program applicable to space power systems with sources of harmonic distortion is described. The algorithm is a modification of the Electric Power Research Institute's HARMFLO program which assumes a three phase, balanced, AC system with loads of harmonic distortion. The modified power flow program can be used with single phase, AC systems. Early results indicate that the required modifications and the models developed are quite adequate for the analysis of a 20 kHz testbed built by General Dynamics Corporation. This is demonstrated by the acceptable correlation of present results with published data. Although the results are not exact, the discrepancies are relatively small.

  6. Development of an analytical tool to study power quality of ac power systems for large spacecraft

    NASA Technical Reports Server (NTRS)

    Kraft, L. A.; Kankam, M. D.

    1991-01-01

    A harmonic power flow program applicable to space power systems with sources of harmonic distortion is described. The algorithm is a modification of the Electric Power Research Institute's HARMFLO program which assumes a three-phase, balanced, ac system with loads of harmonic distortion. The modified power flow program can be used with single phase, ac systems. Early results indicate that the required modifications and the models developed are quite adequate for the analysis of a 20-kHz testbed built by General Dynamics Corporation. This is demonstrated by the acceptable correlation of the present results with published data. Although the results are not exact, the discrepancies are relatively small.
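
    The two records above describe a harmonic power flow code; as a much smaller illustration of the kind of power-quality figure such a study reports, the sketch below computes total harmonic distortion (THD) from a set of harmonic voltage amplitudes. The amplitudes are invented for the example; this is not HARMFLO.

    ```python
    import math

    def thd(fundamental, harmonics):
        """THD = RMS sum of the harmonic amplitudes divided by the fundamental amplitude."""
        return math.sqrt(sum(a * a for a in harmonics)) / fundamental

    V1 = 120.0            # fundamental amplitude (V), hypothetical
    Vh = [6.0, 3.5, 2.0]  # 3rd, 5th, 7th harmonic amplitudes (V), hypothetical
    print(f"THD = {100 * thd(V1, Vh):.2f} %")
    ```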

  7. Multicenter patient records research: security policies and tools.

    PubMed

    Behlen, F M; Johnson, S B

    1999-01-01

    The expanding health information infrastructure offers the promise of new medical knowledge drawn from patient records. Such promise will never be fulfilled, however, unless researchers first address policy issues regarding the rights and interests of both the patients and the institutions that hold their records. In this article, the authors analyze the interests of patients and institutions in light of public policy and institutional needs. They conclude that the multicenter study, with Institutional Review Board approval of each study at each site, protects the interests of both. "Anonymity" is no panacea, since patient records are so rich in information that they can never be truly anonymous. Researchers must earn and respect the trust of the public, as responsible stewards of facts about patients' lives. The authors find that computer security tools are needed to administer multicenter patient records studies and describe simple approaches that can be implemented using commercial database products.
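
    As a toy illustration of the "simple approaches" the authors mention, the sketch below scopes queries to the sites where a researcher is authorized, using sqlite3 in place of a commercial database; the table and column names are invented.

    ```python
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE records(site TEXT, patient_id TEXT, finding TEXT);
    CREATE TABLE authorizations(researcher TEXT, site TEXT);
    INSERT INTO records VALUES ('site_A', 'p1', 'x'), ('site_B', 'p2', 'y');
    INSERT INTO authorizations VALUES ('alice', 'site_A');
    """)

    def records_for(researcher):
        # The join restricts results to sites where this researcher's study
        # has been approved, so no query can reach unauthorized sites.
        return con.execute(
            "SELECT r.site, r.patient_id, r.finding FROM records r "
            "JOIN authorizations a ON a.site = r.site WHERE a.researcher = ?",
            (researcher,)).fetchall()

    print(records_for("alice"))  # returns only the site_A row
    ```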

  8. Basic statistical tools in research and data analysis.

    PubMed

    Ali, Zulfiqar; Bhaskar, S Bala

    2016-09-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations, and reporting the research findings. Statistical analysis gives meaning to meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article tries to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables, and the measures of central tendency. An idea of sample size estimation, power analysis and statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
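
    To make the parametric/non-parametric pairing concrete, the sketch below contrasts Student's t-test with its rank-based counterpart, the Mann-Whitney U test, on two invented samples (assuming NumPy and SciPy are available).

    ```python
    import numpy as np
    from scipy import stats

    a = np.array([5.1, 4.9, 5.6, 5.2, 4.8, 5.4])
    b = np.array([5.8, 6.1, 5.9, 6.3, 5.7, 6.0])

    # Measures of central tendency.
    print("means:", a.mean(), b.mean(), "| medians:", np.median(a), np.median(b))

    t, p_t = stats.ttest_ind(a, b)     # parametric: assumes normally distributed data
    u, p_u = stats.mannwhitneyu(a, b)  # non-parametric: rank-based alternative
    print(f"t-test p = {p_t:.4f}; Mann-Whitney p = {p_u:.4f}")
    ```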

  9. An evaluation tool for collaborative clinical research centers.

    PubMed

    Tragus, Robin; Cody, Jannine D

    2013-06-01

    There is a need for metrics that describe the full range of services provided by a clinical research unit, given that services have expanded to include such things as investigator training, regulatory compliance monitoring, and budget negotiations. We developed a tool and methodology that allow tracking of these expanded services. This not only allowed us to describe the work of the research unit staff more accurately, but also to monitor the status of a study across its entire lifespan, from the idea to the publication. In addition to measuring work, the tool allows us to anticipate future needs for clinical staff and expertise because we are involved very early in study planning. We also expect that by analyzing these data from many studies over time, we will identify process barriers that will direct future program improvement.

  10. Application of metabonomic analytical techniques in the modernization and toxicology research of traditional Chinese medicine

    PubMed Central

    Lao, Yong-Min; Jiang, Jian-Guo; Yan, Lu

    2009-01-01

    In recent years, a wide range of metabonomic analytical techniques has come into use in the modern research of traditional Chinese medicine (TCM). At the same time, the international community has attached increasing importance to TCM toxicity problems, and many studies have been implemented to investigate the toxicity mechanisms of TCM. Among these, many metabonomic-based methods have been used to facilitate TCM toxicity investigation. At present, the most prevalent methods for TCM toxicity research are single-analysis techniques using only one analytical means, such as nuclear magnetic resonance (NMR), gas chromatography-mass spectrometry (GC-MS), and liquid chromatography-mass spectrometry (LC-MS); with these techniques, some favourable outcomes have been obtained in studies of toxic reactions to TCM, such as identifying the target organs of action, establishing the pattern of action, elucidating the mechanism of action and exploring the material foundation of action. However, every analytical technique has its advantages and drawbacks, and no single technique is universally applicable. Multi-technique approaches can partially overcome the shortcomings of single techniques: combining GC-MS and LC-MS metabolic profiling has unravelled the pathological outcomes of aristolochic acid-induced nephrotoxicity, which could not be achieved with a single technique. It is believed that with the further development of metabonomic analytical techniques, especially multi-technique approaches, metabonomics will greatly promote TCM toxicity research and benefit the modernization of TCM by extending the application of modern means in TCM safety assessment, assisting the formulation of TCM safety norms and establishing international standard indicators. PMID:19508399

  11. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    PubMed

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed on a previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples such as fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone, derived from environmental samples (lake water, untreated and treated sewage waters), were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of the Middle Pomerania region in northern Poland can easily be observed on the resulting micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrate that a micro-TLC-based analytical approach can be applied as an effective method for the search for internal standard (IS) substances. Generally, the described methodology can be applied for fast fractionation or screening of the
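
    The retention measurements mentioned above rest on the retardation factor: Rf is the spot's migration distance divided by the solvent front's. A small sketch follows, with invented migration distances rather than data from the paper.

    ```python
    def rf(spot_mm, solvent_front_mm):
        """Retardation factor: spot migration distance over solvent front distance."""
        if not 0 <= spot_mm <= solvent_front_mm:
            raise ValueError("a spot cannot travel farther than the solvent front")
        return spot_mm / solvent_front_mm

    front = 45.0  # mm, hypothetical solvent front distance
    for name, d in {"estetrol": 8.0, "estriol": 15.5, "progesterone": 34.0}.items():
        print(f"{name}: Rf = {rf(d, front):.2f}")
    ```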

  12. ARM Climate Research Facility: Outreach Tools and Strategies

    NASA Astrophysics Data System (ADS)

    Roeder, L.; Jundt, R.

    2009-12-01

    Sponsored by the Department of Energy, the ARM Climate Research Facility is a global scientific user facility for the study of climate change. To publicize progress and achievements and to reach new users, the ACRF uses a variety of Web 2.0 tools and strategies that build on the program’s comprehensive and well-established News Center (www.arm.gov/news). These strategies include: an RSS subscription service for specific news categories; an email “newsletter” distribution to the user community that compiles the latest News Center updates into a short summary with links; and a Facebook page that pulls information from the News Center and links to relevant information in other online venues, including those of our collaborators. The ACRF also interacts with users through field campaign blogs, like Discovery Channel’s EarthLive, to share research experiences from the field. Increasingly, field campaign wikis are established to help ACRF researchers collaborate during the planning and implementation phases of their field studies; these include easy-to-use logs and image libraries to help record the campaigns. This vital reference information is used in developing outreach material that is shared in highlights, news, and Facebook. Other Web 2.0 tools that ACRF uses include Google Maps to help users visualize facility locations and aircraft flight patterns. Easy-to-use comment boxes are also available on many of the data-related web pages on www.arm.gov to encourage feedback. To provide additional opportunities for increased interaction with the public and user community, future Web 2.0 plans under consideration for ACRF include: evaluating field campaigns for Twitter and microblogging opportunities, adding public discussion forums to research highlight web pages, moving existing photos into albums on Flickr or Facebook, and building online video archives through YouTube.

  13. Near-infrared spectroscopy as a tool for driving research.

    PubMed

    Liu, Tao; Pelowski, Matthew; Pang, Changle; Zhou, Yuanji; Cai, Jianfeng

    2016-03-01

    Driving a motor vehicle requires various cognitive functions to process surrounding information, to guide appropriate actions, and especially to respond to or integrate with numerous contextual and perceptual hindrances or risks. It is, thus, imperative to examine driving performance and road safety from a perspective of cognitive neuroscience, which considers both the behaviour and the functioning of the brain. However, because of technical limitations of current brain imaging approaches, studies have primarily adopted driving games or simulators to present participants with simulated driving environments that may have less ecological validity. Near-infrared spectroscopy (NIRS) is a relatively new, non-invasive brain-imaging technique allowing measurement of brain activations in more realistic settings, even within real motor vehicles. This study reviews current NIRS driving research and explores NIRS' potential as a new tool to examine driving behaviour, along with various risk factors in natural situations, promoting our understanding about neural mechanisms of driving safety. Practitioner Summary: Driving a vehicle is dependent on a range of neurocognitive processing abilities. Near-infrared spectroscopy (NIRS) is a non-invasive brain-imaging technique allowing measurement of brain activation even in on-road studies within real motor vehicles. This study reviews current NIRS driving research and explores the potential of NIRS as a new tool to examine driving behaviour.

  14. CryoTEM as an Advanced Analytical Tool for Materials Chemists.

    PubMed

    Patterson, Joseph P; Xu, Yifei; Moradi, Mohammad-Amin; Sommerdijk, Nico A J M; Friedrich, Heiner

    2017-07-18

    Morphology plays an essential role in chemistry through the segregation of atoms and/or molecules into different phases, delineated by interfaces. This is a general process in materials synthesis and exploited in many fields including colloid chemistry, heterogeneous catalysis, and functional molecular systems. To rationally design complex materials, we must understand and control morphology evolution. Toward this goal, we utilize cryogenic transmission electron microscopy (cryoTEM), which can track the structural evolution of materials in solution with nanometer spatial resolution and a temporal resolution of <1 s. In this Account, we review examples of our own research where direct observations by cryoTEM have been essential to understanding morphology evolution in macromolecular self-assembly, inorganic nucleation and growth, and the cooperative evolution of hybrid materials. These three different research areas are at the heart of our approach to materials chemistry where we take inspiration from the myriad examples of complex materials in Nature. Biological materials are formed using a limited number of chemical components and under ambient conditions, and their formation pathways were refined during biological evolution by enormous trial and error approaches to self-organization and biomineralization. By combining the information on what is possible in nature and by focusing on a limited number of chemical components, we aim to provide an essential insight into the role of structure evolution in materials synthesis. Bone, for example, is a hierarchical and hybrid material which is lightweight, yet strong and hard. It is formed by the hierarchical self-assembly of collagen into a macromolecular template with nano- and microscale structure. This template then directs the nucleation and growth of oriented, nanoscale calcium phosphate crystals to form the composite material. Fundamental insight into controlling these structuring processes will eventually allow us

  15. A new analytical tool to assess health risks associated with the virological quality of drinking water (EMIRA study).

    PubMed

    Gofti-Laroche, L; Gratacap-Cavallier, B; Genoulaz, O; Joret, J C; Hartemann, P; Seigneurin, J M; Zmirou, D

    2001-01-01

    This work assessed the risks associated with the virological quality of tap water using a molecular analytical tool manageable in a field survey. It combined a daily epidemiological follow-up of digestive morbidity among a panel of volunteers with a microbiological surveillance of drinking water. RT-PCR was used for detection of enterovirus, rotavirus and astrovirus. In total, 712 cases of acute digestive conditions occurred among the 544 volunteers. 38% (9/24) of raw water and 23% (10/44) of tap water samples were positive for at least one virus marker, with 9/10 positive tap water samples complying with bacterial criteria. No statistically significant association was found between the presence of viral markers and the observed incidence of digestive morbidity. However, when an outbreak occurred, enterovirus and rotavirus RNA was detected in the corresponding stored tap water samples. Sequencing of the amplified fragments showed that the rotavirus detected was of bovine origin. This work demonstrated that enteric virus markers were common in the tap water of the study communities (characterised by a vulnerable raw water) despite the absence of bacterial indicators. Tangential ultrafiltration coupled to RT-PCR allowed simultaneous and fast detection of the study viruses from environmental samples. This process is a promising tool for virological water surveillance, provided that the corresponding know-how is transferred to field professionals.

  16. [Analytics of ambiguity: methodological strategy to the phenomenological research in health].

    PubMed

    Sena, Edite Lago da Silva; Gonçalves, Lucia Hisako Takase; Granzotto, Marcos José Müller; Carvalho, Patricia Anjos Lima; Reis, Helca Franciolli Teixeira

    2010-12-01

    The strategy presented in this paper, called the Analytics of ambiguity, responds to the need to understand findings in research based on Merleau-Ponty's phenomenology. It was developed through a study of descriptions of life experiences from ten family members belonging to a Mutual Help Group for caregivers of Alzheimer's patients, conducted at a university in Florianopolis, Santa Catarina, Brazil. These descriptions were elicited through interviews based on intercorporeal experience, during the writing of a Doctoral Dissertation in Nursing. The application of the Analytics of ambiguity in this study is consistent with other similar studies and opens up possibilities for understanding findings in phenomenological research, specifically research based on the experiential ontology of Merleau-Ponty, for it enables us to recognize consciousness as something non-perceptible and perception as an always ambiguous process.

  17. Ultrasonic wavefield imaging: Research tool or emerging NDE method?

    NASA Astrophysics Data System (ADS)

    Michaels, Jennifer E.

    2017-02-01

    Ultrasonic wavefield imaging refers to acquiring full waveform data over a region of interest for waves generated by a stationary source. Although various implementations of wavefield imaging have existed for many years, the widespread availability of laser Doppler vibrometers that can acquire signals in the high kHz and low MHz range has resulted in a rapid expansion of fundamental research utilizing full wavefield data. In addition, inspection methods based upon wavefield imaging have been proposed for standalone nondestructive evaluation (NDE) with most of these methods coming from the structural health monitoring (SHM) community and based upon guided waves. If transducers are already embedded in or mounted on the structure as part of an SHM system, then a wavefield-based inspection can potentially take place with very little required disassembly. A frequently-proposed paradigm for wavefield NDE is its application as a follow-up inspection method using embedded SHM transducers as guided wave sources if the in situ SHM system generates an alarm. Discussed here is the broad role of wavefield imaging as it relates to ultrasonic NDE, both as a research tool and as an emerging NDE method. Examples of current research are presented based upon both guided and bulk wavefield imaging in metals and composites, drawing primarily from the author's work. Progress towards wavefield NDE is discussed in the context of defect detection and characterization capabilities, scan times, data quality, and required data analysis. Recent research efforts are summarized that can potentially enable wavefield NDE.
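
    One standard analysis of full wavefield data is a two-dimensional FFT over time and position, which maps the record into the frequency-wavenumber (f-k) domain where propagating modes separate. The sketch below applies this to a synthetic single-tone wavefield (all parameters invented), not to measured data.

    ```python
    import numpy as np

    dt, dx = 1e-6, 1e-3            # 1 us sampling, 1 mm spatial pitch (hypothetical)
    t = np.arange(512) * dt
    x = np.arange(128) * dx
    c, f0 = 3000.0, 200e3          # assumed phase velocity (m/s) and tone (Hz)
    T, X = np.meshgrid(t, x, indexing="ij")
    field = np.sin(2 * np.pi * f0 * (T - X / c))   # wave travelling in +x

    FK = np.fft.fftshift(np.abs(np.fft.fft2(field)))
    freqs = np.fft.fftshift(np.fft.fftfreq(t.size, dt))
    knums = np.fft.fftshift(np.fft.fftfreq(x.size, dx))
    i, j = np.unravel_index(np.argmax(FK), FK.shape)
    # A real sinusoid gives a symmetric +/- pair of peaks; argmax picks one of them.
    print(f"peak at f = {freqs[i] / 1e3:.0f} kHz, k = {knums[j]:.1f} cycles/m")
    ```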

  18. Modeling as a research tool in poultry science.

    PubMed

    Gous, R M

    2014-01-01

    The World's Poultry Science Association (WPSA) is a long-established and unique organization that strives to advance knowledge and understanding of all aspects of poultry science and the poultry industry. Its 3 main aims are education, organization, and research. The WPSA Keynote Lecture, titled "Modeling as a research tool in poultry science," addresses 2 of these aims, namely, the value of modeling in research and education. The role of scientists is to put forward and then to test theories. These theories, or models, may be simple or highly complex, but they are aimed at improving our understanding of a system or the interaction between systems. In developing a model, the scientist must take into account existing knowledge, and in this process gaps in our knowledge of a system are identified. Useful ideas for research are generated in this way, and experiments may be designed specifically to address these issues. The resultant models become more accurate and more useful, and can be used in education and extension as a means of explaining many of the complex issues that arise in poultry science.

  19. NASA Human Research Wiki - An Online Collaboration Tool

    NASA Technical Reports Server (NTRS)

    Barr, Y. R.; Rasbury, J.; Johnson, J.; Barsten, K.; Saile, L.; Watkins, S. D.

    2011-01-01

    In preparation for exploration-class missions, the Exploration Medical Capability (ExMC) element of NASA's Human Research Program (HRP) has compiled a large evidence base, which previously was available only to persons within the NASA community. The evidence base is comprised of several types of data, for example: information on more than 80 medical conditions which could occur during space flight, derived from several sources (including data on incidence and potential outcomes of these medical conditions, as captured in the Integrated Medical Model's Clinical Finding Forms). In addition, approximately 35 gap reports are included in the evidence base, identifying current understanding of the medical challenges for exploration, as well as any gaps in knowledge and/or technology that would need to be addressed in order to provide adequate medical support for these novel missions. In an effort to make the ExMC information available to the general public and increase collaboration with subject matter experts within and outside of NASA, ExMC has developed an online collaboration tool, very similar to a wiki, titled the NASA Human Research Wiki. The platform chosen for this data sharing, and the potential collaboration it could generate, is a MediaWiki-based application that would house the evidence, allow "read only" access to all visitors to the website, and editorial access to credentialed subject matter experts who have been approved by the Wiki's editorial board. Although traditional wikis allow users to edit information in real time, the NASA Human Research Wiki includes a peer review process to ensure quality and validity of information. The wiki is also intended to be a pathfinder project for other HRP elements that may want to use this type of web-based tool. The wiki website will be released with a subset of the data described and will continue to be populated throughout the year.

  20. High-resolution entrainment mapping of gastric pacing: a new analytical tool.

    PubMed

    O'Grady, Gregory; Du, Peng; Lammers, Wim J E P; Egbuji, John U; Mithraratne, Pulasthi; Chen, Jiande D Z; Cheng, Leo K; Windsor, John A; Pullan, Andrew J

    2010-02-01

    Gastric pacing has been investigated as a potential treatment for gastroparesis. New pacing protocols are required to improve symptom and motility outcomes; however, research progress has been constrained by a limited understanding of the effects of electrical stimulation on slow-wave activity. This study introduces high-resolution (HR) "entrainment mapping" for the analysis of gastric pacing and presents four demonstrations. Gastric pacing was initiated in a porcine model (typical amplitude 4 mA, pulse width 400 ms, period 17 s). Entrainment mapping was performed using flexible multielectrode arrays (

  1. Proteomic analysis of synovial fluid as an analytical tool to detect candidate biomarkers for knee osteoarthritis.

    PubMed

    Liao, Weixiong; Li, Zhongli; Zhang, Hao; Li, Ji; Wang, Ketao; Yang, Yimeng

    2015-01-01

    We conducted research to detect the proteomic profiles in synovial fluid (SF) from knee osteoarthritis (OA) patients to better understand the pathogenesis and aetiology of OA. Our long-term goal is to identify reliable candidate biomarkers for OA in SF. The SF proteins obtained from 10 knee OA patients and 10 non-OA patients (9 of whom were patients with a meniscus injury in the knee; 1 had a discoid meniscus in the knee, and all exhibited intact articular cartilage) were separated by two-dimensional electrophoresis (2-DE). The repeatability of the obtained protein spots regarding their intensity was tested via triplicate 2-DE of selected samples. The observed protein expression patterns were subjected to statistical analysis, and differentially expressed protein spots were identified via matrix-assisted laser desorption/ionisation-time of flight/time of flight mass spectrometry (MALDI-TOF/TOF MS). Our analyses showed low intrasample variability and clear intersample variation. Among the protein spots observed on the gels, there were 29 significant differences, of which 22 corresponded to upregulation and 7 to downregulation in the OA group. One of the upregulated protein spots was confirmed to be haptoglobin by mass spectrometry, and the levels of haptoglobin in SF are positively correlated with the severity of OA (r = 0.89, P < 0.001). This study showed that 2-DE could be used under standard conditions to screen SF samples and identify a small subset of proteins in SF that are potential markers associated with OA. Spots of interest identified by mass spectrometry, such as haptoglobin, may be associated with OA severity.

  2. Constructing a publically available distracted driving database and research tool.

    PubMed

    Atchley, Paul; Tran, Ashleigh V; Salehinejad, Mohammad Ali

    2017-02-01

    The goal of the current work was to create a publicly available visualization tool for distracted driving research, the purpose of which is to allow the public and other stakeholders to empirically inform questions of their choice that may bear on policy discussions. Fifty years of distracted driving research was used to design a comprehensive database of studies that evaluated the effects of distraction on driving performance. Distraction sources (e.g., texting, talking, visual distraction) and performance measures were defined, and the sample of studies was evaluated and categorized by its measures. The final product comprised 342 studies using various methodologies. Across all measures, 1297 outcomes showed that distraction degraded driving performance, 54 showed that distraction improved driving performance, and 257 showed that distraction had no effect on driving performance. An analysis of the most common phone distractions (texting and talking) showed that texting almost always results in degraded performance. Aggregate data reveal no difference in performance decrements between hand-held and hands-free phones, even though single studies of those variables vary in their outcomes. This project illustrates how scientific research can be made publicly available for use by a diverse audience of stakeholders. An important result of this project is that data aggregated along a simple set of characteristics, such as whether performance is decreased, improved or not affected, can reveal trends that are less clear from any individual study. Copyright © 2016 Elsevier Ltd. All rights reserved.
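
    A tiny sketch of the aggregation idea: tallying outcome counts (degraded, improved, no effect) per distraction source across study records. The records below are invented placeholders, not rows from the actual database.

    ```python
    from collections import Counter

    records = [
        ("texting", "degraded"), ("texting", "degraded"), ("texting", "no_effect"),
        ("talking", "degraded"), ("talking", "improved"), ("talking", "no_effect"),
    ]

    by_source = {}
    for source, outcome in records:
        by_source.setdefault(source, Counter())[outcome] += 1

    for source, counts in by_source.items():
        total = sum(counts.values())
        shares = {k: f"{100 * v / total:.0f}%" for k, v in counts.items()}
        print(source, shares)
    ```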

  3. Astonishing advances in mouse genetic tools for biomedical research.

    PubMed

    Kaczmarczyk, Lech; Jackson, Walker S

    2015-01-01

    The humble house mouse has long been a workhorse model system in biomedical research. The technology for introducing site-specific genome modifications led to Nobel Prizes for its pioneers and opened a new era of mouse genetics. However, this technology was very time-consuming and technically demanding. As a result, many investigators continued to employ easier genome manipulation methods, though resulting models can suffer from overlooked or underestimated consequences. Another breakthrough, invaluable for the molecular dissection of disease mechanisms, was the invention of high-throughput methods to measure the expression of a plethora of genes in parallel. However, the use of samples containing material from multiple cell types could obfuscate data, and thus interpretations. In this review we highlight some important issues in experimental approaches using mouse models for biomedical research. We then discuss recent technological advances in mouse genetics that are revolutionising human disease research. Mouse genomes are now easily manipulated at precise locations thanks to guided endonucleases, such as transcription activator-like effector nucleases (TALENs) or the CRISPR/Cas9 system, both also having the potential to turn the dream of human gene therapy into reality. Newly developed methods of cell type-specific isolation of transcriptomes from crude tissue homogenates, followed by detection with next generation sequencing (NGS), are vastly improving gene regulation studies. Taken together, these amazing tools simplify the creation of much more accurate mouse models of human disease, and enable the extraction of hitherto unobtainable data.

  4. Effective Tooling for Linked Data Publishing in Scientific Research

    SciTech Connect

    Purohit, Sumit; Smith, William P.; Chappell, Alan R.; West, Patrick; Lee, Benno; Stephan, Eric G.; Fox, Peter

    2016-02-05

    Challenges that make it difficult to find, share, and combine published data, such as data heterogeneity and resource discovery, have led to increased adoption of semantic data standards and data publishing technologies. To make data more accessible, interconnected and discoverable, some domains are being encouraged to publish their data as Linked Data. Consequently, this trend greatly increases the amount of data that semantic web tools are required to process, store, and interconnect. In attempting to process and manipulate large data sets, tools, ranging from simple text editors to modern triplestores, eventually break down upon reaching undefined thresholds. This paper offers a systematic approach that data publishers can use to categorize suitable tools to meet their data publishing needs. We present a real-world use case, the Resource Discovery for Extreme Scale Collaboration (RDESC), which features a scientific dataset (maximum size of 1.4 billion triples) used to evaluate a toolbox for data publishing in climate research. This paper also introduces a semantic data publishing software suite developed for the RDESC project.
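
    As a small illustration of publishing a record as Linked Data, the sketch below builds and serializes a few triples with the rdflib library; the dataset URI and vocabulary terms are invented for the example and are not the RDESC schema.

    ```python
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import DCTERMS, RDF

    EX = Namespace("http://example.org/rdesc/")
    g = Graph()

    dataset = URIRef(EX["dataset/42"])
    g.add((dataset, RDF.type, EX.Dataset))
    g.add((dataset, DCTERMS.title, Literal("Sea-surface temperature, 1980-2010")))
    g.add((dataset, DCTERMS.creator, Literal("Jane Researcher")))

    print(g.serialize(format="turtle"))  # Turtle is one common Linked Data syntax
    ```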

  5. The relevance of attachment research to psychoanalysis and analytic social psychology.

    PubMed

    Bacciagaluppi, M

    1994-01-01

    The extensive empirical research generated by attachment theory is briefly reviewed, with special reference to transgenerational transmission of attachment patterns, internal working models, cross-cultural, and longitudinal studies. It is claimed that attachment theory and research support the alternative psychoanalytic approach initiated by Ferenczi, especially as regards the re-evaluation of real-life traumatic events, the occurrence of personality splits after childhood trauma, and the aggravation of trauma due to its denial by adults. The concepts of transgenerational transmission and of alternative developmental pathways are further contributions to an alternative psychoanalytic framework. Finally, attention is called to the relevance of the cross-cultural studies to Fromm's analytic social psychology.

  6. Direct writing of metal nanostructures: lithographic tools for nanoplasmonics research.

    PubMed

    Leggett, Graham J

    2011-03-22

    Continued progress in the fast-growing field of nanoplasmonics will require the development of new methods for the fabrication of metal nanostructures. Optical lithography provides a continually expanding tool box. Two-photon processes, as demonstrated by Shukla et al. (doi: 10.1021/nn103015g), enable the fabrication of gold nanostructures encapsulated in dielectric material in a simple, direct process and offer the prospect of three-dimensional fabrication. At higher resolution, scanning probe techniques enable nanoparticle placement by localized oxidation, and near-field sintering of nanoparticulate films enables direct writing of nanowires. Direct laser "printing" of single gold nanoparticles offers a remarkable capability for the controlled fabrication of model structures for fundamental studies, particle by particle. Optical methods continue to provide powerful support for research into metamaterials.

  7. Interactive Publication: The document as a research tool

    PubMed Central

    Thoma, George R.; Ford, Glenn; Antani, Sameer; Demner-Fushman, Dina; Chung, Michael; Simpson, Matthew

    2010-01-01

    The increasing prevalence of multimedia and research data generated by scientific work affords an opportunity to reformulate the idea of a scientific article from the traditional static document, or even one with links to supplemental material in remote databases, to a self-contained, multimedia-rich interactive publication. This paper describes our concept of such a document, and the design of tools for authoring (Forge) and visualization/analysis (Panorama). They are platform-independent applications written in Java, and developed in Eclipse using its Rich Client Platform (RCP) framework. Both applications operate on PDF files with links to XML files that define the media type, location, and action to be performed. We also briefly cite the challenges posed by the potentially large size of interactive publications, the need to evaluate their value for improved comprehension and learning, and the need for their long-term preservation by the National Library of Medicine and other libraries. PMID:20657757

  8. Conservation of Mass: An Important Tool in Renal Research.

    PubMed

    Sargent, John A

    2016-05-01

    The dialytic treatment of end-stage renal disease (ESRD) patients is based on control of solute concentrations and management of fluid volume. The application of the principle of conservation of mass, or mass balance, is fundamental to the study of such treatment and can be extended to chronic kidney disease (CKD) in general. This review discusses the development and use of mass conservation and transport concepts incorporated into mathematical models. These concepts, which can be applied to a wide range of solutes of interest, represent a powerful tool for quantitatively guided studies of dialysis issues now and in the future. Incorporating these quantitative concepts into future investigations is key to achieving positive control of known solutes, to analyzing such studies, to relating future research to the known results of prior studies, and to understanding the obligatory physiological perturbations that result from dialysis therapy.
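
    A minimal sketch of the single-pool mass-balance idea the review describes: V dC/dt = G - K*C, with generation rate G, clearance K, and distribution volume V, integrated here by a simple Euler loop. All parameter values are hypothetical.

    ```python
    V = 40.0    # L, urea distribution volume (hypothetical)
    G = 0.006   # g/min, urea generation rate (hypothetical)
    K = 0.2     # L/min, dialyzer clearance during treatment (hypothetical)
    C = 1.5     # g/L, pre-treatment urea concentration (hypothetical)
    dt = 1.0    # min, Euler time step

    for _ in range(240):            # a four-hour treatment
        C += (G - K * C) / V * dt   # Euler step of V dC/dt = G - K*C

    print(f"post-treatment urea ~ {C:.2f} g/L (Kt/V = {K * 240 / V:.2f})")
    ```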

  9. Review and evaluation of electronic health records-driven phenotype algorithm authoring tools for clinical and translational research

    PubMed Central

    Rasmussen, Luke V; Shaw, Pamela L; Jiang, Guoqian; Kiefer, Richard C; Mo, Huan; Pacheco, Jennifer A; Speltz, Peter; Zhu, Qian; Denny, Joshua C; Pathak, Jyotishman; Thompson, William K; Montague, Enid

    2015-01-01

    Objective To review and evaluate available software tools for electronic health record–driven phenotype authoring in order to identify gaps and needs for future development. Materials and Methods Candidate phenotype authoring tools were identified through (1) literature search in four publication databases (PubMed, Embase, Web of Science, and Scopus) and (2) a web search. A collection of tools was compiled and reviewed after the searches. A survey was designed and distributed to the developers of the reviewed tools to discover their functionalities and features. Results Twenty-four different phenotype authoring tools were identified and reviewed. Developers of 16 of these identified tools completed the evaluation survey (67% response rate). The surveyed tools showed commonalities but also varied in their capabilities in algorithm representation, logic functions, data support and software extensibility, search functions, user interface, and data outputs. Discussion Positive trends identified in the evaluation included: algorithms can be represented in both computable and human readable formats; and most tools offer a web interface for easy access. However, issues were also identified: many tools were lacking advanced logic functions for authoring complex algorithms; the ability to construct queries that leveraged un-structured data was not widely implemented; and many tools had limited support for plug-ins or external analytic software. Conclusions Existing phenotype authoring tools could enable clinical researchers to work with electronic health record data more efficiently, but gaps still exist in terms of the functionalities of such tools. The present work can serve as a reference point for the future development of similar tools. PMID:26224336

  10. Measurement of very low amounts of arsenic in soils and waters: is ICP-MS the indispensable analytical tool?

    NASA Astrophysics Data System (ADS)

    López-García, Ignacio; Marín-Hernández, Juan Jose; Perez-Sirvent, Carmen; Hernandez-Cordoba, Manuel

    2017-04-01

    The toxicity of arsenic and its wide distribution in nature need no emphasis nowadays, and the value of reliable analytical tools for arsenic determination at very low levels is clear. Leaving aside atomic fluorescence spectrometers specifically designed for this purpose, the task is currently carried out using inductively coupled plasma mass spectrometry (ICP-MS), a powerful but expensive technique that is not available in all laboratories. However, as the recent literature clearly shows, a similar or even better analytical performance for the determination of several elements can be achieved by replacing the ICP-MS instrument with an AAS spectrometer (which is commonly present in any laboratory and involves low acquisition and maintenance costs) provided that a simple microextraction step is used to preconcentrate the sample. This communication reports the optimization and results of a new analytical procedure based on this idea and focused on the determination of very low concentrations of arsenic in waters and in extracts from soils and sediments. The procedure is based on a micro-solid-phase extraction process for the separation and preconcentration of arsenic that uses magnetic particles covered with silver nanoparticles functionalized with the sodium salt of 2-mercaptoethane-sulphonate (MESNa). This composite is easily obtained in the laboratory. After the sample is treated with a small amount (only a few milligrams) of the magnetic material, the solid phase is separated by means of a magnetic field and then introduced into an electrothermal atomizer (ETAAS) for arsenic determination. The preconcentration factor is close to 200, with a detection limit below 0.1 µg L-1 arsenic. Speciation of As(III) and As(V) can be achieved by means of two extractions carried out at different acidities. The results for total arsenic were verified using certified reference materials. The authors are grateful to the Comunidad Autónoma de la
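
    Two figures of merit quoted above can be computed in a few lines: an IUPAC-style detection limit (three times the blank standard deviation over the calibration slope) and the volume-based preconcentration factor. All numbers below are invented placeholders, not the paper's data.

    ```python
    import statistics

    blank_signals = [0.0021, 0.0018, 0.0024, 0.0019, 0.0022]  # blank absorbances
    slope = 0.085  # absorbance per (ug/L) after preconcentration, hypothetical

    lod = 3 * statistics.stdev(blank_signals) / slope
    print(f"LOD ~ {lod:.3f} ug/L")

    sample_volume_ml, eluate_volume_ml = 10.0, 0.05
    print(f"preconcentration factor ~ {sample_volume_ml / eluate_volume_ml:.0f}")
    ```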

  11. Usage of an online tool to help policymakers better engage with research: Web CIPHER.

    PubMed

    Makkar, Steve R; Gilham, Frances; Williamson, Anna; Bisset, Kellie

    2015-04-23

    There is a need to develop innovations that help policymakers better engage with research in order to increase its use in policymaking. As part of the Centre for Informing Policy in Health with Evidence from Research (CIPHER), we established Web CIPHER, an online tool with dynamic interactive elements such as hot topics, research summaries, blogs from trusted figures in health policy and research, a community bulletin board, a multimedia section and a research portal. The aim of this study was to examine policymakers' use of the website and to determine which sections were key drivers of use. Google Analytics (GA) was used to gather usage data over a 16-month period. Analysis was restricted to Web CIPHER members from policy agencies. We examined descriptive statistics, including mean viewing times, number of page visits and bounce rates for each section, and performed analyses of variance to compare usage between sections. Repeated measures analyses were undertaken to examine whether a weekly reminder email improved usage of Web CIPHER, particularly for research-related content. During the measurement period, 223 policymakers from more than 32 organisations joined Web CIPHER. Users viewed eight posts on average per visit and stayed on the site for approximately 4 min. The bounce rate was less than 6%. The Blogs and Community sections received more unique views than all other sections. Blogs relating to improving policymakers' skills in applying research to policy were particularly popular. The email reminder had a positive effect on usage, particularly for research-related posts. The data indicated a relatively small number of users; however, this sample may not be representative of policymakers, since membership and use of the site were completely voluntary. Nonetheless, those who used the site appeared to engage well with it. The findings suggest that providing blog-type content written by trusted experts in health policy and research as well as regular

  12. Expediting the formulation development process with the aid of automated dissolution in analytical research and development.

    PubMed

    Sadowitz, J P

    2001-01-01

    The development of drugs in the generic pharmaceutical industry is a highly competitive arena of companies vying for the few drug products that are coming off patent. Companies that have been successful in this arena are those that have met or surpassed the critical timelines associated with trial formulation development, analytical method development, and submission batch manufacturing and testing. Barr Laboratories Inc. has been successful in the generic pharmaceutical industry for several reasons, one of which is automation. Analytical research and development at Barr has employed automated dissolution early in the lifecycle of potential products. This approach has dramatically reduced the average time to market for a number of products. The key to this approach is the network infrastructure of the formulation and analytical research and development departments. At Barr, the ability to work and communicate cooperatively has driven the departments to streamline and matrix their work efforts and to optimize resources and time. The discussion will reference how Barr has been successful with automation and gives a case study of products that have moved at a rapid pace through the development cycle.

  13. Research Tensions with the Use of Timed Numeracy Fluency Assessments as a Research Tool

    ERIC Educational Resources Information Center

    Stott, Debbie; Graven, Mellony

    2013-01-01

    In this paper, we describe how we came to use timed fluency activities, along with personal learner reflections on those activities, in our after-school maths club as a complementary research and development tool for assessing the changing levels of learners' mathematical proficiency over time. We use data from one case-study after-school maths…

  14. Informetric Theories and Methods for Exploring the Internet: An Analytical Survey of Recent Research Literature.

    ERIC Educational Resources Information Center

    Bar-Ilan, Judit; Peritz, Bluma C.

    2002-01-01

    Presents a selective review of research based on the Internet, using bibliometric and informetric methods and tools. Highlights include data collection methods on the Internet, including surveys, logging, and search engines; and informetric analysis, including citation analysis and content analysis. (Contains 78 references.) (Author/LRW)

  15. An analytical framework for delirium research in palliative care settings: integrated epidemiologic, clinician-researcher, and knowledge user perspectives.

    PubMed

    Lawlor, Peter G; Davis, Daniel H J; Ansari, Mohammed; Hosie, Annmarie; Kanji, Salmaan; Momoli, Franco; Bush, Shirley H; Watanabe, Sharon; Currow, David C; Gagnon, Bruno; Agar, Meera; Bruera, Eduardo; Meagher, David J; de Rooij, Sophia E J A; Adamis, Dimitrios; Caraceni, Augusto; Marchington, Katie; Stewart, David J

    2014-08-01

    Delirium often presents difficult management challenges in the context of goals of care in palliative care settings. The aim was to formulate an analytical framework for further research on delirium in palliative care settings, prioritize the associated research questions, discuss the inherent methodological challenges associated with relevant studies, and outline the next steps in a program of delirium research. We combined multidisciplinary input from delirium researchers and knowledge users at an international delirium study planning meeting, relevant literature searches, focused input of epidemiologic expertise, and a meeting participant and coauthor survey to formulate a conceptual research framework and prioritize research questions. Our proposed framework incorporates three main groups of research questions: the first was predominantly epidemiologic, such as delirium occurrence rates, risk factor evaluation, screening, and diagnosis; the second covers pragmatic management questions; and the third relates to the development of predictive models for delirium outcomes. Based on aggregated survey responses to each research question or domain, the combined modal ratings of "very" or "extremely" important confirmed their priority. Using an analytical framework to represent the full clinical care pathway of delirium in palliative care settings, we identified multiple knowledge gaps in relation to the occurrence rates, assessment, management, and outcome prediction of delirium in this population. The knowledge synthesis generated from adequately powered, multicenter studies to answer the framework's research questions will inform decision making and policy development regarding delirium detection and management and thus help to achieve better outcomes for patients in palliative care settings. Copyright © 2014 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  16. An Analytical Framework for Delirium Research in Palliative Care Settings: Integrated Epidemiologic, Clinician-Researcher, and Knowledge User Perspectives

    PubMed Central

    Ansari, Mohammed; Hosie, Annmarie; Kanji, Salmaan; Momoli, Franco; Bush, Shirley H.; Watanabe, Sharon; Currow, David C.; Gagnon, Bruno; Agar, Meera; Bruera, Eduardo; Meagher, David J.; de Rooij, Sophia E.J.A.; Adamis, Dimitrios; Caraceni, Augusto; Marchington, Katie; Stewart, David J.

    2014-01-01

    Context Delirium often presents difficult management challenges in the context of goals of care in palliative care settings. Objectives The aim was to formulate an analytical framework for further research on delirium in palliative care settings, prioritize the associated research questions, discuss the inherent methodological challenges associated with relevant studies, and outline the next steps in a program of delirium research. Methods We combined multidisciplinary input from delirium researchers and knowledge users at an international delirium study planning meeting, relevant literature searches, focused input of epidemiologic expertise, and a meeting participant and coauthor survey to formulate a conceptual research framework and prioritize research questions. Results Our proposed framework incorporates three main groups of research questions: the first was predominantly epidemiologic, such as delirium occurrence rates, risk factor evaluation, screening, and diagnosis; the second covers pragmatic management questions; and the third relates to the development of predictive models for delirium outcomes. Based on aggregated survey responses to each research question or domain, the combined modal ratings of “very” or “extremely” important confirmed their priority. Conclusion Using an analytical framework to represent the full clinical care pathway of delirium in palliative care settings, we identified multiple knowledge gaps in relation to the occurrence rates, assessment, management, and outcome prediction of delirium in this population. The knowledge synthesis generated from adequately powered, multicenter studies to answer the framework’s research questions will inform decision making and policy development regarding delirium detection and management and thus help to achieve better outcomes for patients in palliative care settings. PMID:24726762

  17. Prompt nuclear analytical techniques for material research in accelerator driven transmutation technologies: Prospects and quantitative analyses

    NASA Astrophysics Data System (ADS)

    Vacík, J.; Hnatowicz, V.; Červená, J.; Peřina, V.; Mach, R.; Peka, I.

    1998-04-01

    Accelerator-driven transmutation technology (ADTT) is a promising route toward the liquidation of spent nuclear fuel, nuclear wastes and weapon-grade Pu. An ADTT facility comprises a high-current (proton) accelerator supplying a subcritical reactor assembly with spallation neutrons. The reactor part is to be cooled by molten fluorides or metals, which serve at the same time as a carrier of nuclear fuel. The assumed high working temperature (400-600°C) and high radiation load in the subcritical reactor and spallation neutron source raise the problem of the optimal choice of ADTT construction materials, especially from the point of view of their radiation and corrosion resistance when in contact with liquid working media. The use of prompt nuclear analytical techniques in ADTT-related material research is considered, and examples of preliminary analytical results obtained using the neutron depth profiling method are shown for illustration.

  18. Rethinking the Role of Information Technology-Based Research Tools in Students' Development of Scientific Literacy

    NASA Astrophysics Data System (ADS)

    van Eijck, Michiel; Roth, Wolff-Michael

    2007-06-01

    Given the central place IT-based research tools take in scientific research, the marginal role such tools currently play in science curricula is unsatisfactory from the perspective of making students scientifically literate. To appropriately frame the role of IT-based research tools in science curricula, we propose a framework developed to understand the use of tools in human activity, namely cultural-historical activity theory (CHAT). Accordingly, IT-based research tools constitute central moments of scientific research activity and can be understood neither apart from the activity's objectives nor apart from the cultural-historically determined forms of activity (praxis) in which human subjects participate. Based on empirical data involving students participating in research activity, we point out how an appropriate account of IT-based research tools relates subjects' use of the tools to the objectives of the research activity and to their contribution to the praxis of research. We propose to reconceptualize IT-based research tools as contributing to scientific literacy if students apply these tools with respect to the objectives of the research activity and contribute to the praxis of research by evaluating and modifying the application of these tools. We conclude this paper by sketching the educational implications of this reconceptualized role of IT-based research tools.

  19. Nutriproteomics: a promising tool to link diet and diseases in nutritional research.

    PubMed

    Ganesh, Vijayalakshmi; Hettiarachchy, Navam S

    2012-10-01

    Nutriproteomics is a nascent research arena, exploiting the dynamics of proteomic tools to characterize molecular and cellular changes in protein expression and function on a global level as well as judging the interaction of proteins with food nutrients. As nutrients are present in complex mixtures, the bioavailability and functions of each nutrient can be influenced by the presence of other nutrients/compounds and interactions. The first half of this review focuses on the techniques used as nutriproteomic tools for identification, quantification, characterization and analyses of proteins including, two-dimensional polyacrylamide electrophoresis, chromatography, mass spectrometry, microarray and other emerging technologies involving visual proteomics. The second half narrates the potential of nutriproteomics in medical and nutritional research for revolutionizing biomarker and drug development, nutraceutical discovery, biological process modeling, preclinical nutrition linking diet and diseases and structuring ways to a personalized nutrition. Though several challenges such as protein dynamics, analytical complexity, cost and resolution still exist, the scope of applying proteomics to nutrition is rapidly expanding and promising as more holistic strategies are emerging.

  20. Researcher effects on mortality salience research: a meta-analytic moderator analysis.

    PubMed

    Yen, Chih-Long; Cheng, Chung-Ping

    2013-08-01

    A recent meta-analysis of 164 terror management theory (TMT) papers indicated that mortality salience (MS) yields substantial effects (r = .35) on worldview and self-esteem-related dependent variables (B. L. Burke, A. Martens, & E. H. Faucher, 2010). This study reanalyzed the data to explore the researcher effects of TMT. By cluster-analyzing the authorships of each study, two major TMT research teams were identified, namely, an American team (Team A) and an Israeli team (Team I). The majority of MS experiments were conducted by or related to a small number of researchers (Team A, 50.2%). The average effect size of Team A (r = .41) was significantly greater than that of other researchers (r = .30). Further analysis revealed that cultural differences found in TMT research may be due to the teams themselves, rather than to sample regions. The reasons behind these findings are proposed.
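
    The team comparison above amounts to testing whether two aggregate correlations differ; a standard way to do this is Fisher's z transformation, sketched below. Only the r values come from the abstract; the per-team sample sizes are invented.

    ```python
    import math

    def compare_correlations(r1, n1, r2, n2):
        """Two-tailed z-test for the difference of two independent correlations."""
        z1, z2 = math.atanh(r1), math.atanh(r2)          # Fisher z transform
        se = math.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
        z = (z1 - z2) / se
        p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p

    z, p = compare_correlations(0.41, 2000, 0.30, 2000)  # sample sizes hypothetical
    print(f"z = {z:.2f}, p = {p:.4f}")
    ```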

  1. Alerting strategies in computerized physician order entry: a novel use of a dashboard-style analytics tool in a children's hospital.

    PubMed

    Reynolds, George; Boyer, Dean; Mackey, Kevin; Povondra, Lynne; Cummings, Allana

    2008-11-06

    Utilizing a commercially available business analytics tool offering dashboard-style graphical indicators and a data warehouse strategy, we have developed an interactive, web-based platform that allows near-real-time analysis of CPOE adoption by hospital area and practitioner specialty. Clinical Decision Support (CDS) metrics include the percentage of alerts that result in a change in clinician decision-making. This tool facilitates adjustments in alert limits in order to reduce alert fatigue.
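
    A toy version of the CDS metric described above: the share of fired alerts that changed a clinician's decision, broken out by hospital area. The alert records and field names are invented.

    ```python
    alerts = [
        {"unit": "PICU", "action_changed": True},
        {"unit": "PICU", "action_changed": False},
        {"unit": "NICU", "action_changed": False},
        {"unit": "NICU", "action_changed": True},
        {"unit": "NICU", "action_changed": False},
    ]

    for unit in sorted({a["unit"] for a in alerts}):
        rows = [a for a in alerts if a["unit"] == unit]
        rate = sum(r["action_changed"] for r in rows) / len(rows)
        # A very low rate may indicate alert fatigue and a need to adjust alert limits.
        print(f"{unit}: {100 * rate:.0f}% of alerts changed a decision")
    ```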

  2. The adaptive response metric: toward an all-hazards tool for planning, decision support, and after-action analytics.

    PubMed

    Potter, Margaret A; Schuh, Russell G; Pomer, Bruce; Stebbins, Samuel

    2013-01-01

    Local health departments are organized, resourced, and operated primarily for routine public health services. For them, responding to emergencies and disasters requires adaptation to meet the demands of an emergency, and they must reallocate or augment resources, adjust work schedules, and, depending on severity and duration of the event, even compromise routine service outputs. These adaptations occur to varying degrees regardless of the type of emergency or disaster. The Adaptive Response Metric was developed through collaboration between a number of California health departments and university-based preparedness researchers. It measures the degree of "stress" from an emergency response as experienced by local health departments at the level of functional units (eg, nursing, administration, environmental services). Pilot testing of the Adaptive Response Metric indicates its utility for emergency planning, real-time decision making, and after-action analytics.

  3. Models - Another tool for use in global change research

    SciTech Connect

    Wullschleger, S.D.; Baldocchi, D.D.; King, A.W.; Post, W.M. )

    1994-06-01

    Models are increasingly being used in the plant sciences to integrate and extrapolate information derived from laboratory and field investigations. To illustrate the utility of models in global change research, a series of leaf, canopy, ecosystem, and global-scale models are used to explore the response of trees to atmospheric CO2 enrichment. A biochemical model highlights the effects of elevated CO2 and temperature on photosynthesis, the consequences of Rubisco down-regulation for leaf and canopy carbon gain, and the relationships among stomatal conductance, transpiration, leaf area, and canopy energy balance. A forest succession model examines the effects of CO2 on species composition and forest productivity, while a model of the global carbon cycle illustrates the effects of rising CO2 on terrestrial carbon storage and the interaction of this effect with temperature. We conclude that models are appropriate tools both for guiding existing studies and for identifying new hypotheses for future research.

  4. The GATO gene annotation tool for research laboratories.

    PubMed

    Fujita, A; Massirer, K B; Durham, A M; Ferreira, C E; Sogayar, M C

    2005-11-01

    Large-scale genome projects have generated a rapidly increasing number of DNA sequences. Therefore, development of computational methods to rapidly analyze these sequences is essential for progress in genomic research. Here we present an automatic annotation system for preliminary analysis of DNA sequences. The gene annotation tool (GATO) is a bioinformatics pipeline designed to facilitate routine functional annotation and easy access to annotated genes. It was designed in view of the frequent need of genomic researchers to access data pertaining to a common set of genes. In the GATO system, annotation is generated by querying Web-accessible resources, and the information is stored in a local database, which keeps a record of all previous annotation results. GATO may be accessed from anywhere through the internet, or may be run locally if a large number of sequences are to be annotated. It is implemented in PHP and Perl and may be run on any suitable Web server. Installation and application of annotation systems usually require experience and are time consuming, but GATO is simple and practical, allowing anyone with basic skills in informatics to use it without special training. GATO can be downloaded at [http://mariwork.iq.usp.br/gato/]. Minimum free disk space required is 2 MB.
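
    The query-then-store design described above, in which an annotation is fetched from a remote resource only when no previous result exists in the local database, is a cache-through pattern. GATO itself is written in PHP and Perl; the Python sketch below only illustrates the pattern, and fetch_remote is a hypothetical stand-in for a call to a Web-accessible annotation resource.

        import sqlite3

        def annotate(seq_id, fetch_remote):
            # Reuse a stored annotation if present; otherwise query the
            # remote resource and record the result for future lookups.
            db = sqlite3.connect("annotations.db")
            db.execute("CREATE TABLE IF NOT EXISTS ann (seq_id TEXT PRIMARY KEY, result TEXT)")
            row = db.execute("SELECT result FROM ann WHERE seq_id = ?", (seq_id,)).fetchone()
            if row is None:
                result = fetch_remote(seq_id)  # hypothetical remote query, e.g. a BLAST search
                db.execute("INSERT INTO ann VALUES (?, ?)", (seq_id, result))
                db.commit()
            else:
                result = row[0]
            db.close()
            return result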

  5. GPCR-targeting nanobodies: attractive research tools, diagnostics, and therapeutics.

    PubMed

    Mujić-Delić, Azra; de Wit, Raymond H; Verkaar, Folkert; Smit, Martine J

    2014-05-01

    G-protein-coupled receptors (GPCRs) represent a major therapeutic target class. A large proportion of marketed drugs exert their effect through modulation of GPCR function, and GPCRs have been successfully targeted with small molecules. Yet the number of new small-molecule entities targeting GPCRs that have been approved as therapeutics in the past decade has been limited. With new and improved immunization-related technologies and advances in GPCR purification and expression techniques, antibody-based targeting of GPCRs has gained attention. The serendipitous discovery of a unique class of heavy chain antibodies (hcAbs) in the sera of camelids may provide novel GPCR-directed therapies. Antigen-binding fragments of hcAbs, also referred to as nanobodies, combine the advantages of both small molecules (e.g., molecular cavity binding, low production costs) and monoclonal antibodies (e.g., high affinity and specificity). Nanobodies are gaining ground as therapeutics and are also starting to find application as diagnostics and as high-quality tools in GPCR research. Herein, we review recent advances in the use of nanobodies in GPCR research.

  6. VISAD: an interactive and visual analytical tool for the detection of behavioral anomalies in maritime traffic data

    NASA Astrophysics Data System (ADS)

    Riveiro, Maria; Falkman, Göran; Ziemke, Tom; Warston, Håkan

    2009-05-01

    Monitoring the surveillance of large sea areas normally involves the analysis of huge quantities of heterogeneous data from multiple sources (radars, cameras, automatic identification systems, reports, etc.). The rapid identification of anomalous behavior or any threat activity in the data is an important objective for enabling homeland security. While many existing mining applications support the identification of anomalous behavior, autonomous anomaly detection systems are rarely used in the real world, for two main reasons: (1) the detection of anomalous behavior is normally not a well-defined and structured problem, so automatic data mining approaches do not work well, and (2) such systems have difficulty representing and employing the prior knowledge that users bring to their tasks. To overcome these limitations, we believe that human involvement in the entire discovery process is crucial. Using a visual analytics process model as a framework, we present VISAD: an interactive, visual knowledge discovery tool for supporting the detection and identification of anomalous behavior in maritime traffic data. VISAD supports the insertion of human expert knowledge in (1) the preparation of the system, (2) the establishment of the normal picture and (3) the actual detection of rare events. For each of these three modules, VISAD implements different layers of data mining, visualization and interaction techniques. Thus, the detection procedure becomes transparent to the user, which increases his/her confidence and trust in the system and, overall, in the whole discovery process.
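
    The abstract does not detail VISAD's mining layers, but its two named steps, establishing a "normal picture" and then detecting rare events, can be illustrated with a robust-outlier sketch. The speed feature, the MAD-based normalcy model and the 3.5 threshold below are illustrative assumptions, not the tool's actual method.

        import numpy as np

        def fit_normal_picture(speeds):
            # Summarize "normal" behavior as a robust center and spread.
            speeds = np.asarray(speeds, dtype=float)
            med = np.median(speeds)
            mad = np.median(np.abs(speeds - med)) * 1.4826  # robust sigma estimate
            return med, mad

        def flag_anomalies(speeds, med, mad, threshold=3.5):
            # Flag observations whose robust z-score exceeds the threshold.
            z = np.abs(np.asarray(speeds, dtype=float) - med) / mad
            return z > threshold

        # Synthetic vessel speeds (knots) with one implausible outlier.
        speeds = [11.8, 12.1, 12.4, 11.9, 12.0, 38.0]
        med, mad = fit_normal_picture(speeds[:-1])
        print(flag_anomalies(speeds, med, mad))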

  7. Soft x-ray microscopy - a powerful analytical tool to image magnetism down to fundamental length and times scales

    SciTech Connect

    Fischer, Peter

    2008-08-01

    The magnetic properties of low-dimensional solid state matter are of the utmost interest, both scientifically and technologically. In addition to the charge of the electron, which is the basis of current electronics, taking the spin degree of freedom into account opens a new avenue for future spintronics applications. Progress towards a better physical understanding of the mechanisms and principles involved, as well as potential applications of nanomagnetic devices, can only be achieved with advanced analytical tools. Soft X-ray microscopy, providing a spatial resolution towards 10 nm, a time resolution currently in the sub-ns regime, and inherent elemental sensitivity, is a very promising technique for that purpose. This article reviews recent achievements of magnetic soft X-ray microscopy through selected examples of spin torque phenomena, stochastic behavior on the nanoscale, and spin dynamics in magnetic nanopatterns. The future potential with regard to addressing fundamental magnetic length and time scales, e.g. imaging femtosecond spin dynamics at upcoming X-ray sources, is pointed out.

  8. Episcopic 3D Imaging Methods: Tools for Researching Gene Function

    PubMed Central

    Weninger, Wolfgang J; Geyer, Stefan H

    2008-01-01

    This work aims at describing episcopic 3D imaging methods and at discussing how these methods can contribute to researching the genetic mechanisms driving embryogenesis and tissue remodelling, and the genesis of pathologies. Several episcopic 3D imaging methods exist. The most advanced are capable of generating high-resolution volume data (voxel sizes from 0.5x0.5x1 µm upwards) of small to large embryos of model organisms and tissue samples. Besides anatomy and tissue architecture, gene expression and gene product patterns can be analyzed three dimensionally in their precise anatomical and histological context with the aid of whole mount in situ hybridization or whole mount immunohistochemical staining techniques. Episcopic 3D imaging techniques have been and are employed for analyzing the precise morphological phenotype of experimentally malformed, randomly produced, or genetically engineered embryos of biomedical model organisms. It has been shown that episcopic 3D imaging is also suited to describing the spatial distribution of genes and gene products during embryogenesis, and that it can be used for analyzing tissue samples of adult model animals and humans. The latter offers the possibility of using episcopic 3D imaging techniques for researching the causality and treatment of pathologies or for staging cancer. Such applications, however, are not yet routine, and currently only preliminary results are available. We conclude that, although episcopic 3D imaging is in its very beginnings, it represents an upcoming methodology which will soon become an indispensable tool for researching the genetic regulation of embryo development as well as the genesis of malformations and diseases. PMID:19452045

  9. Enabling laboratory EUV research with a compact exposure tool

    NASA Astrophysics Data System (ADS)

    Brose, Sascha; Danylyuk, Serhiy; Tempeler, Jenny; Kim, Hyun-su; Loosen, Peter; Juschkin, Larissa

    2016-03-01

    In this work we present the capabilities of the extreme ultraviolet laboratory exposure tool (EUVLET) designed and realized at RWTH Aachen University, Chair for the Technology of Optical Systems (TOS), in cooperation with the Fraunhofer Institute for Laser Technology (ILT) and Bruker ASC GmbH. The main purpose of this laboratory setup is direct application in research facilities and companies with small batch production, where the fabrication of high-resolution periodic arrays over large areas is required. The setup can also be utilized for resist characterization and evaluation of pre- and post-exposure processing. The tool follows the Talbot lithography approach: it utilizes a partially coherent discharge-produced plasma (DPP) source and minimizes the number of other critical components to a transmission grating, the photoresist-coated wafer, and the positioning system for wafer and grating. To identify the limits of this approach, each component is first analyzed and optimized separately, and the relations between components are identified. The EUV source has been optimized to achieve the best values for spatial and temporal coherence. Phase-shifting and amplitude transmission gratings have been fabricated and exposed. Several commercially available electron beam resists and one EUV resist have been characterized by open frame exposures to determine their contrast under EUV radiation. A cold development procedure has been performed to further increase the resist contrast. Analysis of the exposure results demonstrates that only a 1:1 copy of the mask structure can be fully resolved with amplitude masks, whereas the phase-shift masks offer higher first-order diffraction efficiency and allow a demagnification of the mask structure in the achromatic Talbot plane.

  10. Haystack, a web-based tool for metabolomics research

    PubMed Central

    2014-01-01

    Background Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. Results To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown at two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Conclusion Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data is described. It offers users a range of data visualization
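
    Haystack's core idea, collapsing the mass dimension into fixed-width interval variables and then modeling class assignment by PCA, can be sketched briefly. This is a minimal illustration on synthetic spectra with an assumed 1 m/z bin width, not Haystack's actual implementation.

        import numpy as np
        from sklearn.decomposition import PCA

        def bin_masses(mz, intensity, width=1.0, lo=100.0, hi=1000.0):
            # Sum intensities into fixed-width m/z bins; each bin becomes one
            # interval variable describing the sample.
            edges = np.arange(lo, hi + width, width)
            binned, _ = np.histogram(mz, bins=edges, weights=intensity)
            return binned

        # Synthetic stand-ins for per-sample (m/z, intensity) arrays.
        rng = np.random.default_rng(0)
        spectra = [(rng.uniform(100, 1000, 500), rng.exponential(1.0, 500))
                   for _ in range(6)]
        X = np.vstack([bin_masses(mz, inten) for mz, inten in spectra])

        # Exploratory analysis: project the binned samples onto two principal components.
        scores = PCA(n_components=2).fit_transform(X)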

  11. Extensions of the Johnson-Neyman Technique to Linear Models With Curvilinear Effects: Derivations and Analytical Tools.

    PubMed

    Miller, Jason W; Stromeyer, William R; Schwieterman, Matthew A

    2013-03-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way interactions in several types of linear models, this method has not been extended to include quadratic terms or more complicated models involving quadratic terms and interactions. Curvilinear relations of this type are incorporated in several theories in the social sciences. This article extends the J-N method to such linear models along with presenting freely available online tools that implement this technique as well as the traditional pick-a-point approach. Algebraic and graphical representations of the proposed J-N extension are provided. An example is presented to illustrate the use of these tools and the interpretation of findings. Issues of reliability as well as "spurious moderator" effects are discussed along with recommendations for future research.
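
    For the basic two-way interaction model y = b0 + b1*x + b2*z + b3*x*z, the J-N boundaries are the moderator values z at which the simple slope b1 + b3*z is exactly significant, i.e. where (b1 + b3*z)^2 = t_crit^2 * Var(b1 + b3*z). The sketch below solves that quadratic; it covers only the classical case, not the quadratic-term extensions derived in the article, and assumes the coefficients and covariance entries come from a previously fitted model.

        import numpy as np
        from scipy import stats

        def jn_boundaries(b1, b3, v11, v13, v33, df, alpha=0.05):
            # Solve (b1 + b3*z)^2 = t^2 * (v11 + 2*z*v13 + z^2*v33) for z, where
            # v11, v13, v33 are entries of the coefficient covariance matrix.
            t2 = stats.t.ppf(1 - alpha / 2, df) ** 2
            a = b3 ** 2 - t2 * v33
            b = 2 * (b1 * b3 - t2 * v13)
            c = b1 ** 2 - t2 * v11
            disc = b ** 2 - 4 * a * c
            if disc < 0:
                return None  # the slope's significance never switches over z
            return tuple(np.sort(np.roots([a, b, c]).real))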

  12. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    ERIC Educational Resources Information Center

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  13. Feasibility model of a high reliability five-year tape transport. Volume 3: Appendices. [detailed drawing and analytical tools used in analyses

    NASA Technical Reports Server (NTRS)

    Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.

    1973-01-01

    Detailed drawings of the five year tape transport are presented. Analytical tools used in the various analyses are described. These analyses include: tape guidance, tape stress over crowned rollers, tape pack stress program, response (computer) program, and control system electronics description.

  14. Saturation Transfer Difference NMR as an Analytical Tool for Detection and Differentiation of Plastic Explosives on the Basis of Minor Plasticizer Composition

    DTIC Science & Technology

    2015-05-01

    Report ECBC-TR-1291: Saturation Transfer Difference NMR as an Analytical Tool for Detection and Differentiation of Plastic Explosives on the Basis of Minor Plasticizer Composition. Rossitza K. Gitti (Leidos, Inc., Gunpowder, MD) and Stanley A. Ostazeski.

  15. Research subjects for analytical estimation of core degradation at Fukushima-Daiichi nuclear power plant

    SciTech Connect

    Nagase, F.; Ishikawa, J.; Kurata, M.; Yoshida, H.; Kaji, Y.; Shibamoto, Y.; Amaya, M; Okumura, K.; Katsuyama, J.

    2013-07-01

    Estimation of the accident progression and of the status inside the reactor pressure vessels (RPV) and primary containment vessels (PCV) is required for the appropriate conduct of decommissioning at the Fukushima-Daiichi NPP. This requires additional experimental data and revised models so that computer-code estimations can be made with increased accuracy. The Japan Atomic Energy Agency (JAEA) has selected phenomena to be reviewed and developed, considering previously obtained information, conditions specific to the Fukushima-Daiichi NPP accident, and recent progress in experimental and analytical technologies. As a result, research and development items have been identified in terms of thermal-hydraulic behavior in the RPV and PCV, progression of fuel bundle degradation, failure of the RPV lower head, and analysis of the accident. This paper introduces the selected phenomena, research plans, and recent results from JAEA's corresponding research programs. (authors)

  16. Role of nuclear analytical probe techniques in biological trace element research

    SciTech Connect

    Jones, K.W.; Pounds, J.G.

    1985-01-01

    Many biomedical experiments require the qualitative and quantitative localization of trace elements with high sensitivity and good spatial resolution. The feasibility of measuring the chemical form of the elements, the time course of trace element metabolism, and of conducting experiments in living biological systems are also important requirements for biological trace element research. Nuclear analytical techniques that employ ion or photon beams have grown in importance in the past decade and have led to several new experimental approaches. Some of the important features of these methods are reviewed here along with their role in trace element research, and examples of their use are given to illustrate the potential for new research directions. It is emphasized that the effective application of these methods necessitates a closely integrated multidisciplinary scientific team. 21 refs., 4 figs., 1 tab.

  17. The MOOC and Learning Analytics Innovation Cycle (MOLAC): A Reflective Summary of Ongoing Research and Its Challenges

    ERIC Educational Resources Information Center

    Drachsler, H.; Kalz, M.

    2016-01-01

    The article deals with the interplay between learning analytics and massive open online courses (MOOCs) and provides a conceptual framework to situate ongoing research in the MOOC and learning analytics innovation cycle (MOLAC framework). The MOLAC framework is organized on three levels: On the micro-level, the data collection and analytics…

  19. SmartR: an open-source platform for interactive visual analytics for translational research data.

    PubMed

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types, such as clinical, pre-clinical or OMICS data, combined with strong visual analytical capabilities, will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that equips the platform with several dynamic visual analytical workflows and also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js and AngularJS were used to build a set of standard visualizations that were heavily enhanced with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR . Contact: reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online.

  20. Research progress of pharmacological activities and analytical methods for plant origin proteins.

    PubMed

    Li, Chun-hong; Chen, Cen; Xia, Zhi-ning; Yang, Feng-qing

    2015-07-01

    As one of the important active components of traditional Chinese medicine (TCM), plant-origin active proteins have many significant pharmacological functions. Based on research into plant-origin active proteins reported in recent years, their pharmacological effects, including anti-tumor, immune-regulatory, anti-oxidant, anti-pathogenic-microorganism, anti-thrombotic, as well as hypolipidemic and hypoglycemic activities, are reviewed. In addition, the analytical methods for plant-origin proteins, including chromatography, spectroscopy, electrophoresis and mass spectrometry, are summarized. The main purpose of this paper is to provide a reference for the future development and application of plant active proteins.

  1. Giving raw data a chance to talk: a demonstration of exploratory visual analytics with a pediatric research database using Microsoft Live Labs Pivot to promote cohort discovery, research, and quality assessment.

    PubMed

    Viangteeravat, Teeradache; Nagisetty, Naga Satya V Rao

    2014-01-01

    Secondary use of large and open data sets provides researchers with an opportunity to address high-impact questions that would otherwise be prohibitively expensive and time consuming to study. Despite the availability of data, generating hypotheses from huge data sets is often challenging, and the lack of complex analysis of data might lead to weak hypotheses. To overcome these issues and to assist researchers in building hypotheses from raw data, we are working on a visual and analytical platform called PRD Pivot. PRD Pivot is a de-identified pediatric research database designed to make secondary use of rich data sources, such as the electronic health record (EHR). The development of visual analytics using Microsoft Live Labs Pivot makes the process of data elaboration, information gathering, knowledge generation, and complex information exploration transparent to tool users and provides researchers with the ability to sort and filter by various criteria, which can lead to strong, novel hypotheses.

  2. Giving Raw Data a Chance to Talk: A Demonstration of Exploratory Visual Analytics with a Pediatric Research Database Using Microsoft Live Labs Pivot to Promote Cohort Discovery, Research, and Quality Assessment

    PubMed Central

    Viangteeravat, Teeradache; Nagisetty, Naga Satya V. Rao

    2014-01-01

    Secondary use of large and open data sets provides researchers with an opportunity to address high-impact questions that would otherwise be prohibitively expensive and time consuming to study. Despite the availability of data, generating hypotheses from huge data sets is often challenging, and the lack of complex analysis of data might lead to weak hypotheses. To overcome these issues and to assist researchers in building hypotheses from raw data, we are working on a visual and analytical platform called PRD Pivot. PRD Pivot is a de-identified pediatric research database designed to make secondary use of rich data sources, such as the electronic health record (EHR). The development of visual analytics using Microsoft Live Labs Pivot makes the process of data elaboration, information gathering, knowledge generation, and complex information exploration transparent to tool users and provides researchers with the ability to sort and filter by various criteria, which can lead to strong, novel hypotheses. PMID:24808811

  3. IT Tools for Teachers and Scientists, Created by Undergraduate Researchers

    NASA Astrophysics Data System (ADS)

    Millar, A. Z.; Perry, S.

    2007-12-01

    Interns in the Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) program conduct computer science research for the benefit of earthquake scientists and have created products in growing use within the SCEC education and research communities. SCEC/UseIT comprises some twenty undergraduates who combine their varied talents and academic backgrounds to achieve a Grand Challenge that is formulated around the needs of SCEC scientists and educators and that reflects the value SCEC places on the integration of computer science and the geosciences. In meeting the challenge, students learn to work on multidisciplinary teams and to tackle complex problems with no guaranteed solutions. Meanwhile, their efforts bring fresh perspectives and insight to the professionals with whom they collaborate, and consistently produce innovative, useful tools for research and education. The 2007 Grand Challenge was to design and prototype serious games to communicate important earthquake science concepts. Interns organized themselves into four game teams, the Educational Game, the Training Game, the Mitigation Game and the Decision-Making Game, and created four diverse games with topics ranging from elementary plate tectonics to earthquake risk mitigation, and with intended players ranging from elementary students to city planners. The games were designed to be versatile, to accommodate variation in the knowledge base of the player, and extensible, to accommodate future additions. The games are played in a web browser or from within SCEC-VDO (Virtual Display of Objects). SCEC-VDO, also engineered by UseIT interns, is a 4D, interactive visualization software that enables integration and exploration of datasets and models such as faults, earthquake hypocenters and ruptures, digital elevation models, satellite imagery, global isochrons, and earthquake prediction schemes. SCEC-VDO enables the user to create animated movies during a session, and is now part

  4. Advanced Methods in Meta-Analytic Research: Applications and Implications for Rehabilitation Counseling Research

    ERIC Educational Resources Information Center

    Rosenthal, David A.; Hoyt, William T.; Ferrin, James M.; Miller, Susan; Cohen, Nicholas D.

    2006-01-01

    Over the past 25 years, meta-analysis has assumed a significant role in the synthesis of counseling and psychotherapy research through the evaluation and interpretation of the results of multiple studies. An examination of four widely recognized rehabilitation counseling journals, however, reveals that only one meta-analysis (Bolton & Akridge,…

  5. Researcher Effects on Mortality Salience Research: A Meta-Analytic Moderator Analysis

    ERIC Educational Resources Information Center

    Yen, Chih-Long; Cheng, Chung-Ping

    2013-01-01

    A recent meta-analysis of 164 terror management theory (TMT) papers indicated that mortality salience (MS) yields substantial effects (r = 0.35) on worldview and self-esteem-related dependent variables (B. L. Burke, A. Martens, & E. H. Faucher, 2010). This study reanalyzed the data to explore the researcher effects of TMT. By cluster-analyzing…

  7. Bioanalysis and Analytical Services Research Group at The Municipal Institute for Medical Research IMIM-Hospital del Mar, Spain.

    PubMed

    Segura, Jordi; Pascual, José A; Ventura, Rosa; Gutiérrez-Gallego, Ricardo

    2009-11-01

    Analytical laboratories involved in health-related research are becoming a fundamental part of the advancement of science in this field. Of particular interest to clinical, legal, toxicological, forensic and environmental matters is the analysis of drugs and medications present in the biological fluids of consumers or exposed subjects. The established sensitive and reliable work of sports drug-testing laboratories represents an interesting example of a multidisciplinary approach toward widespread bioanalytical problems. The experiences reported in this article will be of general interest, especially for analysts studying the determination of substances in biological material.

  8. The capsicum transcriptome DB: a "hot" tool for genomic research.

    PubMed

    Góngora-Castillo, Elsa; Fajardo-Jaime, Rubén; Fernández-Cortes, Araceli; Jofre-Garfias, Alba E; Lozoya-Gloria, Edmundo; Martínez, Octavio; Ochoa-Alejo, Neftalí; Rivera-Bustamante, Rafael

    2012-01-01

    Chili pepper (Capsicum annuum) is an economically important crop with no publicly available genome sequence. We describe a genomic resource to facilitate Capsicum annuum research. A collection of Expressed Sequence Tags (ESTs) derived from five C. annuum organs (root, stem, leaf, flower and fruit) was sequenced using the Sanger method, and multiple leaf transcriptomes were deeply sampled using GS-pyrosequencing. A hybrid assembly of 1,324,516 raw reads yielded 32,314 high quality contigs, as validated by coverage and identity analysis with existing pepper sequences. Overall, 75.5% of the contigs had significant sequence similarity to entries in nucleic acid and protein databases; 23% of the sequences have not been previously reported for C. annuum and expand the sequence resources for this species. A MySQL database and a user-friendly Web interface were constructed with search tools that permit queries of the ESTs including sequence, functional annotation, Gene Ontology classification, metabolic pathways, and assembly information. The Capsicum Transcriptome DB is freely available from http://www.bioingenios.ira.cinvestav.mx:81/Joomla/

  9. Microgravity as a research tool to improve US agriculture

    NASA Astrophysics Data System (ADS)

    Bula, R. J.; Stankovic, Bratislav

    2000-01-01

    Crop production and utilization are undergoing significant modifications and improvements that emanate from adaptation of recently developed plant biotechnologies. Several innovative technologies will impact US agriculture in the next century. One of these is the transfer of desirable genes from organisms to economically important crop species in a way that cannot be accomplished with traditional plant breeding techniques. Such plant genetic engineering offers opportunities to improve crop species for a number of characteristics as well as use as source materials for specific medical and industrial applications. Although plant genetic engineering is having an impact on development of new crop cultivars, several major constraints limit the application of this technology to selected crop species and genotypes. Consequently, gene transfer systems that overcome these constraints would greatly enhance development of new crop materials. If results of a recent gene transfer experiment conducted in microgravity during a Space Shuttle mission are confirmed, and with the availability of the International Space Station as a permanent space facility, commercial plant transformation activity in microgravity could become a new research tool to improve US agriculture.

  10. Databases and registers: useful tools for research, no studies.

    PubMed

    Curbelo, Rafael J; Loza, Estíbaliz; de Yébenes, Maria Jesús García; Carmona, Loreto

    2014-04-01

    There are many misunderstandings about databases. Database is a commonly misused term in reference to any set of data entered into a computer. However, true databases serve a main purpose, organising data. They do so by establishing several layers of relationships; databases are hierarchical. Databases commonly organise data over different levels and over time, where time can be measured as the time between visits, or between treatments, or adverse events, etc. In this sense, medical databases are closely related to longitudinal observational studies, as databases allow the introduction of data on the same patient over time. Basically, we could establish four types of databases in medicine, depending on their purpose: (1) administrative databases, (2) clinical databases, (3) registers, and (4) study-oriented databases. But a database is a useful tool for a large variety of studies, not a type of study itself. Different types of databases serve very different purposes, and a clear understanding of the different research designs mentioned in this paper would prevent many of the databases we launch from being just a lot of work and very little science.

  11. Towards a 'One Health' research and application tool box.

    PubMed

    Zinsstag, Jakob; Schelling, Esther; Bonfoh, Bassirou; Fooks, Anthony R; Kasymbekov, Joldoshbek; Waltner-Toews, David; Tanner, Marcel

    2009-01-01

    The 'One Medicine' concept of Calvin Schwabe has seen an unprecedented revival in the last decade and has evolved towards 'One Health' conceptual thinking, emphasising epidemiology and public health. Pathologists rightly recall the contribution of their discipline, e.g. the close genomic relationship of animals and humans in cancer genetics. We need to change our 'us versus them' perspective towards a perspective of 'shared risk' between humans and animals. Professional organisations have declared their adhesion, governments have created joint public and animal health working groups, and numerous research and surveillance programmes have been incepted, as demonstrated on the 'One Health Initiative' website. Despite all these beneficial developments, we should not forget, however, that there remains a huge divide between human and veterinary medicine, borne of unprecedented (over)specialisation of disciplines and increasingly reductionist approaches to scientific inquiry. What is required now is a radical paradigm shift in our approach to global public health, with practical approaches and 'hands-on' examples to facilitate its application and accelerate the necessary leverage of 'One Health'. We propose elements of an open 'tool box' translating the 'One Health' concept into practical methods in the fields of integrated disease surveillance, joint animal-human epidemiological studies and health services development, which we hope might serve as a basis for discussion of mutually agreed practical cooperation between human and animal health, with special emphasis on developing countries.

  12. Genetic research in schizophrenia: new tools and future perspectives.

    PubMed

    Bertram, Lars

    2008-09-01

    Genetically, schizophrenia is a complex disease whose pathogenesis is likely governed by a number of different risk factors. While substantial efforts have been made to identify the underlying susceptibility alleles over the past 2 decades, they have been of only limited success. Each year, the field is enriched with nearly 150 additional genetic association studies, each of which either proposes or refutes the existence of certain schizophrenia genes. To facilitate the evaluation and interpretation of these findings, we have recently created a database for genetic association studies in schizophrenia ("SzGene"; available at http://www.szgene.org). In addition to systematically screening the scientific literature for eligible studies, SzGene also reports the results of allele-based meta-analyses for polymorphisms with sufficient genotype data. Currently, these meta-analyses highlight not only over 20 different potential schizophrenia genes, many of which represent the "usual suspects" (eg, various dopamine receptors and neuregulin 1), but also several that were never meta-analyzed previously. All the highlighted loci contain at least one variant showing modest (summary odds ratios approximately 1.20 [range 1.06-1.45]) but nominally significant risk effects. This review discusses some of the strengths and limitations of the SzGene database, which could become a useful bioinformatics tool within the schizophrenia research community.
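
    The abstract describes SzGene's allele-based meta-analyses only at a high level. One standard way to pool per-study odds ratios is inverse-variance weighting on the log scale, with standard errors recovered from the 95% confidence intervals; the sketch below shows that generic procedure, not SzGene's documented pipeline, and the input values are hypothetical.

        import numpy as np

        def pooled_odds_ratio(ors, ci_low, ci_high):
            # Fixed-effect inverse-variance pooling on the log-OR scale.
            log_or = np.log(ors)
            se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # from 95% CI width
            w = 1.0 / se ** 2
            pooled = np.sum(w * log_or) / np.sum(w)
            pooled_se = np.sqrt(1.0 / np.sum(w))
            lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
            return np.exp(pooled), (np.exp(lo), np.exp(hi))

        # Hypothetical per-study ORs and 95% CIs for a single polymorphism.
        summary, ci = pooled_odds_ratio(np.array([1.25, 1.10, 1.35]),
                                        np.array([1.02, 0.90, 1.05]),
                                        np.array([1.53, 1.34, 1.74]))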

  13. Applied analytical combustion/emissions research at the NASA Lewis Research Center

    NASA Astrophysics Data System (ADS)

    Deur, J. M.; Kundu, K. P.; Nguyen, H. L.

    1992-07-01

    Emissions of pollutants from future commercial transports are a significant concern. As a result, the Lewis Research Center (LeRC) is investigating various low emission combustor technologies. As part of this effort, a combustor analysis code development program was pursued to guide the combustor design process, to identify concepts having the greatest promise, and to optimize them at the lowest cost in the minimum time.

  14. Applied analytical combustion/emissions research at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Deur, J. M.; Kundu, K. P.; Nguyen, H. L.

    1992-01-01

    Emissions of pollutants from future commercial transports are a significant concern. As a result, the Lewis Research Center (LeRC) is investigating various low emission combustor technologies. As part of this effort, a combustor analysis code development program was pursued to guide the combustor design process, to identify concepts having the greatest promise, and to optimize them at the lowest cost in the minimum time.

  15. Data-infilling in daily mean river flow records: first results using a visual analytics tool (gapIT)

    NASA Astrophysics Data System (ADS)

    Giustarini, Laura; Parisot, Olivier; Ghoniem, Mohammad; Trebs, Ivonne; Médoc, Nicolas; Faber, Olivier; Hostache, Renaud; Matgen, Patrick; Otjacques, Benoît

    2015-04-01

    Missing data in river flow records represent a loss of information and a serious drawback in water management. An incomplete time series prevents the computation of hydrological statistics and indicators. Also, records with data gaps are not suitable as input or validation data for hydrological or hydrodynamic modelling. In this work we present a visual analytics tool (gapIT), which supports experts in finding the most adequate data-infilling technique for daily mean river flow records. The tool performs an automated calculation of river flow estimates using different data-infilling techniques. Donor station(s) are automatically selected based on Dynamic Time Warping, geographical proximity and upstream/downstream relationships. For each gap the tool computes several flow estimates through various data-infilling techniques, including interpolation, multiple regression, regression trees and neural networks. The visual application also allows the user to select donor station(s) other than those selected automatically. The gapIT software was applied to 24 daily time series of river discharge recorded in Luxembourg over the period 01/01/2007 - 31/12/2013. The method was validated by randomly creating artificial gaps of different lengths and positions along the entire records. Using the RMSE and the Nash-Sutcliffe (NS) coefficient as performance measures, the method was evaluated based on a comparison with the actual measured discharge values. The application of the gapIT software to artificial gaps led to satisfactory results in terms of performance indicators (NS>0.8 for more than half of the artificial gaps). A case-by-case analysis revealed that the limited number of reconstructed gaps characterized by high RMSE values (NS<0.8) was caused by the temporary unavailability of the most appropriate donor station. On the other hand, some of the gaps characterized by a high accuracy of the reconstructed record were filled by using the data from
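
    The Nash-Sutcliffe coefficient used as a validation measure above is simple to compute; the following sketch implements the standard formula (it is not code from gapIT).

        import numpy as np

        def nash_sutcliffe(observed, simulated):
            # NS = 1 - SSE / total variance of the observations. NS = 1 means a
            # perfect reconstruction; NS <= 0 means no better than the observed mean.
            obs = np.asarray(observed, dtype=float)
            sim = np.asarray(simulated, dtype=float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)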

  16. Tracing the sources of refractory dissolved organic matter in a large artificial lake using multiple analytical tools.

    PubMed

    Nguyen, Hang Vo-Minh; Hur, Jin

    2011-10-01

    Structural and chemical characteristics of refractory dissolved organic matter (RDOM) from seven different sources (algae, leaf litter, reed, compost, field soil, paddy water, treated sewage) were examined using multiple analytical tools, and they were compared with those of RDOM in a large artificial lake (Lake Paldang, Korea). Treated sewage, paddy water, and field soil were distinguished from the other sources investigated by their relatively low specific UV absorbance (SUVA) values and more pronounced fulvic-like versus humic-like fluorescence of the RDOM samples. Microbial derived RDOM from algae and treated sewage showed relatively low apparent molecular weight and a higher fraction of hydrophilic bases relative to the total hydrophilic fraction. For the biopolymer types, the presence of polyhydroxy aromatics with the high abundance of proteins was observed only for vascular plant-based RDOM (i.e., leaf litter and reed). Molecular weight values exhibited positive correlations with the SUVA and the hydrophobic content among the different RDOM, suggesting that hydrophobic and condensed aromatic structures may be the main components of high molecular weight RDOM. Principal component analysis revealed that approximately 77% of the variance in the RDOM characteristics might be explained by the source difference (i.e., terrestrial and microbial derived) and a tendency of further microbial transformation. Combined results demonstrated that the properties of the lake RDOM were largely affected by the upstream sources of field soil, paddy water, and treated sewage, which are characterized by low molecular weight UV-absorbing and non-aromatic structures with relatively high resistance to further degradation.

  17. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    NASA Astrophysics Data System (ADS)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action - to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how the teaching of climate change can be done in such a way as to ascribe agency - a willingness to act - to students. Framing - as both a theory and an analytic method - has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it could be problematic if this were the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse - global scale, scientific statistics and facts, and impact on the Earth's systems - are not likely to inspire action-taking. This

  18. Analytical and physical modeling program for the NASA Lewis Research Center's Altitude Wind Tunnel (AWT)

    NASA Technical Reports Server (NTRS)

    Abbott, J. M.; Deidrich, J. H.; Groeneweg, J. F.; Povinelli, L. A.; Reid, L.; Reinmann, J. J.; Szuch, J. R.

    1985-01-01

    An effort is currently underway at the NASA Lewis Research Center to rehabilitate and extend the capabilities of the Altitude Wind Tunnel (AWT). This extended capability will include a maximum test section Mach number of about 0.9 at an altitude of 55,000 ft and a -20 F stagnation temperature (octagonal test section, 20 ft across the flats). In addition, the AWT will include an icing and acoustic research capability. In order to ensure a technically sound design, an AWT modeling program (both analytical and physical) was initiated to provide essential input to the AWT final design process. This paper describes the modeling program, including the rationale and criteria used in program definition, and presents some early program results.

  19. A collaborative approach to develop a multi-omics data analytics platform for translational research.

    PubMed

    Schumacher, Axel; Rujan, Tamas; Hoefkens, Jens

    2014-12-01

    The integration and analysis of large datasets in translational research has become an increasingly challenging problem. We propose a collaborative approach that integrates established data management platforms with existing analytical systems to fill the gap in the value chain between data collection and data exploitation. Our proposal in particular ensures data security and provides support for widely distributed teams of researchers. As a successful example of such an approach, we describe the implementation of a unified single platform that combines the capabilities of the knowledge management platform tranSMART and the data analysis system Genedata Analyst™. The combined end-to-end platform helps to quickly find, enter, integrate, analyze, extract, and share patient- and drug-related data in the context of translational R&D projects.

  20. Big data, advanced analytics and the future of comparative effectiveness research.

    PubMed

    Berger, Marc L; Doban, Vitalii

    2014-03-01

    The intense competition that accompanied the growth of internet-based companies ushered in the era of 'big data' characterized by major innovations in processing of very large amounts of data and the application of advanced analytics including data mining and machine learning. Healthcare is on the cusp of its own era of big data, catalyzed by the changing regulatory and competitive environments, fueled by growing adoption of electronic health records, as well as efforts to integrate medical claims, electronic health records and other novel data sources. Applying the lessons from big data pioneers will require healthcare and life science organizations to make investments in new hardware and software, as well as in individuals with different skills. For life science companies, this will impact the entire pharmaceutical value chain from early research to postcommercialization support. More generally, this will revolutionize comparative effectiveness research.

  1. Concept Maps as a Research and Evaluation Tool To Assess Conceptual Change in Quantum Physics.

    ERIC Educational Resources Information Center

    Sen, Ahmet Ilhan

    2002-01-01

    Informs teachers about using concept maps as learning tools and alternative assessment tools in education. Presents research results on how students might use concept maps to communicate their cognitive structure. (Author/KHR)

  2. Conducting qualitative research in the British Armed Forces: theoretical, analytical and ethical implications.

    PubMed

    Finnegan, Alan

    2014-06-01

    The aim of qualitative research is to produce empirical evidence with data collected through means such as interviews and observation. Qualitative research encourages diversity in the way of thinking and the methods used. Good studies produce a richness of data to provide new knowledge or address extant problems. However, qualitative research resulting in peer review publications within the Defence Medical Services (DMS) is a rarity. This article aims to help redress this balance by offering direction regarding qualitative research in the DMS with a focus on choosing a theoretical framework, analysing the data and ethical approval. Qualitative researchers need an understanding of the paradigms and theories that underpin methodological frameworks, and this article includes an overview of common theories in phenomenology, ethnography and grounded theory, and their application within the military. It explains qualitative coding: the process used to analyse data and shape the analytical framework. A popular four phase approach with examples from an operational nursing research study is presented. Finally, it tackles the issue of ethical approval for qualitative studies and offers direction regarding the research proposal and participant consent. The few qualitative research studies undertaken in the DMS have offered innovative insights into defence healthcare providing information to inform and change educational programmes and clinical practice. This article provides an extra resource for clinicians to encourage studies that will improve the operational capability of the British Armed Forces. It is anticipated that these guidelines are transferable to research in other Armed Forces and the military Veterans population.

  3. Analytical combustion/emissions research related to the NASA high-speed research program

    NASA Technical Reports Server (NTRS)

    Nguyen, H. Lee

    1991-01-01

    Increasing the pressure and temperature of the engines of new generation supersonic airliners increases the emissions of nitrogen oxides to a level that would have an adverse impact on the Earth's protective ozone layer. In the process of implementing low emissions combustor technologies, NASA Lewis Research Center has pursued a combustion analysis program to guide combustor design processes, to identify potential concepts of greatest promise, and to optimize them at low cost, with short turn-around time. The approach is to upgrade and apply advanced computer programs for gas turbine applications. Efforts have been made to improve the code capabilities of modeling the physics. Test cases and experiments are used for code validation. To provide insight into the combustion process and combustor design, two-dimensional and three-dimensional codes such as KIVA-II and LeRC 3D have been used. These codes are operational and calculations have been performed to guide low emissions combustion experiments.

  4. Dynamic Visual Acuity: a Functionally Relevant Research Tool

    NASA Technical Reports Server (NTRS)

    Peters, Brian T.; Brady, Rachel A.; Miller, Chris A.; Mulavara, Ajitkumar P.; Wood, Scott J.; Cohen, Helen S.; Bloomberg, Jacob J.

    2010-01-01

    Coordinated movements between the eyes and head are required to maintain a stable retinal image during head and body motion. The vestibulo-ocular reflex (VOR) plays a significant role in this gaze control system, which functions well for most daily activities. However, certain environmental conditions or interruptions in normal VOR function can lead to inadequate ocular compensation, resulting in oscillopsia, or blurred vision. It is therefore possible to use acuity to determine when the environmental conditions, VOR function, or the combination of the two is not conducive to maintaining clear vision. Over several years we have designed and evaluated several tests of dynamic visual acuity (DVA). Early tests used the difference between standing and walking acuity to assess decrements in the gaze stabilization system after spaceflight. Supporting ground-based studies measured the responses of patients with bilateral vestibular dysfunction and explored the effects of visual target viewing distance and gait cycle events on walking acuity. Results from these studies show that DVA is affected by spaceflight, is degraded in patients with vestibular dysfunction, changes with target distance, and is not consistent across the gait cycle. We have recently expanded our research to include studies in which seated subjects are translated or rotated passively. Preliminary results from this work indicate that gaze stabilization ability may differ between similar active and passive conditions, may change with age, and can be affected by the location of the visual target with respect to the axis of motion. Use of DVA as a diagnostic tool is becoming more popular, but the functional nature of the acuity outcome measure also makes it ideal for identifying conditions that could lead to degraded vision. Once such conditions are identified, steps can be taken to alter the problematic environments to improve the man-machine interface and optimize performance.

  5. Typology of Analytical Errors in Qualitative Educational Research: An Analysis of the 2003-2007 Education Science Dissertations in Turkey

    ERIC Educational Resources Information Center

    Karadag, Engin

    2010-01-01

    In this research, the quality of the qualitative research designs used and the analytic mistakes made in doctoral dissertations in the field of education science in Turkey were examined. A case study design was applied, in which qualitative research techniques were used. The universe…

  7. PARAMO: A Parallel Predictive Modeling Platform for Healthcare Analytic Research using Electronic Health Records

    PubMed Central

    Ng, Kenney; Ghoting, Amol; Steinhubl, Steven R.; Stewart, Walter F.; Malin, Bradley; Sun, Jimeng

    2014-01-01

    Objective: Healthcare analytics research increasingly involves the construction of predictive models for disease targets across varying patient cohorts using electronic health records (EHRs). To facilitate this process, it is critical to support a pipeline of tasks: 1) cohort construction, 2) feature construction, 3) cross-validation, 4) feature selection, and 5) classification. To develop an appropriate model, it is necessary to compare and refine models derived from a diversity of cohorts, patient-specific features, and statistical frameworks. The goal of this work is to develop and evaluate a predictive modeling platform that can be used to simplify and expedite this process for health data. Methods: To support this goal, we developed a PARAllel predictive MOdeling (PARAMO) platform which 1) constructs a dependency graph of tasks from specifications of predictive modeling pipelines, 2) schedules the tasks in a topological ordering of the graph, and 3) executes those tasks in parallel. We implemented this platform using Map-Reduce to enable independent tasks to run in parallel in a cluster computing environment. Different task scheduling preferences are also supported. Results: We assess the performance of PARAMO on various workloads using three datasets derived from the EHR systems in place at Geisinger Health System and Vanderbilt University Medical Center and an anonymous longitudinal claims database. We demonstrate significant gains in computational efficiency against a standard approach. In particular, PARAMO can build 800 different models on a 300,000 patient data set in 3 hours in parallel compared to 9 days if running sequentially. Conclusion: This work demonstrates that an efficient parallel predictive modeling platform can be developed for EHR data. This platform can facilitate large-scale modeling endeavors and speed up the research workflow and reuse of health information. This platform is only a first step and provides the foundation for our ultimate
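
    The scheduling core the abstract describes (a dependency graph of pipeline tasks, ordered topologically, with independent tasks run in parallel) can be illustrated with a short sketch. This is not PARAMO source code: the task names are hypothetical, and a thread pool stands in for the Map-Reduce cluster; Python's standard-library graphlib supplies the topological sorter.

    from concurrent.futures import ThreadPoolExecutor
    from graphlib import TopologicalSorter  # standard library, Python 3.9+

    # Hypothetical pipeline specification: edges encode "must run after".
    # Two cohort/feature branches can proceed in parallel.
    deps = {
        "cohort_A": set(),
        "cohort_B": set(),
        "features_A": {"cohort_A"},
        "features_B": {"cohort_B"},
        "model_A": {"features_A"},
        "model_B": {"features_B"},
        "report": {"model_A", "model_B"},
    }

    def run(task):
        print("running", task)  # stand-in for submitting the real job

    ts = TopologicalSorter(deps)
    ts.prepare()
    with ThreadPoolExecutor(max_workers=4) as pool:
        while ts.is_active():
            ready = list(ts.get_ready())            # tasks whose dependencies are met
            for task, _ in zip(ready, pool.map(run, ready)):
                ts.done(task)                       # mark finished, unblock successors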

  8. PARAMO: a PARAllel predictive MOdeling platform for healthcare analytic research using electronic health records.

    PubMed

    Ng, Kenney; Ghoting, Amol; Steinhubl, Steven R; Stewart, Walter F; Malin, Bradley; Sun, Jimeng

    2014-04-01

    Healthcare analytics research increasingly involves the construction of predictive models for disease targets across varying patient cohorts using electronic health records (EHRs). To facilitate this process, it is critical to support a pipeline of tasks: (1) cohort construction, (2) feature construction, (3) cross-validation, (4) feature selection, and (5) classification. To develop an appropriate model, it is necessary to compare and refine models derived from a diversity of cohorts, patient-specific features, and statistical frameworks. The goal of this work is to develop and evaluate a predictive modeling platform that can be used to simplify and expedite this process for health data. To support this goal, we developed a PARAllel predictive MOdeling (PARAMO) platform which (1) constructs a dependency graph of tasks from specifications of predictive modeling pipelines, (2) schedules the tasks in a topological ordering of the graph, and (3) executes those tasks in parallel. We implemented this platform using Map-Reduce to enable independent tasks to run in parallel in a cluster computing environment. Different task scheduling preferences are also supported. We assess the performance of PARAMO on various workloads using three datasets derived from the EHR systems in place at Geisinger Health System and Vanderbilt University Medical Center and an anonymous longitudinal claims database. We demonstrate significant gains in computational efficiency against a standard approach. In particular, PARAMO can build 800 different models on a 300,000 patient data set in 3 hours in parallel compared to 9 days if running sequentially. This work demonstrates that an efficient parallel predictive modeling platform can be developed for EHR data. This platform can facilitate large-scale modeling endeavors and speed up the research workflow and reuse of health information. This platform is only a first step and provides the foundation for our ultimate goal of building analytic pipelines

  9. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data covering more than 250 U.S. airports across a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.

  10. Research Tools, Tips, and Resources for Financial Aid Administrators. Monograph, A NASFAA Series.

    ERIC Educational Resources Information Center

    Mohning, David D.; Redd, Kenneth E.; Simmons, Barry W., Sr.

    This monograph provides research tools, tips, and resources to financial aid administrators who need to undertake research tasks. It answers: What is research? How can financial aid administrators get started on research projects? What resources are available to help answer research questions quickly and accurately? How can research efforts assist…

  11. MEETING TODAY'S EMERGING CONTAMINANTS WITH TOMORROW'S RESEARCH TOOL

    EPA Science Inventory

    This presentation will explore the many facets of research and development for emerging contaminants within the USEPA's National Exposure Research Laboratories (Athens, Cincinnati, Las Vegas, and Research Triangle Park).

  13. Analytical performance and comparability of the determination of cholesterol by 12 Lipid-Research Clinics.

    PubMed

    Lippel, K; Ahmed, S; Albers, J J; Bachorik, P; Cooper, G; Helms, R; Williams, J

    1977-09-01

    Twelve Lipid-Research Clinic laboratories performed automated cholesterol analyses on four control-serum pools of known cholesterol concentration, using the Liebermann-Burchard reaction. The analyses were done during a two-year period, with the same standards, methodology, and quality-control procedures. Estimates of analytical bias, variability, and short- and long-term trends for each instrument and for the entire group of LRC instruments are presented. High accuracy, precision, and interlaboratory comparability were achieved through the rigorous standardization and control of the entire analytical procedure. Individual laboratory biases averaged 0.5 to 2.0% below Abell-Kendall reference values. Between-run variability was about equal to within-run variability, and interlaboratory variation was substantially less than intralaboratory variation. The total standard deviation for all instruments was about 0.04 g/liter. Only 8-15% of this variation was due to differences between instruments. The between-instrument standard deviation ranged from 0.011 to 0.015 g/liter; the between-run, within-instrument standard deviation ranged from 0.023 to 0.030 g/liter; and the within-run standard deviation ranged from 0.023 to 0.028 g/liter. The significance of these results for long-term collaborative studies is discussed.
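
    As a consistency check, the reported components combine under a standard random-effects decomposition (independence of the components is an assumption of this sketch; midpoints of the reported ranges are used):

        \sigma_{\text{total}}^{2} = \sigma_{\text{instrument}}^{2} + \sigma_{\text{run(instrument)}}^{2} + \sigma_{\text{within-run}}^{2}

        \sigma_{\text{total}} \approx \sqrt{0.013^{2} + 0.027^{2} + 0.026^{2}} \approx 0.040\ \text{g/liter},
        \qquad
        \frac{\sigma_{\text{instrument}}^{2}}{\sigma_{\text{total}}^{2}} \approx \frac{0.013^{2}}{0.040^{2}} \approx 11\%

    which reproduces the reported total of about 0.04 g/liter and an instrument share inside the quoted 8-15% band.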

  14. Multidisciplinary design and analytic approaches to advance prospective research on the multilevel determinants of child health.

    PubMed

    Johnson, Sara B; Little, Todd D; Masyn, Katherine; Mehta, Paras D; Ghazarian, Sharon R

    2017-06-01

    Characterizing the determinants of child health and development over time, and identifying the mechanisms by which these determinants operate, is a research priority. The growth of precision medicine has increased awareness and refinement of conceptual frameworks, data management systems, and analytic methods for multilevel data. This article reviews key methodological challenges in cohort studies designed to investigate multilevel influences on child health and strategies to address them. We review and summarize methodological challenges that could undermine prospective studies of the multilevel determinants of child health and ways to address them, borrowing approaches from the social and behavioral sciences. Nested data, variation in intervals of data collection and assessment, missing data, construct measurement across development and reporters, and unobserved population heterogeneity pose challenges in prospective multilevel cohort studies with children. We discuss innovations in missing data, innovations in person-oriented analyses, and innovations in multilevel modeling to address these challenges. Study design and analytic approaches that facilitate the integration across multiple levels, and that account for changes in people and the multiple, dynamic, nested systems in which they participate over time, are crucial to fully realize the promise of precision medicine for children and adolescents. Copyright © 2017 Elsevier Inc. All rights reserved.
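
    As one concrete illustration of the multilevel modeling the article advocates, a mixed-effects model with repeated measures nested within children can be sketched as below. The data file and column names are hypothetical; statsmodels' MixedLM is used here, though any multilevel package would serve.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Long-format data: one row per child per assessment wave (hypothetical).
    df = pd.read_csv("child_cohort.csv")

    model = smf.mixedlm(
        "health_score ~ age + neighborhood_ses",  # fixed effects
        data=df,
        groups=df["child_id"],   # level-2 units: children
        re_formula="~age",       # random intercept and random age slope per child
    )
    print(model.fit().summary())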

  15. Process mining is an underutilized clinical research tool in transfusion medicine.

    PubMed

    Quinn, Jason G; Conrad, David M; Cheng, Calvino K

    2017-03-01

    To understand inventory performance, transfusion services commonly use key performance indicators (KPIs) as summary descriptors of inventory efficiency that are graphed, trended, and used to benchmark institutions. Here, we summarize current limitations in KPI-based evaluation of blood bank inventory efficiency and propose process mining as an ideal methodology for application to inventory management research to improve inventory flows and performance. The transit of a blood product from inventory receipt to final disposition is complex and relates to many internal and external influences, and KPIs may be inadequate to fully understand the complexity of the blood supply chain and how units interact with its processes. Process mining lends itself well to analysis of blood bank inventories, and modern laboratory information systems can track nearly all of the complex processes that occur in the blood bank. Process mining is an analytical tool already used in other industries and can be applied to blood bank inventory management and research through laboratory information systems data using commercial applications. Although the current understanding of real blood bank inventories is value-centric through KPIs, it potentially can be understood from a process-centric lens using process mining. © 2017 AABB.
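
    A minimal sketch of the process-centric view described above: reconstructing directly-follows frequencies for blood units from a laboratory information system event log. The file and column names are assumptions; dedicated process-mining applications would add conformance checking and visualization on top of this.

    from collections import Counter
    import pandas as pd

    # Hypothetical LIS export: one row per event per blood unit.
    log = pd.read_csv("blood_bank_events.csv", parse_dates=["timestamp"])
    log = log.sort_values(["unit_id", "timestamp"])

    follows = Counter()
    for _, trace in log.groupby("unit_id"):
        events = trace["event"].tolist()           # e.g. receipt -> crossmatch -> issue
        follows.update(zip(events, events[1:]))    # count directly-follows pairs

    for (a, b), n in follows.most_common(10):
        print(f"{a} -> {b}: {n} units")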

  16. Variance decomposition: a tool enabling strategic improvement of the precision of analytical recovery and concentration estimates associated with microorganism enumeration methods.

    PubMed

    Schmidt, P J; Emelko, M B; Thompson, M E

    2014-05-15

    Concentrations of particular types of microorganisms are commonly measured in various waters, yet the accuracy and precision of reported microorganism concentration values are often questioned due to the imperfect analytical recovery of quantitative microbiological methods and the considerable variation among fully replicated measurements. The random error in analytical recovery estimates and unbiased concentration estimates may be attributable to several sources, and knowing the relative contribution from each source can facilitate strategic design of experiments to yield more precise data or provide an acceptable level of information with fewer data. Herein, variance decomposition using the law of total variance is applied to previously published probabilistic models to explore the relative contributions of various sources of random error and to develop tools to aid experimental design. This work focuses upon enumeration-based methods with imperfect analytical recovery (such as enumeration of Cryptosporidium oocysts), but the results also yield insights about plating methods and microbial methods in general. Using two hypothetical analytical recovery profiles, the variance decomposition method is used to explore 1) the design of an experiment to quantify variation in analytical recovery (including the size and precision of seeding suspensions and the number of samples), and 2) the design of an experiment to estimate a single microorganism concentration (including sample volume, effects of improving analytical recovery, and replication). In one illustrative example, a strategically designed analytical recovery experiment with 6 seeded samples would provide as much information as an alternative experiment with 15 seeded samples. Several examples of diminishing returns are illustrated to show that efforts to reduce error in analytical recovery and concentration estimates can have negligible effect if they are directed at trivial error sources.
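
    The identity underlying this approach is the law of total variance. Writing X for an unbiased concentration (or count) estimate and Y for a nuisance quantity such as the analytical recovery realized in a given sample (an illustrative pairing, not the paper's exact notation):

        \operatorname{Var}(X) = \operatorname{E}\left[\operatorname{Var}(X \mid Y)\right] + \operatorname{Var}\left(\operatorname{E}\left[X \mid Y\right]\right)

    The first term aggregates the error that remains once the recovery is fixed (e.g., counting error); the second is the error induced by variation in recovery itself. Applying the identity recursively attributes the total variance among all modeled sources.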

  17. Accelerator mass spectrometry as a bioanalytical tool for nutritional research

    SciTech Connect

    Vogel, J.S.; Turteltaub, K.W.

    1997-09-01

    Accelerator Mass Spectrometry is a mass spectrometric method of detecting long-lived radioisotopes without regard to their decay products or half-life. The technique is normally applied to geochronology, but has recently been developed for bioanalytical tracing. AMS detects isotope concentrations to parts per quadrillion, quantifying labeled biochemicals to attomole levels in milligram-sized samples. Its advantages over non-isotopic and stable isotope labeling methods are reviewed, and examples of analytical integrity, sensitivity, specificity, and applicability are provided.

  18. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    PubMed

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents the development of important analytical tools, namely sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA), and their first application in evaluating the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanol content were investigated. The juices and wines produced using different protocols were examined. Moreover, wines aged in tanks for 1, 2 and 3 months were analysed. The high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need for exogenous antioxidants, was particularly interesting. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by winemaking and pressing conditions, which required fine-tuning of the pressing step. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and changed according to the winemaking procedure applied. Interestingly, the evolution during the first three months also varied depending on the procedure adopted. Organic synthesis of cysteine and glutathione conjugates was carried out, and juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure, but did not show any significant change during the first 3 months of post-bottling ageing. This supports the theory

  19. On-line near infrared spectroscopy as a Process Analytical Technology (PAT) tool to control an industrial seeded API crystallization.

    PubMed

    Schaefer, C; Lecomte, C; Clicq, D; Merschaert, A; Norrant, E; Fotiadu, F

    2013-09-01

    The final step of an active pharmaceutical ingredient (API) manufacturing synthesis process consists of a crystallization during which the API and residual solvent contents have to be quantified precisely in order to reach a predefined seeding point. A feasibility study was conducted to demonstrate the suitability of on-line NIR spectroscopy to control this step in line with the new version of the European Medicines Agency (EMA) guideline [1]. A quantitative method was developed at laboratory scale using statistical design of experiments (DOE) and multivariate data analysis such as principal component analysis (PCA) and partial least squares (PLS) regression. NIR models were built to quantify the API in the range of 9-12% (w/w) and the residual methanol in the range of 0-3% (w/w). To improve the predictive ability of the models, the development procedure encompassed outlier elimination, optimum model rank definition, and selection of the spectral range and spectral pre-treatment. Conventional criteria such as the number of PLS factors, R(2), and the root mean square errors of calibration, cross-validation and prediction (RMSEC, RMSECV, RMSEP) enabled the selection of three model candidates. These models were tested in the industrial pilot plant during three technical campaigns, and the results of the most suitable models were evaluated against the chromatographic reference methods. A maximum relative bias of 2.88% was obtained for the API target content, and absolute biases of 0.01 and 0.02% (w/w) were achieved at methanol content levels of 0.10 and 0.13% (w/w), respectively. The repeatability was assessed as sufficient for on-line monitoring of the two analytes. The present feasibility study confirmed the possibility of using on-line NIR spectroscopy as a PAT tool to monitor in real time both the API and the residual methanol contents, in order to control the seeding of an API crystallization at industrial scale. Furthermore, the successful scale-up of the method proved its capability to be
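
    The model-rank selection step named above (choosing the number of PLS factors by calibration and cross-validation error) can be sketched as follows; the data files are hypothetical, and scikit-learn's PLSRegression stands in for whatever chemometrics package was actually used.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    def rmse(y, y_hat):
        return float(np.sqrt(np.mean((y - y_hat) ** 2)))

    # Hypothetical calibration set: NIR spectra and reference API contents.
    X = np.load("nir_spectra.npy")      # shape (n_samples, n_wavelengths)
    y = np.load("api_content.npy")      # % (w/w) from the reference method

    for n in range(1, 11):              # candidate model ranks
        pls = PLSRegression(n_components=n)
        y_cv = cross_val_predict(pls, X, y, cv=10).ravel()   # basis for RMSECV
        y_fit = pls.fit(X, y).predict(X).ravel()             # basis for RMSEC
        print(f"{n:2d} factors: RMSEC={rmse(y, y_fit):.3f}  RMSECV={rmse(y, y_cv):.3f}")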

  20. Methodological Challenges in Research on Sexual Risk Behavior: I. Item Content, Scaling, and Data Analytical Options

    PubMed Central

    Schroder, Kerstin E. E.; Carey, Michael P.; Vanable, Peter A.

    2008-01-01

    Investigation of sexual behavior involves many challenges, including how to assess sexual behavior and how to analyze the resulting data. Sexual behavior can be assessed using absolute frequency measures (also known as “counts”) or with relative frequency measures (e.g., rating scales ranging from “never” to “always”). We discuss these two assessment approaches in the context of research on HIV risk behavior. We conclude that these two approaches yield non-redundant information and, more importantly, that only data yielding information about the absolute frequency of risk behavior have the potential to serve as valid indicators of HIV contraction risk. However, analyses of count data may be challenging due to non-normal distributions with many outliers. Therefore, we identify new and powerful data analytical solutions that have been developed recently to analyze count data, and discuss limitations of a commonly applied method (viz., ANCOVA using baseline scores as covariates). PMID:14534027
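
    One of the count-appropriate alternatives the authors point toward can be sketched with negative binomial regression, which tolerates the overdispersion and outliers typical of behavior counts. The dataset and column names here are hypothetical, and entering the baseline count as a covariate is purely illustrative.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical trial data: count outcome at follow-up plus treatment arm.
    df = pd.read_csv("risk_behavior.csv")

    model = smf.negativebinomial("followup_count ~ treatment + baseline_count",
                                 data=df)
    print(model.fit().summary())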

  1. Automating the Analytical Laboratories Section, Lewis Research Center, National Aeronautics and Space Administration: A feasibility study

    NASA Technical Reports Server (NTRS)

    Boyle, W. G.; Barton, G. W.

    1979-01-01

    The feasibility of computerized automation of the Analytical Laboratories Section at NASA's Lewis Research Center was considered. Since that laboratory's duties are not routine, the automation goals were set with that in mind. Four instruments were selected as the most likely automation candidates: an atomic absorption spectrophotometer, an emission spectrometer, an X-ray fluorescence spectrometer, and an X-ray diffraction unit. Two options for computer automation were described: a time-shared central computer and a system with microcomputers for each instrument connected to a central computer. A third option, presented for future planning, expands the microcomputer version. Costs and benefits for each option were considered. It was concluded that the microcomputer version best fits the goals and duties of the laboratory and that such an automated system is needed to meet the laboratory's future requirements.

  2. Big Data analytics and cognitive computing - future opportunities for astronomical research

    NASA Astrophysics Data System (ADS)

    Garrett, M. A.

    2014-10-01

    The days of the lone astronomer with his optical telescope and photographic plates are long gone: Astronomy in 2025 will not only be multi-wavelength, but multi-messenger, and dominated by huge data sets and matching data rates. Catalogues listing detailed properties of billions of objects will in themselves require a new industrial-scale approach to scientific discovery, requiring the latest techniques of advanced data analytics and an early engagement with the first generation of cognitive computing systems. Astronomers have the opportunity to be early adopters of these new technologies and methodologies - the impact can be profound and highly beneficial to effecting rapid progress in the field. Areas such as SETI research might benefit particularly from cognitive intelligence that does not rely on human bias and preconceptions.

  3. EXAMPLES OF THE ROLE OF ANALYTICAL CHEMISTRY IN ENVIRONMENTAL RISK MANAGEMENT RESEARCH

    EPA Science Inventory

    Analytical chemistry is an important tier of environmental protection and has been traditionally linked to compliance and/or exposure monitoring activities for environmental contaminants. The adoption of the risk management paradigm has led to special challenges for analytical ch...

  4. "This Ain't the Projects": A Researcher's Reflections on the Local Appropriateness of Our Research Tools

    ERIC Educational Resources Information Center

    Martinez, Danny C.

    2016-01-01

    In this article I examine the ways in which Black and Latina/o urban high school youth pressed me to reflexively examine my positionality and that of my research tools during a year-long ethnographic study documenting their communicative repertoires. I reflect on youth comments on my researcher tools, as well as myself, in order to wrestle with…

  5. "This Ain't the Projects": A Researcher's Reflections on the Local Appropriateness of Our Research Tools

    ERIC Educational Resources Information Center

    Martinez, Danny C.

    2016-01-01

    In this article I examine the ways in which Black and Latina/o urban high school youth pressed me to reflexively examine my positionality and that of my research tools during a year-long ethnographic study documenting their communicative repertoires. I reflect on youth comments on my researcher tools, as well as myself, in order to wrestle with…

  6. [Research on infrared safety protection system for machine tool].

    PubMed

    Zhang, Shuan-Ji; Zhang, Zhi-Ling; Yan, Hui-Ying; Wang, Song-De

    2008-04-01

    In order to ensure personal safety and prevent injury accidents in machine tool operation, an infrared machine tool safety system was designed using an infrared transmitting-receiving module, a memory self-locking relay and a voice recording-playing module. While the operator stays outside the danger area, the system does not respond. Once the operator's body, in whole or in part, enters the danger area and interrupts the infrared beam, the system raises an alarm and outputs a control signal to the machine tool's actuating element, bringing the machine to an emergency stop to prevent equipment damage and personal injury. The system has a modular framework and many advantages, including safety, reliability, general applicability, circuit simplicity, easy maintenance, low power consumption, low cost, stable operation, easy debugging, and resistance to vibration and interference. It is suitable for installation and use on machine tools such as punch presses, plastic injection machines, CNC machine tools, plate cutting machines, pipe bending machines and hydraulic presses.
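
    The control logic described (a beam break trips a self-locking alarm and emergency stop until an operator reset) amounts to a small latching state machine. A sketch under those assumptions, with all hardware I/O stubbed out as injected callbacks, and with function names that are illustrative rather than from the paper:

    def interlock_step(beam_blocked: bool, latched: bool) -> bool:
        """Memory self-locking behaviour: once tripped, stay tripped."""
        return latched or beam_blocked

    def control_loop(read_beam_blocked, stop_machine, sound_alarm, reset_pressed):
        latched = False
        while True:
            latched = interlock_step(read_beam_blocked(), latched)
            if latched:
                stop_machine()          # emergency-stop output to the machine tool
                sound_alarm()           # voice recording-playing module
                if reset_pressed():     # explicit operator reset releases the latch
                    latched = False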

  7. Isotope pattern deconvolution as rising tool for isotope tracer studies in environmental research

    NASA Astrophysics Data System (ADS)

    Irrgeher, Johanna; Zitek, Andreas; Prohaska, Thomas

    2014-05-01

    During the last decade, stable isotope tracers have emerged as a versatile tool in ecological research. Besides 'intrinsic' isotope tracers arising from the natural variation of isotopes, the intentional introduction of 'extrinsic' enriched stable isotope tracers into biological systems has gained significant interest. The induced change in the natural isotopic composition of an element allows, among other things, the study of the fate and fluxes of metals, trace elements and species in organisms, or provides an intrinsic marker or tag for particular biological samples. Given the vast potential of this methodology, the number of publications dealing with applications of isotope (double) spikes as tracers to address research questions in 'real world systems' is constantly increasing. However, some isotope systems, like the natural Sr isotopic system, although potentially very powerful for this type of application, are still rarely used, mainly because their adequate measurement/determination poses major analytical challenges, e.g. because Sr is present in significant amounts in natural samples. In addition, biological systems are subject to complex processes such as metabolism, adsorption/desorption or oxidation/reduction. As a consequence, classic evaluation approaches such as the isotope dilution mass spectrometry equation are often not applicable because of the unknown amount of tracer finally present in the sample. Isotope pattern deconvolution (IPD), based on multiple linear regression, serves as a simplified alternative data processing strategy to double spike isotope dilution calculations. The outstanding advantage of this mathematical tool lies in the possibility of deconvolving the isotope pattern in a spiked sample without knowing the quantity of enriched isotope tracer incorporated into the natural sample matrix, or the degree of impurities and species-interconversion (e.g. from sample preparation). Here, the potential of IPD for environmental tracer
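
    The essence of IPD as multiple linear regression can be shown in a few lines: the measured isotope pattern is modelled as a linear blend of the natural and spike end-member patterns, and least squares recovers the mixing fractions without knowing the tracer quantity beforehand. The natural Sr abundances below are standard literature values; the spike pattern and measured sample are invented for illustration.

    import numpy as np

    # Columns: natural Sr pattern, hypothetical 84Sr-enriched spike pattern.
    # Rows: 84Sr, 86Sr, 87Sr, 88Sr relative abundances.
    A = np.array([
        [0.0056, 0.95],
        [0.0986, 0.02],
        [0.0700, 0.01],
        [0.8258, 0.02],
    ])
    measured = np.array([0.289, 0.075, 0.052, 0.584])  # invented mixed pattern

    x, *_ = np.linalg.lstsq(A, measured, rcond=None)
    x = x / x.sum()  # molar fractions of natural vs. tracer strontium
    print(f"natural Sr: {x[0]:.1%}, tracer Sr: {x[1]:.1%}")  # ~70% / ~30%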

  8. Nearly arc-length tool path generation and tool radius compensation algorithm research in FTS turning

    NASA Astrophysics Data System (ADS)

    Zhao, Minghui; Zhao, Xuesen; Li, Zengqiang; Sun, Tao

    2014-08-01

    In the generation of non-rotationally symmetric microstructured surfaces by turning with a Fast Tool Servo (FTS), non-uniform distribution of the interpolation data points leads to long processing cycles and poor surface quality. To improve this situation, a nearly arc-length tool path generation algorithm is proposed, which generates tool tip trajectory points at nearly equal arc-length intervals instead of by the traditional equal-angle interpolation rule, and which adds tool radius compensation. Because the X slider feeds at constant speed, all interpolation points are equidistant in the radial direction; the high-frequency tool radius compensation components act in both the X and Z directions, which makes the X slider, with its large mass, difficult to drive to the commanded positions. Newton's iterative method is used to calculate the coordinates of the neighboring contour tangent point, taking the interpolation point's X position as the initial value; in this way a new Z coordinate is obtained and the high-frequency motion component in the X direction is transferred to the Z direction. For a test case with a typical microstructure of 4 μm PV value, mixed from two sine waves of 70 μm wavelength, the maximum profile error at an angle of fifteen degrees is less than 0.01 μm when turned by a diamond tool with a large radius of 80 μm. A sinusoidal grid was machined successfully on an ultra-precision lathe; the measured wavelength is 70.2278 μm and the Ra value is 22.81 nm, evaluated from data points obtained by filtering out the first five harmonics.
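
    Two of the abstract's ideas, nearly arc-length spacing of interpolation points and tool radius compensation along the surface normal, can be sketched for a simplified 1-D sinusoidal profile. All parameter values are illustrative; the real algorithm works on the full spiral tool path and refines the tangent point by Newton iteration.

    import numpy as np

    wavelength, amp, r_tool = 70.0, 2.0, 80.0   # micrometres, illustrative
    f  = lambda x: amp * np.sin(2 * np.pi * x / wavelength)
    fp = lambda x: amp * (2 * np.pi / wavelength) * np.cos(2 * np.pi * x / wavelength)

    ds, x, path = 0.5, 0.0, []                  # ds = target spacing along the curve
    while x < 3 * wavelength:
        slope = fp(x)
        norm = np.hypot(1.0, slope)             # |(1, f'(x))|
        # Tool-centre point: surface point offset by r_tool along the unit normal.
        path.append((x - r_tool * slope / norm, f(x) + r_tool / norm))
        x += ds / norm                          # advance ~ds in arc length, not in x
    print(len(path), "tool-centre points generated")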

  9. Benchtop-NMR and MRI--a new analytical tool in drug delivery research.

    PubMed

    Metz, Hendrik; Mäder, Karsten

    2008-12-08

    In recent years, NMR spectroscopy and NMR imaging (magnetic resonance imaging, MRI) have been increasingly used to monitor drug delivery systems in vitro and in vivo. However, the high installation and running costs of the commonly used superconducting magnet technology limit the application range and prevent the further spread of this non-invasive technology. Benchtop-NMR (BT-NMR) relaxometry uses permanent magnets and is much less cost intensive. BT-NMR relaxometry is commonly used in the food and chemical industries, but so far scarcely in the pharmaceutical field. The paper shows, through several examples, that the application field of BT-NMR relaxometry can be extended into drug delivery, including the characterisation of emulsions and lipid ingredients (e.g. the amount and physicochemical state of the lipid) and the monitoring of adsorption characteristics (e.g. oil binding of porous ingredients). The most exciting possibilities of BT-NMR technology are linked with the new development of BT instruments with imaging capability. BT-MRI examples on the monitoring of hydration and swelling of HPMC-based monolayer and double-layer tablets are shown. BT-MRI opens new MRI opportunities for the non-invasive monitoring of drug delivery processes.
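
    The routine evaluation step behind BT-NMR relaxometry is a relaxation-time fit. A minimal sketch for a monoexponential CPMG echo decay is below; the data file is hypothetical, and real emulsion data often call for multi-exponential or inverse-Laplace fits.

    import numpy as np
    from scipy.optimize import curve_fit

    def decay(t, amplitude, t2, offset):
        return amplitude * np.exp(-t / t2) + offset

    # Hypothetical CPMG export: echo time (ms) and echo amplitude.
    t, signal = np.loadtxt("cpmg_echoes.txt", unpack=True)

    (amplitude, t2, offset), _ = curve_fit(decay, t, signal,
                                           p0=(signal[0], t.mean(), 0.0))
    print(f"fitted T2 = {t2:.1f} ms")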

  10. Improving Students' Understanding of Quantum Measurement. II. Development of Research-Based Learning Tools

    ERIC Educational Resources Information Center

    Zhu, Guangtian; Singh, Chandralekha

    2012-01-01

    We describe the development and implementation of research-based learning tools such as the Quantum Interactive Learning Tutorials and peer-instruction tools to reduce students' common difficulties with issues related to measurement in quantum mechanics. A preliminary evaluation shows that these learning tools are effective in improving students'…

  12. Introduction to tools and techniques for ceramide-centered research.

    PubMed

    Kitatani, Kazuyuki; Luberto, Chiara

    2010-01-01

    Sphingolipids are important components of eukaryotic cells, many of which function as bioactive signaling molecules. As thoroughly discussed elsewhere in this volume, ceramide, the central metabolite of the sphingolipid pathway, plays key roles in a variety of cellular responses. Since the discovery of the bioactive function of ceramide, a growing number of tools and techniques have been and still are being developed in order to better decipher the complexity and implications of ceramide-mediated signaling. With this chapter it is our intention to provide newcomers to the sphingolipid arena with a short overview of tools and techniques currently available for the study of sphingolipid metabolism, with the focus on ceramide.

  13. Examination of the capability of the laser-induced breakdown spectroscopy (LIBS) technique as the emerging laser-based analytical tool for analyzing trace elements in coal

    NASA Astrophysics Data System (ADS)

    Idris, N.; Ramli, M.; Mahidin; Hedwig, R.; Lie, Z. S.; Kurniawan, K. H.

    2014-09-01

    Owing to its advantages over conventional analytical tools, the laser-induced breakdown spectroscopy (LIBS) technique is becoming an emerging analytical tool and is expected to be a future star of analytical chemistry. The technique is based on the use of optical emission from a laser-induced plasma for spectrochemical analysis of the constituents and content of the sampled object. Its capability is examined here in the analysis of trace elements in coal. Coal is a difficult sample to analyze due to its complex chemical composition and physical properties. Coal inherently contains trace elements, including heavy metals, so its mining, beneficiation and utilization pose hazards to the environment and to human beings. The LIBS apparatus used was composed of a laser system (Nd-YAG: Quanta Ray; LAB SERIES; 1,064 nm; 500 mJ; 8 ns) and an optical detector (McPherson model 2061; 1,000 mm focal length; f/8.6 Czerny-Turner) equipped with an Andor I*Star intensified CCD of 1024×256 pixels. The laser beam was focused onto the coal sample with a +250 mm focusing lens. The plasma emission was collected by a fiber optic and sent to the spectrograph. The coal samples were taken from the Province of Aceh. Several trace elements, including heavy metals (As, Mn, Pb), could clearly be observed, demonstrating the potential of the LIBS technique for analyzing trace elements in coal.
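
    The final analysis step implied above, locating emission peaks and matching them to known analyte lines, can be sketched as follows. The line wavelengths are standard persistent lines for these elements; the data file, prominence threshold and matching tolerance are assumptions.

    import numpy as np
    from scipy.signal import find_peaks

    lines_nm = {"Mn": 403.08, "Pb": 405.78, "As": 278.02}  # persistent emission lines

    # Hypothetical spectrograph export: wavelength (nm) and intensity.
    wl, counts = np.loadtxt("coal_spectrum.txt", unpack=True)
    peaks, _ = find_peaks(counts, prominence=5 * np.median(counts))

    for element, line in lines_nm.items():
        hits = [round(float(wl[p]), 2) for p in peaks if abs(wl[p] - line) < 0.1]
        print(element, "candidate peaks:", hits if hits else "none found")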

  14. Practical library research: a tool for effective library management.

    PubMed

    Schneider, E; Mankin, C J; Bastille, J D

    1995-01-01

    Librarians are being urged to conduct research as one of their professional responsibilities. Many librarians, however, avoid research because they believe it is beyond their capabilities or resources. This paper discusses the importance of conducting applied research, that is, research directed toward solving practical problems. The paper describes how one library conducted practical research projects, including use studies and surveys, over an eighteen-year period. These projects produced objective data that the library used to make management decisions benefiting both the library and its parent institution. This paper encourages other librarians to conduct practical research projects and to share the results with their colleagues through publication in the professional literature.

  15. [Small compounds libraries: a research tool for chemical biology].

    PubMed

    Florent, Jean-Claude

    2013-01-01

    Obtaining and screening collections of small molecules remain a challenge for biologists. Recent advances in analytical techniques and instrumentation now make screening possible in academia. The history of the creation of such public or commercial collections and of their accessibility is recounted. It shows the interest, for an academic laboratory involved in medicinal chemistry, chemogenomics or "chemical biology", in organizing its own collection and making it available through existing networks such as the French National Chimiothèque or the European partner network "European Infrastructure of open screening platforms for Chemical Biology" (EU-OpenScreen), currently under construction. © Société de Biologie, 2013.

  16. An overview of city analytics

    PubMed Central

    Higham, Desmond J.; Batty, Michael; Bettencourt, Luís M. A.; Greetham, Danica Vukadinović; Grindrod, Peter

    2017-01-01

    We introduce the 14 articles in the Royal Society Open Science themed issue on City Analytics. To provide a high-level, strategic overview, we summarize the topics addressed and the analytical tools deployed. We then give a more detailed account of the individual contributions. Our overall aims are (i) to highlight exciting advances in this emerging, interdisciplinary field, (ii) to encourage further activity and (iii) to emphasize the variety of new, public-domain datasets that are available to researchers. PMID:28386454

  17. Empirical-Analytical Methodological Research in Environmental Education: Response to a Negative Trend in Methodological and Ideological Discussions

    ERIC Educational Resources Information Center

    Connell, Sharon

    2006-01-01

    The purpose of this paper is to contribute to methodological discourse about research approaches to environmental education. More specifically, the paper explores the current status of the "empirical-analytical methodology" and its "positivist" (traditional- and post-positivist) ideologies, in environmental education research through the critical…

  18. Chaos Modeling: Increasing Educational Researchers' Awareness of a New Tool.

    ERIC Educational Resources Information Center

    Bobner, Ronald F.; And Others

    Chaos theory is being used as a tool to study a wide variety of phenomena. It is a philosophical and empirical approach that attempts to explain relationships previously thought to be totally random. Although some relationships are truly random, many data appear to be random but reveal repeatable patterns of behavior under further investigation.…

  19. Preservice Teachers as Researchers: Using Ethnographic Tools To Interpret Practice.

    ERIC Educational Resources Information Center

    Christensen, Lois McFadyen

    An ethnographic design was used to study the structures of meaning that preservice teachers perceived and interpreted as a result of field placements in a methods course and through the use of ethnographic tools. The study involved 11 preservice teachers, described how they shaped each other's thinking about teaching, and examined how ethnographic…
