Science.gov

Sample records for analytical tool research

  1. Environmental equity research: review with focus on outdoor air pollution research methods and analytic tools.

    PubMed

    Miao, Qun; Chen, Dongmei; Buzzelli, Michael; Aronson, Kristan J

    2015-01-01

The objective of this study was to review environmental equity research on outdoor air pollution and, specifically, the methods and tools used in research published in English, with the aim of recommending the best methods and analytic tools. English language publications from 2000 to 2012 were identified in Google Scholar, Ovid MEDLINE, and PubMed. Research methodologies and results were reviewed and potential deficiencies and knowledge gaps identified. The publications show that exposure to outdoor air pollution differs by social factors, but findings are inconsistent in Canada. In terms of study designs, most were small and ecological and therefore prone to the ecological fallacy. Newer tools such as geographic information systems, modeling, and biomarkers offer improved precision in exposure measurement. Higher-quality research using large, individual-based samples and more precise analytic tools is needed to provide better evidence for policy-making to reduce environmental inequities.

  2. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    ERIC Educational Resources Information Center

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  3. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The `recordr` library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. Finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software
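The run-level capture this record describes can be reduced, for illustration, to a wrapper that records what a computation read, what it produced, and a content hash of the result so the derived product can be versioned and cited. This is a hypothetical Python sketch of the general idea only; the function and field names are assumptions and are not the actual `recordr` or `matlab-dataone` APIs.

```python
import hashlib
import time

def run_with_provenance(func, *inputs):
    """Execute func on inputs and return (result, provenance record)."""
    started = time.time()
    result = func(*inputs)
    record = {
        "process": func.__name__,
        "inputs": [repr(i) for i in inputs],   # what was read
        "output": repr(result),                # what was derived
        "started": started,
        "ended": time.time(),
        # a content hash lets the derived product be versioned and cited
        "output_sha256": hashlib.sha256(repr(result).encode()).hexdigest(),
    }
    return result, record

def mean(xs):
    return sum(xs) / len(xs)

value, prov = run_with_provenance(mean, [1.0, 2.0, 3.0])
print(value)            # 2.0
print(prov["process"])  # mean
```

A real system would also capture script versions and environment details, and serialize records to a standard model such as ProvONE for publication.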

  4. Narrative health research: exploring big and small stories as analytical tools.

    PubMed

    Sools, Anneke

    2013-01-01

    In qualitative health research many researchers use a narrative approach to study lay health concepts and experiences. In this article, I explore the theoretical linkages between the concepts narrative and health, which are used in a variety of ways. The article builds on previous work that conceptualizes health as a multidimensional, positive, dynamic and morally dilemmatic yet meaningful practice. I compare big and small stories as analytical tools to explore what narrative has to offer to address, nuance and complicate five challenges in narrative health research: (1) the interplay between health and other life issues; (2) the taken-for-granted yet rare character of the experience of good health; (3) coherence or incoherence as norms for good health; (4) temporal issues; (5) health as moral practice. In this article, I do not present research findings per se; rather, I use two interview excerpts for methodological and theoretical reflections. These interview excerpts are derived from a health promotion study in the Netherlands, which was partly based on peer-to-peer interviews. I conclude with a proposal to advance narrative health research by sensitizing researchers to different usages of both narrative and health, and the interrelationship(s) between the two.

  5. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research

    PubMed Central

    Alaidi, Osama; Rames, Matthew J.

    2016-01-01

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. PMID:26087941
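The core ET operation this abstract describes, recovering higher-dimensional structure from a tilt series of projections, can be illustrated in two dimensions with a toy unfiltered back-projection. Real reconstructions use many tilt angles and filtered or iterative methods; this two-angle numpy sketch only shows the principle.

```python
import numpy as np

# Toy 2D "object" with a single bright "particle"
obj = np.zeros((5, 5))
obj[2, 2] = 1.0

p0 = obj.sum(axis=0)    # projection along rows (0 degree tilt)
p90 = obj.sum(axis=1)   # projection along columns (90 degree tilt)

# Back-project: smear each 1D projection back across the grid and average
bp = (np.tile(p0, (5, 1)) + np.tile(p90[:, None], (1, 5))) / 2.0

peak = tuple(int(i) for i in np.unravel_index(np.argmax(bp), bp.shape))
print(peak)   # (2, 2): the reconstruction peaks at the particle's position
```

Even with only two angles, the back-projection localizes the particle; the streak artifacts elsewhere in `bp` are exactly what additional tilt angles and filtering suppress.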

  6. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research.

    PubMed

    Ercius, Peter; Alaidi, Osama; Rames, Matthew J; Ren, Gang

    2015-10-14

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated.

  7. Researching and Doing Professional Development Using a Shared Discursive Resource and an Analytic Tool

    ERIC Educational Resources Information Center

    Adler, Jill

    2015-01-01

    Linked research and development forms the central pillar of the 5-year Wits Maths Connect Secondary Project in South Africa. Our empirical data emphasised the need for teaching that mediates towards mathematics viewed as a network of scientific concepts, and the development of the notion of 'mathematical discourse in instruction' (MDI), as an…

  8. Analytic tools for information warfare

    SciTech Connect

    Vandewart, R.L.; Craft, R.L.

    1996-05-01

Information warfare and system surety (tradeoffs between system functionality, security, safety, reliability, cost, usability) have many mechanisms in common. Sandia's experience has shown that an information system must be assessed from a system perspective in order to adequately identify and mitigate the risks present in the system. While some tools are available to help in this work, the process is largely manual. An integrated, extensible set of assessment tools would help the surety analyst. This paper describes one approach to surety assessment used at Sandia, identifies the difficulties in this process, and proposes a set of features desirable in an automated environment to support this process.

  9. Guidance for the Design and Adoption of Analytic Tools.

    SciTech Connect

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  10. Analytical Tools for Behavioral Influences Operations

    DTIC Science & Technology

    2003-12-01

NASIC’s Investment in Analytical Capabilities ... Study Limitations ... early guidance and organizational contacts to get us started in this effort. They provided a basic cultural/institutional/psychological framework for the ... get started. This project is envisioned as a foundation for future work by NASIC analysts. They will use the tools identified in this study to

  11. Analytical tool requirements for power system restoration

    SciTech Connect

Adibi, M.M.; Borkoski, J.N.; Kafka, R.J.

    1994-08-01

This paper is one of a series presented by the Power System Restoration Working Group (SRWG) on behalf of the System Operation Subcommittee with the intent of focusing industry attention on power system restoration. In this paper a set of analytical tools is specified which together describe the static, transient, and dynamic behavior of a power system during restoration. These tools are identified and described for restoration planning, training, and operation. Their applications cover all stages of restoration, including the pre-disturbance condition, post-disturbance status, the post-restoration target system, and minimization of unserved loads. The paper draws on previous reports by the SRWG.

  12. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated by the level of process or plant detail: (1) plant level; (2) process-group level; and (3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases established by reviewing information from international and national samples. We performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as companion documentation for use with the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to those in the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and water
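The benchmarking comparison this abstract describes, a plant's measured energy intensity against a best-available reference case, reduces to a small calculation. The reference values and names below are illustrative assumptions, not figures from the BEST-Dairy tool.

```python
# Assumed best-practice reference intensities (kWh per ton of product);
# purely illustrative numbers.
reference_kwh_per_ton = {"cheese": 900.0, "fluid_milk": 150.0}

def estimated_savings(product, production_tons, energy_kwh):
    """kWh saved if the plant reached the best-available reference intensity."""
    intensity = energy_kwh / production_tons
    best = reference_kwh_per_ton[product]
    excess = max(intensity - best, 0.0)   # zero if already at or below best
    return excess * production_tons

print(estimated_savings("cheese", 100.0, 100000.0))  # 10000.0
```

A tool like the one described would apply the same comparison at plant, process-group, and process-step level, normalizing for production and other variables.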

  13. Using the Conceptual Change Model of Learning as An Analytic Tool in Researching Teacher Preparation for Student Diversity

    ERIC Educational Resources Information Center

    Larkin, Douglas

    2012-01-01

    Background/Context: In regard to preparing prospective teachers for diverse classrooms, the agenda for teacher education research has been primarily concerned with identifying desired outcomes and promising strategies. Scholarship in multicultural education has been crucial for identifying the knowledge, skills, and attitudes needed by teachers to…

  14. Using Visual Analytics Tool for Improving Data Comprehension

    ERIC Educational Resources Information Center

    Géryk, Jan

    2015-01-01

The efficacy of animated data visualizations in comparison with static data visualizations is still inconclusive. Some studies have suggested that the failure to find benefits of animations may relate to how they are constructed and perceived. In this paper, we present a visual analytics (VA) tool which makes use of enhanced animated…

  15. Chemometrics tools used in analytical chemistry: an overview.

    PubMed

    Kumar, Naveen; Bansal, Ankit; Sarma, G S; Rawal, Ravindra K

    2014-06-01

This article presents important chemometric tools used to evaluate data generated by various hyphenated analytical techniques, together with their applications from the field's advent to today. The work is divided into sections covering multivariate regression methods and multivariate resolution methods; the last section deals with the applicability of chemometric tools in analytical chemistry. The main objective of this article is to review the chemometric methods used in analytical chemistry (qualitative/quantitative) to determine the elution sequence, classify various data sets, assess peak purity, and estimate the number of chemical components. These reviewed methods can further be used for treating n-way data obtained by hyphenation of LC with multi-channel detectors. We provide a detailed view of the important methods, with their algorithms, so that researchers not very familiar with chemometrics can understand and employ them.
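One task listed above, estimating the number of chemical components in hyphenated-technique data, is commonly approached through the effective rank of the data matrix. A minimal sketch, assuming a simulated noisy two-component mixture matrix and an illustrative significance threshold:

```python
import numpy as np

rng = np.random.default_rng(0)
spectra = rng.random((2, 20))                   # two pure-component spectra
conc = rng.random((15, 2))                      # concentration profiles
data = conc @ spectra                           # 15 mixtures x 20 channels
data += 1e-6 * rng.standard_normal(data.shape)  # small measurement noise

# Count singular values that stand above the noise floor
s = np.linalg.svd(data, compute_uv=False)
n_components = int((s > 1e-3 * s[0]).sum())     # threshold is illustrative
print(n_components)
```

In practice the threshold is chosen from noise estimates or cross-validation rather than a fixed fraction of the largest singular value.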

  16. Analytical tools and isolation of TOF events

    NASA Technical Reports Server (NTRS)

    Wolf, H.

    1974-01-01

Analytical tools are presented in two reports. The first is a probability analysis of the orbital distribution of events in relation to the dust flux density observed in the Pioneer 8 and 9 distributions. A distinction is drawn between asymmetries caused by random fluctuations and systematic variations by calculating the probability of any particular asymmetry. The second article discusses particle trajectories in a repulsive force field. The force on a particle due to solar radiation pressure is directed radially outward from the sun along the particle's radius vector and, like gravity, is inversely proportional to the square of its distance from the sun. Equations of motion that describe both solar radiation pressure and gravitational attraction are presented.
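Because gravity and radiation pressure both act along the radius vector with the same inverse-square dependence, their combination behaves like gravity scaled by (1 - beta), where beta is the ratio of radiation-pressure force to gravitational force. A minimal sketch in normalized units; the value of beta is an assumption for illustration.

```python
GM = 1.0     # gravitational parameter (normalized units)
beta = 0.3   # radiation pressure / gravity force ratio (assumed)

def radial_acceleration(r):
    """Net inward radial acceleration at heliocentric distance r."""
    return -GM * (1.0 - beta) / r**2

print(round(radial_acceleration(2.0), 6))   # -0.175
```

For beta < 1 the particle still follows conic-section orbits, just around a sun with effectively reduced mass; for beta > 1 the net force is repulsive and trajectories are hyperbolic.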

  17. Analytical Web Tool for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year data set is invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health, and environmental companies, has shown interest in CERES derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that web-2.0 technologies offer for both Quality Control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions to address these needs. Presented in an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms that display the distribution of a data field to identify outliers, useful for QC purposes; ii) an "Anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can either identify potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, execution was done on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open source applications, the multitude of CERES products, and the need for a seamless transition from previous development. For the future, we plan on expanding the analytical capabilities of the
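The "Anomaly" map function described above, the current month minus the climatological monthly mean for each grid cell, can be sketched directly. The array shapes and random values below are illustrative stand-ins for CERES monthly fields, not real data.

```python
import numpy as np

years, months, nlat, nlon = 11, 12, 3, 4
rng = np.random.default_rng(1)
flux = rng.random((years, months, nlat, nlon))   # monthly TOA flux fields

def anomaly(flux, year, month):
    """Per-cell difference between one month and its climatological mean."""
    climatology = flux[:, month].mean(axis=0)    # mean over all years
    return flux[year, month] - climatology

a = anomaly(flux, year=10, month=6)
print(a.shape)   # (3, 4)
```

By construction, the anomalies for a given calendar month average to zero over the full record, which is what makes the map useful for spotting regional departures.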

  18. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  19. Cryogenic Propellant Feed System Analytical Tool Development

    NASA Technical Reports Server (NTRS)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its number-crunching power and its ability to directly access real-fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front-end, with a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA), was implemented to provide convenient portability of PFSAT among a wide variety of potential users. The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
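In the spirit of PFSAT's purpose, though not its actual models (which cover insulation systems, supports, penetrations, and instrumentation in detail), a first-order heat-leak estimate can be sketched as a sum of one-dimensional conduction terms Q = k*A*dT/L over feed-line elements. All geometry and material values below are illustrative assumptions.

```python
def conduction_w(k_w_per_mk, area_m2, dT_k, length_m):
    """Steady 1D conduction heat leak in watts: Q = k * A * dT / L."""
    return k_w_per_mk * area_m2 * dT_k / length_m

elements = [
    # (conductivity W/m-K, cross-section m^2, delta-T K, conduction path m)
    (0.25, 0.002, 200.0, 0.05),   # a fiberglass line support (assumed)
    (15.0, 1e-5, 200.0, 0.30),    # a stainless instrumentation stub (assumed)
]
total = sum(conduction_w(*e) for e in elements)
print(round(total, 2))   # 2.1
```

A tool like PFSAT would add radiation through insulation and temperature-dependent properties, then size the TVS orifice so vented propellant absorbs this heat load.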

  20. Measurement and Research Tools.

    ERIC Educational Resources Information Center

    1997

    This document contains four papers from a symposium on measurement and research tools for human resource development (HRD). "The 'Best Fit' Training: Measure Employee Learning Style Strengths" (Daniel L. Parry) discusses a study of the physiological aspect of sensory intake known as modality, more specifically, modality as measured by…

  1. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  2. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase 1

  3. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  4. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    SciTech Connect

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  5. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-02-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirtieth month of development activities.

  6. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    SciTech Connect

    Robinson P. Khosah; Charles G. Crawford

    2005-08-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirty-sixth month of development activities.

  7. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  8. A Qualitative Evaluation of Evolution of a Learning Analytics Tool

    ERIC Educational Resources Information Center

    Ali, Liaqat; Hatala, Marek; Gasevic, Dragan; Jovanovic, Jelena

    2012-01-01

    LOCO-Analyst is a learning analytics tool we developed to provide educators with feedback on students learning activities and performance. Evaluation of the first version of the tool led to the enhancement of the tool's data visualization, user interface, and supported feedback types. The second evaluation of the improved tool allowed us to see…

  9. Aircraft as Research Tools

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Aeronautical research usually begins with computers, wind tunnels, and flight simulators, but eventually the theories must fly. This is when flight research begins, and aircraft are the primary tools of the trade. Flight research involves doing precision maneuvers in either a specially built experimental aircraft or an existing production airplane that has been modified. For example, the AD-1 was a unique airplane made only for flight research, while the NASA F-18 High Alpha Research Vehicle (HARV) was a standard fighter aircraft that was transformed into a one-of-a-kind aircraft as it was fitted with new propulsion systems, flight controls, and scientific equipment. All research aircraft are able to perform scientific experiments because of the onboard instruments that record data about their systems, aerodynamics, and the outside environment. Since the 1970s, NASA flight research has become more comprehensive, with flights involving everything from Space Shuttles to ultralights. NASA now flies not only the fastest airplanes, but some of the slowest. Flying machines continue to evolve with new wing designs, propulsion systems, and flight controls. As always, a look at today's experimental research aircraft is a preview of the future.

  10. ANALYTICAL TOOLS FOR GROUNDWATER POLLUTION ASSESSMENT

    EPA Science Inventory

    This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of groundwater buffer strips. The indices describe the leaching of solutes below the root zone (mass f...

  11. Analytical Tools for Space Suit Design

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  12. Analytical tools for groundwater pollution assessment

    SciTech Connect

    Hantush, M.M.; Islam, M.R.; Marino, M.A.

    1998-06-01

    This paper deals with the development of analytical screening-exposure models (indices) and their potential application to regulate the use of hazardous chemicals and the design of ground water buffer strips. The indices describe the leaching of solutes below the root zone (mass fraction), emissions to the water table, and mass fraction of the contaminant intercepted by a well or a surface water body.

  13. ANALYTICAL CHEMISTRY RESEARCH NEEDS FOR ...

    EPA Pesticide Factsheets

    The consensus among environmental scientists and risk assessors is that the fate and effects of pharmaceutical and personal care products (PPCPs) in the environment are poorly understood. Many classes of PPCPs have yet to be investigated. Acquisition of trends data for a suite of PPCPs (representatives from each of numerous significant classes), shown to recur amongst municipal wastewater treatment plants across the country, may prove of key importance. The focus of this paper is an overview of some of the analytical methods being developed at the Environmental Protection Agency and their application to wastewater and surface water samples. Because PPCPs are generally micro-pollutants, emphasis is on the development of enrichment and pre-concentration techniques using various means of solid-phase extraction. The research in the subtasks focuses on the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water, and ORD in the area of Water Quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCP

  14. Trial analytics--a tool for clinical trial management.

    PubMed

    Bose, Anindya; Das, Suman

    2012-01-01

    Prolonged timelines and large expenses associated with clinical trials have prompted a new focus on improving the operational efficiency of clinical trials through the use of Clinical Trial Management Systems (CTMS), with the aim of improving managerial control of trial conduct. However, current CTMS are unable to meet expectations due to various shortcomings, such as the inability to report in a timely manner and to visualize trends within and beyond an organization. To overcome these shortcomings of CTMS, clinical researchers can apply a business intelligence (BI) framework to create Clinical Research Intelligence (CLRI) for the optimization of data collection and analytics. This paper proposes the use of an innovative and collaborative visualization tool (CTA) as a CTMS "add-on" to help overcome these deficiencies of traditional CTMS, with suitable examples.

  15. Electronic tongue: An analytical gustatory tool.

    PubMed

    Latha, Rewanthwar Swathi; Lakshmi, P K

    2012-01-01

    Taste is an important organoleptic property governing acceptance of products for administration through the mouth. However, the majority of drugs available are bitter in taste. For patient acceptability and compliance, bitter-tasting drugs are masked by adding several flavoring agents. Thus, taste assessment is one important quality control parameter for evaluating taste-masked formulations. The primary method for the taste measurement of drug substances and formulations is by human panelists. The use of sensory panelists in industry is difficult and problematic due to the potential toxicity of drugs and the subjectivity of taste panelists; recruiting panelists, maintaining motivation, and panel maintenance are all significantly more difficult when working with unpleasant products. Furthermore, molecules not approved by the Food and Drug Administration (FDA) cannot be tested. Therefore, an analytical taste-sensing multichannel sensory system called the electronic tongue (e-tongue or artificial tongue), which can assess taste, has been replacing sensory panelists. The e-tongue thus offers benefits such as reduced reliance on human panels. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields.

  16. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly devoid of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  17. Informetrics: Exploring Databases as Analytical Tools.

    ERIC Educational Resources Information Center

    Wormell, Irene

    1998-01-01

    Advanced online search facilities and information retrieval techniques have increased the potential of bibliometric research. Discusses three case studies carried out by the Centre for Informetric Studies at the Royal School of Library Science (Denmark) on the internationality of international journals, informetric analyses on the World Wide Web,…

  18. Tool for Ranking Research Options

    NASA Technical Reports Server (NTRS)

    Ortiz, James N.; Scott, Kelly; Smith, Harold

    2005-01-01

    Tool for Research Enhancement Decision Support (TREDS) is a computer program developed to assist managers in ranking options for research aboard the International Space Station (ISS). It could likely also be adapted to perform similar decision-support functions in industrial and academic settings. TREDS provides a ranking of the options, based on a quantifiable assessment of all the relevant programmatic decision factors of benefit, cost, and risk. The computation of the benefit for each option is based on a figure of merit (FOM) for ISS research capacity that incorporates both quantitative and qualitative inputs. Qualitative inputs are gathered and partly quantified by use of the time-tested analytic hierarchy process (AHP) and used to set weighting factors in the FOM corresponding to priorities determined by the cognizant decision maker(s). Then, by use of algorithms developed specifically for this application, TREDS adjusts the projected benefit for each option on the basis of levels of technical implementation, cost, and schedule risk. Based partly on Excel spreadsheets, TREDS provides screens for entering cost, benefit, and risk information. Drop-down boxes are provided for entry of qualitative information. TREDS produces graphical output in multiple formats that can be tailored by users.
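    The AHP weighting step described above can be sketched in a few lines. The pairwise-comparison matrix, factor names, and option scores below are hypothetical, and this is a generic illustration of the analytic hierarchy process rather than the TREDS implementation:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three decision factors
# (benefit, cost, risk) on the standard 1-9 AHP scale; entry [i][j]
# states how strongly factor i is preferred over factor j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1.0 / 3.0, 1.0, 2.0],
    [1.0 / 5.0, 1.0 / 2.0, 1.0],
])

# AHP derives the priority weights from the principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()

# Figure of merit for one research option: weighted sum of its
# (hypothetical) normalized factor scores.
scores = np.array([0.8, 0.4, 0.6])
fom = float(weights @ scores)
```

    The derived weights would then scale the benefit, cost, and risk terms of the figure of merit for every research option being ranked.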

  19. Analytical Tools for Cloudscope Ice Measurement

    NASA Technical Reports Server (NTRS)

    Arnott, W. Patrick

    1998-01-01

    The cloudscope is a ground or aircraft instrument for viewing ice crystals impacted on a sapphire window. It is essentially a simple optical microscope with an attached compact CCD video camera whose output is recorded on a Hi-8 mm video cassette recorder equipped with digital time and date recording capability. In aircraft operation the window is at a stagnation point of the flow, so adiabatic compression heats the window to sublimate the ice crystals so that later impacting crystals can be imaged as well. A film heater is used for ground-based operation to provide sublimation, and it can also be used to provide extra heat for aircraft operation. The compact video camera can be focused manually by the operator, and a beam-splitter and miniature-bulb combination provides illumination for night operation. Several shutter speeds are available to accommodate daytime illumination conditions in direct sunlight. The video images can be directly used to qualitatively assess the crystal content of cirrus clouds and contrails. Quantitative size spectra are obtained with the tools described in this report. Selected portions of the video images are digitized using a PCI-bus frame grabber to form a short movie segment or stack using NIH (National Institutes of Health) Image software with custom macros developed at DRI. The stack can be Fourier-transform filtered with custom, easy-to-design filters to reduce most objectionable video artifacts. Particle quantification of each slice of the stack is performed using digital image analysis. Data recorded for each particle include particle number and centroid, frame number in the stack, particle area, perimeter, equivalent ellipse maximum and minimum radii, ellipse angle, and pixel number. Each valid particle in the stack is stamped with a unique number. This output can be used to obtain a semiquantitative appreciation of the crystal content. The particle information becomes the raw input for a subsequent program (FORTRAN) that

  20. Empire: An Analytical Category for Educational Research

    ERIC Educational Resources Information Center

    Coloma, Roland Sintos

    2013-01-01

    In this article Roland Sintos Coloma argues for the relevance of empire as an analytical category in educational research. He points out the silence in mainstream studies of education on the subject of empire, the various interpretive approaches to deploying empire as an analytic, and the importance of indigeneity in research on empire and…

  1. Analytical Ultrasonics in Materials Research and Testing

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1986-01-01

    Research results in analytical ultrasonics for characterizing structural materials from metals and ceramics to composites are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.

  2. ANALYTICAL TOOL DEVELOPMENT FOR AFTERTREATMENT SUB-SYSTEMS INTEGRATION

    SciTech Connect

    Bolton, B; Fan, A; Goney, K; Pavlova-MacKinnon, Z; Sisken, K; Zhang, H

    2003-08-24

    The stringent emissions standards of 2007 and beyond require complex engine, aftertreatment and vehicle systems with a high degree of sub-system interaction and flexible control solutions. This necessitates a system-based approach to technology development, in addition to individual sub-system optimization. Analytical tools can provide an effective means to evaluate and develop such complex technology interactions, as well as to understand phenomena that are either too expensive or impossible to study with conventional experimental means. The analytical effort can also guide experimental development and thus lead to efficient utilization of available experimental resources. A suite of analytical models has been developed to represent PM and NOx aftertreatment sub-systems. These models range from computationally inexpensive zero-dimensional models for real-time control applications to CFD-based, multi-dimensional models with detailed temporal and spatial resolution. Such models, in conjunction with well-established engine modeling tools such as engine cycle simulation, engine controls modeling, CFD models of non-combusting and combusting flow, and vehicle models, provide a comprehensive analytical toolbox for complete engine, aftertreatment and vehicle sub-systems development and system integration applications. However, the fidelity of aftertreatment models and their application going forward is limited by the lack of fundamental kinetic data.

  3. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted high interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have been shown to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization, and applications carried out by the Spanish scientific community is presented. A brief explanation of the theoretical aspects that highlight their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551

  4. Analytical Research on Developmental Aspects of Metamemory.

    ERIC Educational Resources Information Center

    Plude, Dana J.; Nelson, Thomas O.; Scholnick, Ellin K.

    1998-01-01

    Reviews selected pioneering findings in the child-developmental and adulthood-aging literature and evaluates them within the framework of Nelson (Thomas O.) and Narens' (Louis) (1990) theory of metamemory. Makes suggestions for conceptually-based analytical research to help specify the mechanisms that underlie developmental differences in…

  5. Promoting Efficacy Research on Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Maitland, Daniel W. M.; Gaynor, Scott T.

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a form of therapy grounded in behavioral principles that utilizes therapist reactions to shape target behavior. Despite a growing literature base, there is a paucity of research to establish the efficacy of FAP. As a general approach to psychotherapy, and how the therapeutic relationship produces change,…

  6. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and shall propose improvements in system-wide models and analytical tools required for the evaluation and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In...

  7. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and shall propose improvements in system-wide models and analytical tools required for the evaluation and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In...

  8. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and shall propose improvements in system-wide models and analytical tools required for the evaluation and... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In...

  9. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    SciTech Connect

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. 
The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating between
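    The space-time cluster search mentioned above can be illustrated with a deliberately simplified screen. The function, window length, and threshold below are hypothetical stand-ins for the project's actual pattern-recognition algorithm:

```python
def flag_outbreaks(counts, window=3, threshold=2.0):
    """Flag regions whose recent mean daily case count exceeds the
    region's historical mean by `threshold` standard deviations.

    `counts` maps region -> chronological list of daily case counts.
    A simplified z-score screen, not a true space-time scan statistic.
    """
    flags = []
    for region, series in counts.items():
        if len(series) <= window:
            continue  # not enough history to form a baseline
        baseline = series[:-window]
        mean = sum(baseline) / len(baseline)
        var = sum((x - mean) ** 2 for x in baseline) / len(baseline)
        std = var ** 0.5 or 1.0  # guard against a flat baseline
        recent = sum(series[-window:]) / window
        if (recent - mean) / std > threshold:
            flags.append(region)
    return flags

# Region "A" spikes over its last three days; "B" stays near baseline.
alerts = flag_outbreaks({
    "A": [2, 3, 2, 3, 2, 9, 10, 11],
    "B": [2, 2, 3, 2, 2, 2, 3, 2],
})
```

    A production system would replace this z-score screen with a proper scan statistic evaluated over both spatial and temporal windows.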

  10. A Tool for Medical Research

    NASA Technical Reports Server (NTRS)

    1992-01-01

    California Measurements, Inc.'s PC-2 Aerosol Particle Analyzer, developed by William Chiang, a former Jet Propulsion Laboratory (JPL) engineer, was used in a study to measure the size of particles in the medical environment. Chiang has a NASA license for the JPL crystal oscillator technology and originally built the instrument for atmospheric research. In the operating room, it enabled researchers from the University of California to obtain multiple sets of data repeatedly and accurately. The study concluded that significant amounts of aerosols are generated during surgery when power tools are employed, and most of these are in the respirable size. Almost all contain blood and are small enough to pass through surgical masks. Research on the presence of blood aerosols during oral surgery had similar results. Further studies are planned to determine the possibility of HIV transmission during surgery, and the PC-2H will be used to quantify blood aerosols.

  11. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    ERIC Educational Resources Information Center

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified: the journal is the most significant new initiative of SoLAR.

  12. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

    Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example, the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  13. Development of computer-based analytical tool for assessing physical protection system

    SciTech Connect

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-22

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenario. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.
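    The network approach to adversary-path modelling can be sketched generically: treat facility locations as graph nodes, assign each path segment a detection probability, and search for the path that maximizes the adversary's chance of evading detection. The function, node names, and probabilities below are hypothetical, not the tool described in the abstract:

```python
import heapq
import math

def most_critical_path(graph, start, target):
    """Find the adversary path with the highest probability of evading
    detection. `graph[u]` maps neighbor v -> detection probability on
    the u->v segment. Taking -log(1 - p) as the edge weight turns the
    product of evasion probabilities into a sum, so an ordinary
    shortest-path search (Dijkstra) finds the weakest line of protection.
    """
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == target:
            break
        for v, p_detect in graph.get(u, {}).items():
            w = -math.log(max(1.0 - p_detect, 1e-12))
            if d + w < dist.get(v, math.inf):
                dist[v] = d + w
                prev[v] = u
                heapq.heappush(heap, (d + w, v))
    # Walk the predecessor chain back from the target.
    path = [target]
    while path[-1] != start:
        path.append(prev[path[-1]])
    path.reverse()
    return path, math.exp(-dist[target])  # path and its evasion probability
```

    System effectiveness along the weakest line of protection can then be estimated as one minus the returned evasion probability.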

  14. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenario. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.

  15. MATRIICES - Mass Analytical Tool for Reactions in Interstellar ICES

    NASA Astrophysics Data System (ADS)

    Isokoski, K.; Bossa, J. B.; Linnartz, H.

    2011-05-01

    The formation of complex organic molecules (COMs) observed in the inter- and circumstellar medium (ISCM) is driven by a complex chemical network yet to be fully characterized. Interstellar dust grains and the surrounding ice mantles, subject to atom bombardment, UV irradiation, and thermal processing, are believed to provide catalytic sites for such chemistry. However, the solid-state chemical processes and the level of complexity reachable under astronomical conditions remain poorly understood. The conventional laboratory techniques used to characterize solid-state reaction pathways - RAIRS (Reflection Absorption IR Spectroscopy) and TPD (Temperature-Programmed Desorption) - are suitable for the analysis of reactions in ices made of relatively small molecules. For more complex ices comprising a series of different components relevant to the interstellar medium, spectral overlapping prohibits unambiguous identification of reaction schemes, and these techniques start to fail. Therefore, we have constructed a new and innovative experimental setup for the study of complex interstellar ices, featuring a highly sensitive and unambiguous detection method. MATRIICES (Mass Analytical Tool for Reactions in Interstellar ICES) combines laser ablation with a molecular beam experiment and time-of-flight mass spectrometry (LA-TOF-MS) to sample and analyze ice analogues in situ, at native temperatures, under clean ultra-high-vacuum conditions. The method allows direct sampling and analysis of the ice constituents in real time, using a pulsed UV ablation laser (355-nm Nd:YAG) to vaporize the products in a MALDI-TOF-like detection scheme. The ablated material is caught in a synchronously pulsed molecular beam of inert carrier gas (He) from a supersonic valve, and analysed in a reflectron time-of-flight mass spectrometer. The detection limit of the method is expected to substantially exceed that of regular surface techniques.
The ultimate goal is to fully
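The TOF detection scheme described above rests on the ideal relation t = L·sqrt(m / (2 z e U)): flight time grows with the square root of the mass-to-charge ratio. A minimal sketch of that relation follows; the drift length and acceleration voltage are illustrative placeholders, not actual MATRIICES parameters.

```python
import math

E_CHARGE = 1.602176634e-19  # elementary charge, C
AMU = 1.66053906660e-27     # atomic mass unit, kg

def tof_flight_time(mass_amu, charge=1, drift_length_m=1.0, accel_voltage=3000.0):
    """Ideal time-of-flight (s) for an ion of mass_amu accelerated through
    accel_voltage and drifting over drift_length_m: t = L * sqrt(m / (2 z e U))."""
    m = mass_amu * AMU
    return drift_length_m * math.sqrt(m / (2 * charge * E_CHARGE * accel_voltage))

# Heavier ions arrive later; flight time scales with sqrt(m/z).
t_co = tof_flight_time(28.0)     # CO+
t_ch3oh = tof_flight_time(32.0)  # CH3OH+, a typical COM-related ion
print(f"CO+:    {t_co * 1e6:.2f} us")
print(f"CH3OH+: {t_ch3oh * 1e6:.2f} us")
```

The microsecond-scale separation between adjacent masses is what the reflectron geometry then sharpens into resolvable peaks.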

  16. Scalable Combinatorial Tools for Health Disparities Research

    PubMed Central

    Langston, Michael A.; Levine, Robert S.; Kilbourne, Barbara J.; Rogers, Gary L.; Kershenbaum, Anne D.; Baktash, Suzanne H.; Coughlin, Steven S.; Saxton, Arnold M.; Agboto, Vincent K.; Hood, Darryl B.; Litchveld, Maureen Y.; Oyana, Tonny J.; Matthews-Juarez, Patricia; Juarez, Paul D.

    2014-01-01

    Despite staggering investments made in unraveling the human genome, current estimates suggest that as much as 90% of the variance in cancer and chronic diseases can be attributed to factors outside an individual’s genetic endowment, particularly to environmental exposures experienced across his or her life course. New analytical approaches are clearly required as investigators turn to complicated systems theory and ecological, place-based and life-history perspectives in order to understand more clearly the relationships between social determinants, environmental exposures and health disparities. While traditional data analysis techniques remain foundational to health disparities research, they are easily overwhelmed by the ever-increasing size and heterogeneity of available data needed to illuminate latent gene x environment interactions. This has prompted the adaptation and application of scalable combinatorial methods, many from genome science research, to the study of population health. Most of these powerful tools are algorithmically sophisticated, highly automated and mathematically abstract. Their utility motivates the main theme of this paper, which is to describe real applications of innovative transdisciplinary models and analyses in an effort to help move the research community closer toward identifying the causal mechanisms and associated environmental contexts underlying health disparities. The public health exposome is used as a contemporary focus for addressing the complex nature of this subject. PMID:25310540

  17. Assessment Tools' Indicators for Sustainability in Universities: An Analytical Overview

    ERIC Educational Resources Information Center

    Alghamdi, Naif; den Heijer, Alexandra; de Jonge, Hans

    2017-01-01

    Purpose: The purpose of this paper is to analyse 12 assessment tools of sustainability in universities and develop the structure and the contents of these tools to be more intelligible. The configuration of the tools reviewed highlights indicators that clearly communicate only the essential information. This paper explores how the theoretical…

  18. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality-by-design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as analytical quality by design (AQbD). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within that region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses the different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT).

  19. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality-by-design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as analytical quality by design (AQbD). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within that region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses the different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT). PMID:25722723

  20. MHK Research, Tools, and Methods

    SciTech Connect

    Jepsen, Richard

    2011-11-02

    Presentation from the 2011 Water Peer Review in which principal investigator discusses improved testing, analysis, and design tools needed to more accurately model operational conditions, to optimize design parameters, and predict technology viability.

  1. Plant Aquaporins: Genome-Wide Identification, Transcriptomics, Proteomics, and Advanced Analytical Tools

    PubMed Central

    Deshmukh, Rupesh K.; Sonah, Humira; Bélanger, Richard R.

    2016-01-01

    Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation with recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is

  2. Single cell analytic tools for drug discovery and development

    PubMed Central

    Heath, James R.; Ribas, Antoni; Mischel, Paul S.

    2016-01-01

    The genetic, functional, or compositional heterogeneity of healthy and diseased tissues presents major challenges in drug discovery and development.1-3 In cancers, heterogeneity may be essential for tumor stability,4 but its precise role in tumor biology is poorly resolved. This challenges the design of accurate disease models for use in drug development, and can confound the interpretation of biomarker levels, and of patient responses to specific therapies. The complex nature of heterogeneous tissues has motivated the development of tools for single cell genomic, transcriptomic, and multiplex proteomic analysis. We review these tools, assess their advantages and limitations, and explore their potential applications in drug discovery and development. PMID:26669673

  3. Observatory Bibliographies as Research Tools

    NASA Astrophysics Data System (ADS)

    Rots, Arnold H.; Winkelman, S. L.

    2013-01-01

    Traditionally, observatory bibliographies were maintained to provide insight into how successful an observatory is, as measured by its prominence in the (refereed) literature. When we set up the bibliographic database for the Chandra X-ray Observatory (http://cxc.harvard.edu/cgi-gen/cda/bibliography) as part of the Chandra Data Archive (http://cxc.harvard.edu/cda/), very early in the mission, our objective was to make it primarily a useful tool for our user community. To achieve this we are: (1) casting a very wide net in collecting Chandra-related publications; (2) including for each literature reference in the database a wealth of metadata that is useful for the users; and (3) providing specific links between the articles and the datasets in the archive that they use. As a result our users are able to browse the literature and the data archive simultaneously. As an added bonus, the rich metadata content and data links have also allowed us to assemble more meaningful statistics about the scientific efficacy of the observatory. In all this we collaborate closely with the Astrophysics Data System (ADS). Among the plans for future enhancement are the inclusion of press releases and the Chandra image gallery, linking with ADS semantic searching tools, full-text metadata mining, and linking with other observatories' bibliographies. This work is supported by NASA contract NAS8-03060 (CXC) and depends critically on the services provided by the ADS.

  4. Blogging as a Research Tool

    NASA Astrophysics Data System (ADS)

    Sweetser, Douglas

    2011-11-01

    I work on variations of the Maxwell Lagrange density using quaternions and hypercomplex products of covariant 4-derivatives and 4-potentials. The hope is to unify gravity with the symmetries found in the standard model. It is difficult for someone outside academia to get constructive criticism. I have chosen to blog once a week at Science20.com since March, 2011. Over thirty blogs have been generated, most getting more than a thousand views (high mark is 5k for ``Why Quantum Mechanics is Wierd''). The tools used for web and video blogging will be reviewed. A discussion of my efforts to represent electroweak symmetry with quaternions convinced me I was in error. Instead, my hope is to exploit the observation that U(1) is formally a subgroup of SU(2). A battle over gauge symmetry may be reviewed.

  5. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  6. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines.The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  7. Galileo's Discorsi as a Tool for the Analytical Art.

    PubMed

    Raphael, Renee Jennifer

    2015-01-01

    A heretofore overlooked response to Galileo's 1638 Discorsi is described by examining two extant copies of the text (one of which has received little attention in the historiography, the other apparently unknown) that are heavily annotated. It is first demonstrated that these copies contain annotations made by Seth Ward and Sir Christopher Wren. This article then examines one feature of Ward's and Wren's responses to the Discorsi, namely their decision to re-write several of Galileo's geometrical demonstrations into the language of symbolic algebra. It is argued that this type of active reading of period mathematical texts may have been part of the regular scholarly and pedagogical practices of early modern British mathematicians like Ward and Wren. A set of Appendices contains a transcription and translation of the analytical solutions found in these annotated copies.

  8. Polymerase chain reaction technology as analytical tool in agricultural biotechnology.

    PubMed

    Lipp, Markus; Shillito, Raymond; Giroux, Randal; Spiegelhalter, Frank; Charlton, Stacy; Pinero, David; Song, Ping

    2005-01-01

    The agricultural biotechnology industry applies polymerase chain reaction (PCR) technology at numerous points in product development. Commodity and food companies as well as third-party diagnostic testing companies also rely on PCR technology for a number of purposes. The primary use of the technology is to verify the presence or absence of genetically modified (GM) material in a product or to quantify the amount of GM material present in a product. This article describes the fundamental elements of PCR analysis and its application to the testing of grains. The document highlights the many areas to which attention must be paid in order to produce reliable test results. These include sample preparation, method validation, choice of appropriate reference materials, and biological and instrumental sources of error. The article also discusses issues related to the analysis of different matrixes and the effect they may have on the accuracy of the PCR analytical results.
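A central quantitative step the abstract alludes to, calibrating quantitative PCR against certified reference materials, is the standard-curve fit: Ct is linear in log10(copies), and the slope yields the amplification efficiency. A minimal sketch under illustrative dilution-series numbers (not data from the article):

```python
def fit_standard_curve(log10_copies, ct_values):
    """Least-squares fit Ct = slope * log10(copies) + intercept.
    Amplification efficiency E = 10**(-1/slope) - 1 (1.0 means 100%)."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(ct_values) / n
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
    slope = sxy / sxx
    intercept = my - slope * mx
    efficiency = 10 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

def copies_from_ct(ct, slope, intercept):
    """Interpolate an unknown sample's copy number from its Ct."""
    return 10 ** ((ct - intercept) / slope)

# Illustrative 10-fold dilution series: a near-ideal assay loses ~3.3 cycles
# per decade, i.e. slope ~ -3.3 and efficiency ~ 100%.
logs = [6, 5, 4, 3, 2]
cts = [15.1, 18.4, 21.7, 25.0, 28.3]
slope, intercept, eff = fit_standard_curve(logs, cts)
```

Deviations of the slope from about -3.32 are one of the instrumental error sources the article discusses, since they bias quantitative GM-content estimates.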

  9. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

    A brief description of many of the widely used tools is presented. Of this list, those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology along with being truly surface sensitive (that is less than 10 atomic layers) are presented. The latter group is critiqued in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under real conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  10. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, Detlef P.; Lucas, Paul L.; Häyhä, Tiina; Cornell, Sarah E.; Stafford-Smith, Mark

    2016-03-01

    There is a need for more integrated research on sustainable development and global environmental change. In this paper, we focus on the planetary boundaries framework to provide a systematic categorization of key research questions in relation to avoiding severe global environmental degradation. The four categories of key questions are those that relate to (1) the underlying processes and selection of key indicators for planetary boundaries, (2) understanding the impacts of environmental pressure and connections between different types of impacts, (3) better understanding of different response strategies to avoid further degradation, and (4) the available instruments to implement such strategies. Clearly, different categories of scientific disciplines and associated model types exist that can accommodate answering these questions. We identify the strengths and weaknesses of different research areas in relation to the question categories, focusing specifically on different types of models. We argue that more interdisciplinary research is needed to increase our understanding by better linking human drivers with social and biophysical impacts. This requires better collaboration between relevant disciplines (associated with the model types), either by exchanging information or by fully linking or integrating them. As fully integrated models can become too complex, the appropriate type of model (the racehorse) should be applied for answering the target research question (the race course).

  11. Parallel software tools at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Tennille, Geoffrey M.; Lakeotes, Christopher D.; Randall, Donald P.; Arthur, Jarvis J.; Hammond, Dana P.; Mall, Gerald H.

    1993-01-01

    This document gives a brief overview of parallel software tools available on the Intel iPSC/860 parallel computer at Langley Research Center. It is intended to provide a source of information that is somewhat more concise than vendor-supplied material on the purpose and use of various tools. Each of the chapters on tools is organized in a similar manner covering an overview of the functionality, access information, how to effectively use the tool, observations about the tool and how it compares to similar software, known problems or shortfalls with the software, and reference documentation. It is primarily intended for users of the iPSC/860 at Langley Research Center and is appropriate for both the experienced and novice user.

  12. Process models: analytical tools for managing industrial energy systems

    SciTech Connect

    Howe, S O; Pilati, D A; Balzer, C; Sparrow, F T

    1980-01-01

    How the process models developed at BNL are used to analyze industrial energy systems is described and illustrated. Following a brief overview of the industry modeling program, the general methodology of process modeling is discussed. The discussion highlights the important concepts, contents, inputs, and outputs of a typical process model. A model of the US pulp and paper industry is then discussed as a specific application of process modeling methodology. Applications addressed with the case study results include projections of energy demand, conservation technology assessment, energy-related tax policies, and sensitivity analysis. A subsequent discussion of these results supports the conclusion that industry process models are versatile and powerful tools for managing industrial energy systems.

  13. Horses for courses: analytical tools to explore planetary boundaries

    NASA Astrophysics Data System (ADS)

    van Vuuren, D. P.; Lucas, P. L.; Häyhä, T.; Cornell, S. E.; Stafford-Smith, M.

    2015-09-01

    There is a need for further integrated research on developing a set of sustainable development objectives, based on the proposed framework of planetary boundaries indicators. The relevant research questions are divided in this paper into four key categories, related to the underlying processes and selection of key indicators, understanding the impacts of different exposure levels and the influence of connections between different types of impacts, a better understanding of different response strategies, and the available options to implement changes. Clearly, different categories of scientific disciplines and associated models exist that can contribute to the necessary analysis, noting that the distinctions between them are fuzzy. In the paper, we indicate both how different models relate to the four categories of questions and how further insights can be obtained by connecting the different disciplines (without necessarily fully integrating them). Research on integration can support planetary boundary quantification in a credible way, linking human drivers and social and biophysical impacts.

  14. Analytical and Semi-Analytical Tools for the Design of Oscillatory Pumping Tests.

    PubMed

    Cardiff, Michael; Barrash, Warren

    2015-01-01

    Oscillatory pumping tests, in which flow is varied in a periodic fashion, provide a method for understanding aquifer heterogeneity that is complementary to strategies such as slug testing and constant-rate pumping tests. During oscillatory testing, pressure data collected at non-pumping wells can be processed to extract metrics, such as signal amplitude and phase lag, from a time series. These metrics are robust against common sensor problems (including drift and noise) and have been shown to provide information about aquifer heterogeneity. Field implementations of oscillatory pumping tests for characterization, however, are not common and thus there are few guidelines for their design and implementation. Here, we use available analytical solutions from the literature to develop design guidelines for oscillatory pumping tests, while considering practical field constraints. We present two key analytical results for design and analysis of oscillatory pumping tests. First, we provide methods for choosing testing frequencies and flow rates which maximize the signal amplitude that can be expected at a distance from an oscillating pumping well, given design constraints such as maximum/minimum oscillator frequency and maximum volume cycled. Preliminary data from field testing helps to validate the methodology. Second, we develop a semi-analytical method for computing the sensitivity of oscillatory signals to spatially distributed aquifer flow parameters. This method can be quickly applied to understand the "sensed" extent of an aquifer at a given testing frequency. Both results can be applied given only bulk aquifer parameter estimates, and can help to optimize design of oscillatory pumping test campaigns.
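The amplitude and phase-lag metrics mentioned above are typically recovered by projecting the detrended pressure record onto sine and cosine components at the known stimulation frequency. A minimal least-squares sketch (synthetic data with illustrative values, not the authors' field records or their exact estimator):

```python
import math

def amplitude_phase(times, heads, freq_hz):
    """Fit h(t) ~ A*cos(w t) + B*sin(w t) by least squares at a known
    stimulation frequency; return (amplitude, phase_lag_rad). Assumes the
    record has been demeaned/detrended and spans whole periods, so the
    sin/cos cross terms are negligible."""
    w = 2 * math.pi * freq_hz
    mean = sum(heads) / len(heads)
    h = [x - mean for x in heads]
    ca = sum(x * math.cos(w * t) for t, x in zip(times, h))
    sa = sum(x * math.sin(w * t) for t, x in zip(times, h))
    cc = sum(math.cos(w * t) ** 2 for t in times)
    ss = sum(math.sin(w * t) ** 2 for t in times)
    A, B = ca / cc, sa / ss
    return math.hypot(A, B), math.atan2(B, A)

# Synthetic observation-well record: 2 mm amplitude, 0.6 rad lag, 0.02 Hz,
# sampled at 2 Hz over 20 full periods.
f = 0.02
ts = [i * 0.5 for i in range(2000)]
obs = [0.002 * math.cos(2 * math.pi * f * t - 0.6) for t in ts]
amp, lag = amplitude_phase(ts, obs, f)
```

Because only two numbers per frequency are extracted, slow sensor drift and broadband noise largely average out, which is the robustness property the abstract highlights.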

  15. Research as an educational tool

    SciTech Connect

    Neff, R.; Perlmutter, D.; Klaczynski, P.

    1994-12-31

    Our students have participated in original group research projects focused on the natural environment which culminate in a written manuscript published in-house, and an oral presentation to peers, faculty, and the university community. Our goal has been to develop their critical thinking skills so that they will be more successful in high school and college. We have served ninety-three students (47.1% white, 44.1% black, 5.4% hispanic, 2.2% American Indian, 1.2% asian) from an eight state region in the southeast over the past three years. Thirty-one students have graduated from high school with over 70% enrolled in college and another thirty-four are seniors this year. We are tracking students' progress in college and are developing our own critical thinking test to measure the impact of our program. Although preliminary, the results from the critical thinking test indicated that students are often prone to logical errors; however, higher levels of critical thinking were observed on items which raised issues that conflicted with students' pre-existing beliefs.

  16. Quality management system for application of the analytical quality assurance cycle in a research project

    NASA Astrophysics Data System (ADS)

    Camargo, R. S.; Olivares, I. R. B.

    2016-07-01

    The lack of quality assurance and quality control in academic activities has been recognized by the inability to demonstrate reproducibility. This paper aims to apply a quality tool called the Analytical Quality Assurance Cycle to a specific research project, supported by a Verification Programme of equipment and an adapted Quality Management System based on international standards, to provide traceability to the data generated.

  17. Planning Research on Student Services: Variety in Research Tools.

    ERIC Educational Resources Information Center

    Hom, Willard C.

    This paper discusses the seven types of research tools that have potential for advancing knowledge about student services in California Community Colleges. The seven tools are the following: literature review, data validation, survey research, case study, quasi experiment, meta analysis, and statistical modeling. The report gives reasons why each…

  18. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    PubMed Central

    2012-01-01

    Background There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working on public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods EPIPOI is freely available software developed in Matlab (The Mathworks Inc) that runs both on PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses including the comparison of spatial patterns in temporal parameters. Results EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts from didactic use in public health workshops to the main analytical tool in published research. Conclusions EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit by enabling a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics. PMID:23153033
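The trend-and-seasonality parameters EPIPOI extracts are commonly obtained by harmonic regression; a minimal sketch of the annual-harmonic case follows (illustrative synthetic counts, not EPIPOI's actual implementation, which also handles trends, higher harmonics, and anomalies):

```python
import math

def annual_harmonic(monthly_counts):
    """Fit c[m] ~ mean + A*cos(2*pi*m/12) + B*sin(2*pi*m/12) to one year of
    monthly counts; return (seasonal_amplitude, peak_month_index) with 0=Jan.
    The Fourier coefficients are 2/n times the projections onto cos and sin."""
    n = 12
    mean = sum(monthly_counts) / n
    A = sum((c - mean) * math.cos(2 * math.pi * m / n)
            for m, c in enumerate(monthly_counts)) * 2 / n
    B = sum((c - mean) * math.sin(2 * math.pi * m / n)
            for m, c in enumerate(monthly_counts)) * 2 / n
    amp = math.hypot(A, B)
    phase = math.atan2(B, A)  # peak where cos(2*pi*m/12 - phase) = 1
    peak_month = round(phase * n / (2 * math.pi)) % n
    return amp, peak_month

# Illustrative winter-peaking series: baseline 120 cases, peak in January.
counts = [120 + 40 * math.cos(2 * math.pi * m / 12) for m in range(12)]
amp, peak = annual_harmonic(counts)
```

Comparing the fitted amplitude and peak timing across geographic regions is exactly the kind of spatial comparison of temporal parameters the abstract describes.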

  19. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    SciTech Connect

    Preisler, Jan

    1997-01-01

    In this work, we use lasers to enhance possibilities of laser desorption methods and to optimize coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan plumes desorbed at atmospheric pressure via absorption. All absorbing species, including neutral molecules, are monitored. Interesting features, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot are observed. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals negative influence of particle spallation on MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone along the length of the capillary excited by 488-nm Ar-ion laser. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7. 
The increase of p
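The order-of-magnitude EOF reduction reported above is conventionally expressed as an electroosmotic mobility, mu = (L_det * L_tot) / (V * t), from the migration time of a neutral marker. A minimal sketch with illustrative capillary dimensions and times (not the measured values from this work, which tracked the marker zone by imaging):

```python
def eof_mobility(detector_length_cm, total_length_cm, voltage_v, migration_time_s):
    """Electroosmotic mobility (cm^2 V^-1 s^-1) from a neutral marker's
    migration time: mu = (L_det * L_tot) / (V * t)."""
    return (detector_length_cm * total_length_cm) / (voltage_v * migration_time_s)

# Illustrative comparison: bare fused silica vs. a PEO-coated capillary
# at the same applied voltage; the coated marker takes far longer to arrive.
mu_bare = eof_mobility(40.0, 50.0, 20000.0, 200.0)
mu_peo = eof_mobility(40.0, 50.0, 20000.0, 2400.0)
ratio = mu_bare / mu_peo  # > 10, i.e. more than an order of magnitude slower
```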

  20. Enabling Research Tools for Sustained Climate Assessment

    NASA Technical Reports Server (NTRS)

    Leidner, Allison K.; Bosilovich, Michael G.; Jasinski, Michael F.; Nemani, Ramakrishna R.; Waliser, Duane Edward; Lee, Tsengdar J.

    2016-01-01

    The U.S. Global Change Research Program Sustained Assessment process benefits from long-term investments in Earth science research that enable the scientific community to conduct assessment-relevant science. To this end, NASA initiated several research programs over the past five years to support the Earth observation community in developing indicators, datasets, research products, and tools to support ongoing and future National Climate Assessments. These activities complement NASA's ongoing Earth science research programs. One aspect of the assessment portfolio funds four "enabling tools" projects at NASA research centers. Each tool leverages existing capacity within the center, but has developed tailored applications and products for National Climate Assessments. The four projects build on the capabilities of a global atmospheric reanalysis (MERRA-2), a continental U.S. land surface reanalysis (NCA-LDAS), the NASA Earth Exchange (NEX), and a Regional Climate Model Evaluation System (RCMES). Here, we provide a brief overview of each enabling tool, highlighting the ways in which it has advanced assessment science to date. We also discuss how the assessment community can access and utilize these tools for National Climate Assessments and other sustained assessment activities.

  1. Bringing Research Tools into the Classroom

    ERIC Educational Resources Information Center

    Shubert, Charles; Ceraj, Ivica; Riley, Justin

    2009-01-01

    The advancement of computer technology used for research is creating the need to change the way classes are taught in higher education. "Bringing Research Tools into the Classroom" has become a major focus of the work of the Office of Educational Innovation and Technology (OEIT) for the Dean of Undergraduate Education (DUE) at the…

  2. High fidelity simulation as a research tool.

    PubMed

    Littlewood, Keith E

    2011-12-01

    Medical simulation has grown explosively over the last decade. Simulation is becoming commonplace in clinical education but can also be used as an investigative clinical tool in its own right. There are thus two arms of simulation in clinical research. The first is investigation of the clinical impact of simulation as an educational tool and the second as an instrument to assess the function of clinical practitioners and systems. This article reviews the terminology, current practice and current research in simulation. The use of simulation in assessment of the clinical performance of devices, people and systems will then be discussed and some current work in these areas presented. Finally, medical simulation will be discussed within the paradigm of translational research. Early examples of this 'tool-bench to bedside' model will be presented as possible prototypes for future work directed towards patient safety.

  3. Biomedical research tools from the seabed.

    PubMed

    Folmer, Florence; Houssen, Wael E; Scott, Roderick H; Jaspars, Marcel

    2007-03-01

    This review covers the applications of small-molecule and peptidic compounds isolated from marine organisms for biomedical research. Enzymes and proteins from marine sources are already on the market for biomedical applications, but the use of small-molecule biomedical research tools of marine origin is less developed. For many studies involving these molecules the ultimate goal is the application of small-molecule therapeutics in the clinic, but those that do not succeed in the clinic still have clearly defined biological activities, which may be of use as biomedical research tools. In other cases, the investigation of marine-derived compounds has led directly to the discovery of therapeutics with clinical applications. Both as tools and therapeutics, these small-molecule compounds are effective for investigating biological processes, and in this review the authors have chosen to concentrate on the ability of marine natural products to affect membrane processes, ion channels and intracellular processes.

  4. Chemometric classification techniques as a tool for solving problems in analytical chemistry.

    PubMed

    Bevilacqua, Marta; Nescatelli, Riccardo; Bucci, Remo; Magrì, Andrea D; Magrì, Antonio L; Marini, Federico

    2014-01-01

    Supervised pattern recognition (classification) techniques, i.e., the family of chemometric methods whose aim is the prediction of a qualitative response on a set of samples, represent a very important assortment of tools for solving problems in several areas of applied analytical chemistry. This paper describes the theory behind the chemometric classification techniques most frequently used in analytical chemistry together with some examples of their application to real-world problems.
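    As a minimal illustration of the supervised pattern-recognition idea, the sketch below implements a nearest-centroid classifier, one of the simplest chemometric classification methods; the two-feature "spectra" and class labels are hypothetical and not drawn from the paper.

```python
def fit_centroids(samples, labels):
    """Compute the mean feature vector (centroid) of each class."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def predict(centroids, x):
    """Assign x to the class whose centroid is nearest (Euclidean)."""
    return min(centroids,
               key=lambda y: sum((a - b) ** 2 for a, b in zip(centroids[y], x)))

# Hypothetical two-class training set of two-feature "spectra"
train = [[1.0, 0.1], [1.2, 0.0], [0.1, 1.0], [0.0, 1.3]]
labels = ["A", "A", "B", "B"]
model = fit_centroids(train, labels)
print(predict(model, [1.1, 0.2]))  # → A
```

    Practical chemometric workflows use richer methods (e.g., LDA, PLS-DA, SIMCA) over many spectral variables, but the train-then-assign structure is the same.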

  5. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 41 Public Contracts and Property Management 3 2012-01-01 2012-01-01 false What analytical and... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including...

  6. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 41 Public Contracts and Property Management 3 2013-07-01 2013-07-01 false What analytical and... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including...

  7. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 41 Public Contracts and Property Management 3 2014-01-01 2014-01-01 false What analytical and... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including...

  8. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false What analytical and... Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be used to support the life safety equivalency evaluation? Analytical and empirical tools, including...

  9. The use of meta-analytical tools in risk assessment for food safety.

    PubMed

    Gonzales-Barron, Ursula; Butler, Francis

    2011-06-01

    This communication deals with the use of meta-analysis as a valuable tool for the synthesis of food safety research and in quantitative risk assessment modelling. A common methodology for conducting a meta-analysis (i.e., systematic review and data extraction, parameterisation of effect size, estimation of overall effect size, assessment of heterogeneity, and presentation of results) is explained by reviewing two meta-analyses derived from separate sets of primary studies of Salmonella in pork. By integrating different primary studies, the first meta-analysis elucidated for the first time a relationship between the proportion of Salmonella-carrier slaughter pigs entering the slaughter lines and the resulting proportion of contaminated carcasses at the point of evisceration, a finding that the individual studies on their own could not reveal. On the other hand, the second application showed that meta-analysis can be used to estimate the overall effect of a critical process stage (chilling) on the incidence of the pathogen under study. The derived relationship between variables and the probabilistic distribution are illustrations of the valuable quantitative information synthesised by meta-analytical tools, which can be incorporated into risk assessment modelling. Strengths and weaknesses of meta-analysis within the context of food safety are also discussed.
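    The overall-effect-size and heterogeneity steps listed above can be sketched with a DerSimonian-Laird random-effects calculation; the study effect sizes and variances below are hypothetical and not taken from the Salmonella meta-analyses.

```python
import math

def dersimonian_laird(effects, variances):
    """Pooled random-effects estimate, its standard error,
    Cochran's Q (heterogeneity), and between-study variance tau^2."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    return pooled, math.sqrt(1.0 / sum(w_re)), q, tau2

# Hypothetical log-odds-ratio effect sizes from five primary studies
effects = [0.42, 0.31, 0.55, 0.12, 0.47]
variances = [0.04, 0.09, 0.02, 0.16, 0.05]
pooled, se, q, tau2 = dersimonian_laird(effects, variances)
print(round(pooled, 3), round(se, 3))
```

    The pooled estimate and its distribution are exactly the kind of synthesised quantity that can then feed a probabilistic risk assessment model.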

  10. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    USGS Publications Warehouse

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. To achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information needed for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad-scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons for using these tools with LCCs. Decision analytic tools tend to shift focus away from research-oriented discussions and toward discussions about how information is used to make better decisions. However, many technical experts do not have enough knowledge about decision-making contexts to fully inform the latter type of discussion. When applied in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable for improving decision making. Perhaps most importantly, our results suggest that decision analytic tools may be more useful to LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply as a way of ranking research priorities.
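    A core decision-analytic quantity behind such prioritisation is the expected value of perfect information (EVPI): how much the expected outcome would improve if an uncertainty were resolved before deciding. The sketch below uses a hypothetical two-action, two-state payoff table, not data from the Plains and Prairie Potholes case studies.

```python
# Hypothetical payoff table: utility of each conservation action under
# each uncertain future state, with prior probabilities over states.
payoff = {
    "restore_wetland": {"wet_future": 80, "dry_future": 20},
    "acquire_upland":  {"wet_future": 40, "dry_future": 60},
}
prior = {"wet_future": 0.5, "dry_future": 0.5}

def expected_value(action):
    """Expected utility of an action under the prior."""
    return sum(prior[s] * payoff[action][s] for s in prior)

# Best achievable expected utility without further information
ev_no_info = max(expected_value(a) for a in payoff)
# Expected utility if the true state were revealed before deciding
ev_perfect = sum(prior[s] * max(payoff[a][s] for a in payoff) for s in prior)
evpi = ev_perfect - ev_no_info  # value of resolving the uncertainty
print(ev_no_info, ev_perfect, evpi)  # 50.0 70.0 20.0
```

    Information needs with zero EVPI correspond to uncertainties that do not change the optimal decision, which is why not all research is equally valuable for decision making.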

  11. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    ERIC Educational Resources Information Center

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  12. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  13. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less well-known techniques that may also prove useful.

  14. Using Virtual Observatory Tools for Astronomical Research

    NASA Astrophysics Data System (ADS)

    Kim, Sang Chul; Taylor, John D.; Panter, Benjamin; Sohn, Sangmo Tony; Heavens, Alan F.; Mann, Robert G.

    2005-06-01

    Construction of the Virtual Observatory (VO) is a great concern to the astronomical community in the 21st century. We present an outline of the concept and necessity of the VO and the current status of various VO projects, including the 15 national ones and the International Virtual Observatory Alliance (IVOA). We summarize the possible science cases that could be solved by using VO data and tools, real science cases that have resulted from using current VO tools, and our own work using AstroGrid, the United Kingdom's national VO, for research on the star formation history of galaxies.

  15. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities for enhancing knowledge and facilitating science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Given the scarcity of published material on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available, and those still needed, to support ESDA.

  16. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    SciTech Connect

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  17. SEACOIN--an investigative tool for biomedical informatics researchers.

    PubMed

    Lee, Eva K; Lee, Hee-Rin; Quarshie, Alexander

    2011-01-01

    Peer-reviewed scientific literature is a prime source for accessing knowledge in the biomedical field. Its rapid growth and diverse domain coverage require systematic efforts in developing interactive tools for efficiently searching and summarizing current advances for acquiring knowledge and referencing, and for furthering scientific discovery. Although information retrieval systems exist, the conventional tools and systems remain difficult for biomedical investigators to use. There remain gaps even in the state-of-the-art systems as little attention has been devoted to understanding the needs of biomedical researchers. Our work attempts to bridge the gap between the needs of biomedical users and systems design efforts. We first study the needs of users and then design a simple visual analytic application tool, SEACOIN. A key motivation stems from biomedical researchers' request for a "simple interface" that is suitable for novice users in information technology. The system minimizes information overload, and allows users to search easily even in time-constrained situations. Users can manipulate the depth of information according to the purpose of usage. SEACOIN enables interactive exploration and filtering of search results via "metamorphose topological visualization" and "tag cloud," visualization tools that are commonly used in social network sites. We illustrate SEACOIN's usage through applications on PubMed publications on heart disease, cancer, Alzheimer's disease, diabetes, and asthma.

  18. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    NASA Astrophysics Data System (ADS)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the ongoing development of a computational tool to analyse architectural and interior space based on a multi-slice representation approach integrated with Building Information Modelling (BIM). Architectural and interior space is experienced as a dynamic entity whose spatial properties may vary from one part of the space to another; the representation of space through standard architectural drawings is therefore sometimes not sufficient. Representing space as a series of slices, each with its own properties, becomes important so that the differing characteristics of each part of the space can inform the design process. The analytical tool is being developed as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool should be useful for assisting design development processes that apply BIM, particularly for the design of architecture and interior spaces experienced as continuous spaces. It allows the identification of how spatial properties change dynamically throughout the space and the prediction of potential design problems. Integrating the multi-slice analytical tool into a BIM-based design process could thereby help architects generate better designs and avoid the unnecessary costs often caused by failure to identify problems during the design development stages.
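    A minimal sketch of the multi-slice idea, assuming the BIM export can be reduced to point samples carrying a property value: points are binned into slices along one axis and a per-slice summary is reported (the function name and data are hypothetical, not from the tool described).

```python
def slice_property(points, axis=0, slice_width=1.0):
    """Bin (x, y, value) point samples into slices along one axis and
    return the mean property value per slice index."""
    sums, counts = {}, {}
    for p in points:
        idx = int(p[axis] // slice_width)
        sums[idx] = sums.get(idx, 0.0) + p[2]
        counts[idx] = counts.get(idx, 0) + 1
    return {i: sums[i] / counts[i] for i in sums}

# Hypothetical samples: (x-position, y-position, ceiling height in m)
pts = [(0.2, 0.0, 2.4), (0.8, 0.0, 2.6), (1.5, 0.0, 3.0)]
print(slice_property(pts))  # {0: 2.5, 1: 3.0}
```

    Comparing such per-slice summaries along the length of a space is one simple way to surface properties that vary from one part of the space to another.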

  19. Contemporary outcomes research: tools of the trade.

    PubMed

    Calkins, Casey M

    2008-05-01

    Outcomes are, simply put, why a surgeon comes to work each day. For decades, surgeons have insisted on a regular self-examination of outcomes to ensure the optimal treatment of our patients. Clinical research in pediatric surgery has largely subsisted on outcome analysis as it relates to the rudimentary end-result of an operation, utilizing variables such as mortality, operative time, specific complication rates, and hospital length of stay to name a few. Recently, outcomes research has become a more complex endeavor. This issue of Seminars in Pediatric Surgery addresses a wide array of these newfound complexities in contemporary outcomes research. The purpose of this review is to assist the pediatric surgeon in understanding the tools that are used in contemporary outcomes research and to be able to use this information to ask new questions of our patients and ourselves as we continue to strive for excellence in caring for sick infants and children.

  20. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    SciTech Connect

    Brown, Forrest B.

    2016-06-17

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous-energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools, simple_ace.pl and simple_ace_mg.pl, for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems.
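    The first use case, verifying a Monte Carlo result against a known analytical solution, can be illustrated with a one-group infinite-medium multiplication factor, for which k_inf = nu·Σ_f/Σ_a has a closed form; the cross-section values below are hypothetical and unrelated to the ACE files the tools generate.

```python
import random

nu, sig_f, sig_c = 2.5, 0.05, 0.075   # hypothetical one-group data
sig_a = sig_f + sig_c                  # absorption = fission + capture
k_analytic = nu * sig_f / sig_a        # closed-form k_inf

random.seed(1)
n, produced = 200_000, 0.0
for _ in range(n):
    # each absorbed neutron causes fission with probability sig_f/sig_a
    if random.random() < sig_f / sig_a:
        produced += nu                 # nu new neutrons per fission
k_mc = produced / n                    # Monte Carlo estimate of k_inf

print(k_analytic)  # 1.0
```

    Agreement between k_mc and k_analytic, within statistical uncertainty, is the kind of check that simple one-group cross-section files make possible.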

  1. Metabolomics, a Powerful Tool for Agricultural Research

    PubMed Central

    Tian, He; Lam, Sin Man; Shui, Guanghou

    2016-01-01

    Metabolomics, which is based mainly on nuclear magnetic resonance (NMR), gas chromatography (GC), or liquid chromatography (LC) coupled to mass spectrometry (MS) analytical technologies to systematically acquire qualitative and quantitative information on low-molecular-mass endogenous metabolites, provides a direct snapshot of the physiological condition in biological samples. As a complement to transcriptomics and proteomics, it has played pivotal roles in agricultural and food science research. In this review, we discuss the capacities of NMR and GC/LC-MS in the acquisition of the plant metabolome, and address the potential promise and diverse applications of metabolomics, particularly lipidomics, to investigate the responses of Arabidopsis thaliana, a primary plant model for agricultural research, to environmental stressors including heat, freezing, drought, and salinity. PMID:27869667

  2. Operations other than war: Requirements for analysis tools research report

    SciTech Connect

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  3. Group analytic psychotherapy (im)possibilities to research

    PubMed Central

    Vlastelica, Mirela

    2011-01-01

    In the course of group analytic psychotherapy, where we discovered the power of its therapeutic effects, a need arose for research on group analytic psychotherapy. Psychotherapeutic work in general, and group psychotherapy in particular, are hard to measure and place within objective frames. Research, i.e., the measurement of change in psychotherapy, is a complex task, and there is considerable disagreement about it. For a long time, the empirical-descriptive method was the only way of conducting research in the field of group psychotherapy. Research problems in group psychotherapy in general, and in group analytic psychotherapy in particular, are first of all methodological problems, especially due to the unrepeatability of the therapeutic process. The basic debate about measuring change in psychotherapy is whether change should be measured through overt behaviour or evaluated more finely by monitoring inner psychological dimensions. Following up therapy results, besides providing additional information on the patient's improvement, strengthens the psychotherapist's self-respect, as well as his respectability and credibility as a scientist. PMID:25478094

  4. Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel

    DTIC Science & Technology

    2011-03-28

    Excerpts from the document: "…their support to MEF staffs with Psychological Operations planning teams in the near future." "What we perceive is often based on our needs, our expectations, our projections, our psychological defenses, and our cultural learning and past experiences." (The snippet also cites "Organizational Consulting: A Gestalt Approach," Cambridge: GIC Press, 1998.)

  5. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    PubMed

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting, and predictive modelling, and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements was identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and the management of radiology resources.
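    As a sketch of the forecasting side of such a tool, simple exponential smoothing is among the most basic techniques that could be applied to throughput data; the weekly scan counts below are hypothetical, and the paper does not state which forecasting models the prototype uses.

```python
def exponential_smoothing(series, alpha=0.5):
    """One-step-ahead forecasts: each forecast blends the latest
    observation with the previous forecast."""
    forecast = series[0]
    forecasts = [forecast]
    for value in series[1:]:
        forecast = alpha * value + (1 - alpha) * forecast
        forecasts.append(forecast)
    return forecasts

weekly_scans = [120, 132, 128, 140]   # hypothetical throughput counts
print(exponential_smoothing(weekly_scans))
```

    The smoothing constant alpha trades responsiveness to recent workload changes against stability of the forecast.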

  6. Analytical tools for the analysis of fire debris. A review: 2008-2015.

    PubMed

    Martín-Alberca, Carlos; Ortega-Ojeda, Fernando Ernesto; García-Ruiz, Carmen

    2016-07-20

    The analysis of fire debris evidence can offer crucial information to a forensic investigation when, for instance, the intentional use of ignitable liquids to initiate a fire is suspected. Although evidence analysis in the laboratory is mainly conducted with a handful of well-established methodologies, over the last eight years several authors have proposed noteworthy improvements to these methodologies and suggested interesting new approaches. This review critically outlines the most up-to-date and suitable tools for the analysis and interpretation of fire debris evidence. The survey of analytical tools covers work published in the 2008-2015 period. It includes sources of consensus-classified reference samples, current standard procedures, new proposals for sample extraction and analysis, and the most novel statistical tools. In addition, this review provides relevant knowledge on the distortion effects on the chemical fingerprints of ignitable liquids, which have to be considered during the interpretation of results.

  7. Tools and collaborative environments for bioinformatics research

    PubMed Central

    Giugno, Rosalba; Pulvirenti, Alfredo

    2011-01-01

    Advanced research requires intensive interaction among a multitude of actors, often possessing different expertise and usually working at a distance from each other. The field of collaborative research aims to establish suitable models and technologies to properly support these interactions. In this article, we first present the reasons for Bioinformatics' interest in this context, suggesting some research domains that could benefit from collaborative research. We then review the principles and some of the most relevant applications of social networking, with special attention to networks supporting scientific collaboration, highlighting critical issues such as the identification of users and the standardization of formats. We then introduce some systems for collaborative document creation, including wiki systems and tools for ontology development, and review some of the most interesting biological wikis. We also review the principles of Collaborative Development Environments for software and show some examples in Bioinformatics. Finally, we present the principles and some examples of Learning Management Systems. In conclusion, we try to identify some of the goals to be achieved in the short term for the exploitation of these technologies. PMID:21984743

  8. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  9. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains, and calibrates remote sensing instruments for the National Aeronautics and Space Administration (NASA). To perform system design trade-offs and analyses, and to establish system parameters, ASDL has developed a software package for the analytical simulation of sensor systems. This package, called Analytical Tools for Thermal InfraRed Engineering (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as the Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD), etc. This paper describes the uses of the package and the physics used to derive the performance parameters.
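    The subsystem-to-system integration can be sketched with the usual textbook definitions of two of the quoted figures of merit: SNR as signal over noise, and NETD as the scene temperature difference producing SNR = 1 (all numbers hypothetical, not from ATTIRE).

```python
def snr(signal, noise):
    """Signal-to-noise ratio of the detected radiance."""
    return signal / noise

def netd(noise, signal_change_per_kelvin):
    """Noise-equivalent temperature difference: the temperature
    change whose signal equals the noise floor (SNR = 1)."""
    return noise / signal_change_per_kelvin

print(snr(500.0, 5.0))   # 100.0
print(netd(5.0, 50.0))   # 0.1 K for the assumed responsivity
```

    In a full simulation, the signal and noise terms would themselves be computed from the optics, detector, and electronics subsystem models.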

  10. Interactive Poster: A Proposal for Sharing User Requirements for Visual Analytic Tools

    SciTech Connect

    Scholtz, Jean

    2009-10-11

    Although many in the community have advocated user-centered evaluations for visual analytic environments, a significant barrier exists. The users targeted by the visual analytics community (law enforcement personnel, professional information analysts, financial analysts, health care analysts, etc.) are often inaccessible to researchers. These analysts are extremely busy and their work environments and data are often classified or at least confidential. Furthermore, their tasks often last weeks or even months. It is simply not feasible to do such long-term observations to understand their jobs. How then can we hope to gather enough information about the diverse user populations to understand their needs? Some researchers have been successful in working with different end-users, including the author. A reasonable approach, therefore, would be to find a way to share user information. This paper outlines a proposal for developing a handbook of user profiles for use by researchers, developers, and evaluators.

  11. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.
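    One of the listed data products, a rank-based probability-of-detection curve, can be sketched as follows: sort detections by confidence score and report the cumulative fraction of true targets found at each rank (scores and ground-truth labels below are hypothetical).

```python
def pd_curve(scores, is_target):
    """Return (rank, probability-of-detection) pairs: detections are
    sorted by score, and P(d) is the cumulative fraction of true
    targets recovered at each rank."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    total = sum(is_target)
    hits, curve = 0, []
    for rank, i in enumerate(order, start=1):
        hits += is_target[i]
        curve.append((rank, hits / total))
    return curve

scores  = [0.9, 0.8, 0.4, 0.3]   # hypothetical detector confidences
targets = [1,   0,   1,   0]     # hypothetical ground truth
print(pd_curve(scores, targets))
```

    A curve that rises quickly indicates that true targets dominate the highest-confidence detections, which is exactly the behaviour a stakeholder wants to verify before investing in a VA product.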

  12. Contextual and Analytic Qualities of Research Methods Exemplified in Research on Teaching

    ERIC Educational Resources Information Center

    Svensson, Lennart; Doumas, Kyriaki

    2013-01-01

    The aim of the present article is to discuss contextual and analytic qualities of research methods. The arguments are specified in relation to research on teaching. A specific investigation is used as an example to illustrate the general methodological approach. It is argued that research methods should be carefully grounded in an understanding of…

  13. Analytical tools for the analysis of β-carotene and its degradation products

    PubMed Central

    Stutz, H.; Bresgen, N.; Eckl, P. M.

    2015-01-01

    Abstract β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has centered the attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Beside other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene that is beneficial for radical scavenging, but is also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies of standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography (U)HPLC, and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method validation.

  14. Analytical tools for the analysis of β-carotene and its degradation products.

    PubMed

    Stutz, H; Bresgen, N; Eckl, P M

    2015-05-01

    β-Carotene, the precursor of vitamin A, possesses pronounced radical scavenging properties. This has centered the attention on β-carotene dietary supplementation in healthcare as well as in the therapy of degenerative disorders and several cancer types. However, two intervention trials with β-carotene have revealed adverse effects on two proband groups, that is, cigarette smokers and asbestos-exposed workers. Beside other causative reasons, the detrimental effects observed have been related to the oxidation products of β-carotene. Their generation originates in the polyene structure of β-carotene that is beneficial for radical scavenging, but is also prone to oxidation. Depending on the dominant degradation mechanism, bond cleavage might occur either randomly or at defined positions of the conjugated electron system, resulting in a diversity of cleavage products (CPs). Due to their instability and hydrophobicity, the handling of standards and real samples containing β-carotene and related CPs requires preventive measures during specimen preparation, analyte extraction, and final analysis, to avoid artificial degradation and to preserve the initial analyte portfolio. This review critically discusses different preparation strategies of standards and treatment solutions, and also addresses their protection from oxidation. Additionally, in vitro oxidation strategies for the generation of oxidative model compounds are surveyed. Extraction methods are discussed for volatile and non-volatile CPs individually. Gas chromatography (GC), (ultra)high performance liquid chromatography (U)HPLC, and capillary electrochromatography (CEC) are reviewed as analytical tools for final analyte analysis. For identity confirmation of analytes, mass spectrometry (MS) is indispensable, and the appropriate ionization principles are comprehensively discussed. The final sections cover analysis of real samples and aspects of quality assurance, namely matrix effects and method validation.

  15. Capillary electrophoresis as an analytical tool for monitoring nicotine in ATF regulated tobacco products.

    PubMed

    Ralapati, S

    1997-07-18

    Tobacco products are classified at different excise tax rates according to the Code of Federal Regulations. These include cigars, cigarettes, pipe tobacco, roll-your-own tobacco, chewing tobacco and snuff. Nicotine is the primary determinant of what constitutes a tobacco product from a regulatory standpoint. Determination of nicotine, therefore, is of primary importance and interest to ATF. Since nicotine is also the most abundant alkaloid found in tobacco, comprising about 98% of the total alkaloid content, a rapid method for the determination of nicotine in ATF regulated products is desirable. Capillary electrophoresis (CE), as an analytical technique, is rapidly gaining importance, capturing the interest of analysts in several areas. The unique and powerful capabilities of CE, including high resolution and short analysis times, make it a valuable analytical tool in the regulatory area as well. Preliminary studies using a 25 mM sodium phosphate buffer, pH 2.5 at 260 nm have yielded promising results for the analysis of nicotine in tobacco products. Application of an analytical method for the determination of nicotine by CE to ATF regulated tobacco products will be presented.

  16. Some Tooling for Manufacturing Research Reactor Fuel Plates

    SciTech Connect

    Knight, R.W.

    1999-10-03

    This paper will discuss some of the tooling necessary to manufacture aluminum-based research reactor fuel plates. Most of this tooling is intended for use in a high-production facility. Some of the tools shown have manufactured more than 150,000 pieces. The only maintenance has been sharpening. With careful design, tools can be made to accommodate the manufacture of several different fuel elements, thus, reducing tooling costs and maintaining tools that the operators are trained to use. An important feature is to design the tools using materials with good lasting quality. Good tools can increase return on investment.

  17. VAO Tools Enhance CANDELS Research Productivity

    NASA Astrophysics Data System (ADS)

    Greene, Gretchen; Donley, J.; Rodney, S.; LAZIO, J.; Koekemoer, A. M.; Busko, I.; Hanisch, R. J.; VAO Team; CANDELS Team

    2013-01-01

    The formation of galaxies and their co-evolution with black holes through cosmic time are prominent areas in current extragalactic astronomy. New methods in science research are building upon collaborations between scientists and archive data centers which span large volumes of multi-wavelength and heterogeneous data. A successful example of this form of teamwork is demonstrated by the CANDELS (Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey) and the Virtual Astronomical Observatory (VAO) collaboration. The CANDELS project archive data provider services are registered and discoverable in the VAO through an innovative web based Data Discovery Tool, providing a drill down capability and cross-referencing with other co-spatially located astronomical catalogs, images and spectra. The CANDELS team is working together with the VAO to define new methods for analyzing Spectral Energy Distributions of galaxies containing active galactic nuclei, and helping to evolve advanced catalog matching methods for exploring images of variable depths, wavelengths and resolution. Through the publication of VOEvents, the CANDELS project is publishing data streams for newly discovered supernovae that are bright enough to be followed from the ground.

  18. New research tools for urogenital schistosomiasis.

    PubMed

    Rinaldi, Gabriel; Young, Neil D; Honeycutt, Jared D; Brindley, Paul J; Gasser, Robin B; Hsieh, Michael H

    2015-03-15

    Approximately 200,000,000 people have schistosomiasis (schistosome infection). Among the schistosomes, Schistosoma haematobium is responsible for the most infections, which are present in 110 million people globally, mostly in sub-Saharan Africa. This pathogen causes an astonishing breadth of sequelae: hematuria, anemia, dysuria, stunting, uremia, bladder cancer, urosepsis, and human immunodeficiency virus coinfection. Refined estimates of the impact of schistosomiasis on quality of life suggest that it rivals malaria. Despite S. haematobium's importance, relevant research has lagged. Here, we review advances that will deepen knowledge of S. haematobium. Three sets of breakthroughs will accelerate discoveries in the pathogenesis of urogenital schistosomiasis (UGS): (1) comparative genomics, (2) the development of functional genomic tools, and (3) the use of animal models to explore S. haematobium-host interactions. Comparative genomics for S. haematobium is feasible, given the sequencing of multiple schistosome genomes. Features of the S. haematobium genome that are conserved among platyhelminth species and others that are unique to S. haematobium may provide novel diagnostic and drug targets for UGS. Although there are technical hurdles, the integrated use of these approaches can elucidate host-pathogen interactions during this infection and can inform the development of techniques for investigating schistosomes in their human and snail hosts and the development of therapeutics and vaccines for the control of UGS.

  19. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Astrophysics Data System (ADS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-05-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution will outline initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, with detailed insights into a few of them.

  20. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Technical Reports Server (NTRS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution will outline initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, with detailed insights into a few of them.

  1. Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy

    PubMed Central

    Giraudeau, Patrick; Frydman, Lucio

    2016-01-01

    Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing an increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan manner. A decade ago a so-called “ultrafast” (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlations in a single scan. During the intervening years the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool with an expanding scope of applications. The present review summarizes the principles and the main developments that have contributed to the success of this approach, and focuses on applications recently demonstrated in various areas of analytical chemistry, from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and quantitative applications. PMID:25014342

  2. Evaluating the Development of Science Research Skills in Work-Integrated Learning through the Use of Workplace Science Tools

    ERIC Educational Resources Information Center

    McCurdy, Susan M.; Zegwaard, Karsten E.; Dalgety, Jacinta

    2013-01-01

    Concept understanding, the development of analytical skills and a research mind set are explored through the use of academic tools common in a tertiary science education and relevant work-integrated learning (WIL) experiences. The use and development of the tools; laboratory book, technical report, and literature review are examined by way of…

  3. A Multi-facetted Visual Analytics Tool for Exploratory Analysis of Human Brain and Function Datasets

    PubMed Central

    Angulo, Diego A.; Schneider, Cyril; Oliver, James H.; Charpak, Nathalie; Hernandez, Jose T.

    2016-01-01

    Brain research typically requires large amounts of data from different sources, and often of a different nature. The use of different software tools adapted to the nature of each data source can make research work cumbersome and time consuming. It follows that data are often not used to their fullest potential, thus limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-facetted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process for individuals or groups. BRAVIZ facilitates exploration of trends or relationships to give an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data. PMID:27601990

  4. Drugs on the internet, part IV: Google's Ngram viewer analytic tool applied to drug literature.

    PubMed

    Montagne, Michael; Morgan, Melissa

    2013-04-01

    Google Inc.'s digitized book library can be searched based on key words and phrases over a five-century time frame. Application of the Ngram Viewer to drug literature was assessed for its utility as a research tool. The results appear promising as a method for noting changes in the popularity of specific drugs over time, historical epidemiology of drug use and misuse, and adoption and regulation of drug technologies.

  5. Streamlining Research by Using Existing Tools

    PubMed Central

    Greene, Sarah M.; Baldwin, Laura-Mae; Dolor, Rowena J.; Thompson, Ella; Neale, Anne Victoria

    2011-01-01

    Over the past two decades, the health research enterprise has matured rapidly, and many recognize an urgent need to translate pertinent research results into practice, to help improve the quality, accessibility, and affordability of U.S. health care. Streamlining research operations would speed translation, particularly for multi-site collaborations. However, the culture of research discourages reusing or adapting existing resources or study materials. Too often, researchers start studies and multi-site collaborations from scratch—reinventing the wheel. Our team developed a compendium of resources to address inefficiencies and researchers’ unmet needs and compiled them in a research toolkit website (www.ResearchToolkit.org). Through our work, we identified philosophical and operational issues related to disseminating the toolkit to the research community. We explore these issues here, with implications for the nation’s investment in biomedical research. PMID:21884513

  6. Applying decision-making tools to national e-waste recycling policy: an example of Analytic Hierarchy Process.

    PubMed

    Lin, Chun-Hsu; Wen, Lihchyi; Tsai, Yue-Mi

    2010-05-01

    As policy making is in essence a process of discussion, decision-making tools have in many cases been proposed to resolve differences of opinion among the different parties. In our project that sought to promote a country's performance in recycling, we used the Analytic Hierarchy Process (AHP) to evaluate the possibilities and determine the priority of adding new mandatory recycled wastes, also referred to as Due Recycled Wastes, from candidate waste appliances. The evaluation process started with the collection of data based on telephone interviews and field investigations to understand the behavior of consumers as well as their overall opinions regarding the disposal of certain waste appliances. With the data serving as background information, the research team then implemented the Analytic Hierarchy Process using the information that formed an incomplete hierarchy structure in order to determine the priority for recycling. Since the number of objects to be evaluated exceeded the number that the AHP researchers had suggested, we reclassified the objects into four groups and added one more level of pair-wise comparisons, which substantially reduced the inconsistency in the judgment of the AHP participants. The project was found to serve as a flexible and achievable application of AHP to the environmental policy-making process. In addition, based on the project's outcomes as a whole, the research team drew conclusions regarding the government's need to take back 15 of the items evaluated, and suggested instruments that could be used or recycling regulations that could be changed in the future. Further analysis of the top three items recommended by the results of the evaluation for recycling, namely, Compact Disks, Cellular Phones and Computer Keyboards, was then conducted to clarify their concrete feasibility. After the trial period for recycling ordered by the Taiwan Environmental Protection Administration, only Computer
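
    The core AHP computation described above, deriving a priority vector from a pair-wise comparison matrix and checking its consistency, can be sketched as follows. This is a minimal illustration using the common geometric-mean approximation; the judgment matrix is hypothetical and not taken from the study.

```python
import math

def ahp_priorities(M):
    """Approximate AHP priority vector via the geometric-mean (row) method."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(M, w):
    """Consistency ratio CR = CI / RI; CR < 0.1 is conventionally acceptable."""
    n = len(M)
    # lambda_max estimated by averaging (M w)_i / w_i over the rows
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random consistency indices
    return ci / ri

# Hypothetical pair-wise judgments for three candidate waste items
M = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
w = ahp_priorities(M)  # largest weight = highest recycling priority
```

In a real AHP exercise the matrix entries come from expert judgments on Saaty's 1-9 scale, and an inconsistent matrix (CR ≥ 0.1) would be sent back for revised comparisons, which is what the added level of pair-wise comparisons in the study helped avoid.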

  7. Sociometry: Tools for Research and Practice.

    ERIC Educational Resources Information Center

    Treadwell, Thomas W.; Kumar, V. K.; Stein, Steven A.; Prosnick, Kevin

    1997-01-01

    Reviews basic sociometric tools and their analysis, provides information on computer programs to analyze sociometric data, and briefly examines considerations in conducting sociometric investigations. Looks at the social atom (significant others), constructing sociometry questions, and offers an analysis of individual status and interactional…

  8. Microfluidic tools for cell biological research

    PubMed Central

    Velve-Casquillas, Guilhem; Le Berre, Maël; Piel, Matthieu; Tran, Phong T.

    2010-01-01

    Summary Microfluidic technology is creating powerful tools for cell biologists to control the complete cellular microenvironment, leading to new questions and new discoveries. We review here the basic concepts and methodologies in designing microfluidic devices, and their diverse cell biological applications. PMID:21152269

  9. Tools for Ephemeral Gully Erosion Process Research

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Techniques to quantify ephemeral gully erosion have been identified by USDA Natural Resources Conservation Service (NRCS) as one of gaps in current erosion assessment tools. One reason that may have contributed to this technology gap is the difficulty to quantify changes in channel geometry to asses...

  10. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    SciTech Connect

    Bjoerklund, Anna

    2012-01-15

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. - Research highlights: • LCA was explored as an analytical tool in an SEA process of municipal energy planning. • The process also integrated LCA with scenario planning and public participation. • Benefits of using LCA were a systematic framework and a wider systems perspective. • Integration of tools required some methodological challenges to be solved. • This proved an innovative approach to defining alternatives and the scope of assessment.

  11. Research education: findings of a study of teaching-learning research using multiple analytical perspectives.

    PubMed

    Vandermause, Roxanne; Barbosa-Leiker, Celestina; Fritz, Roschelle

    2014-12-01

    This multimethod, qualitative study provides results for educators of nursing doctoral students to consider. Combining the expertise of an empirical analytical researcher (who uses statistical methods) and an interpretive phenomenological researcher (who uses hermeneutic methods), a course was designed that would place doctoral students in the midst of multiparadigmatic discussions while they learned fundamental research methods. Field notes and iterative analytical discussions led to patterns and themes that highlight the value of this innovative pedagogical application. Using content analysis and interpretive phenomenological approaches, the authors, together with one of the students, analyzed data from field notes recorded in real time over the period the course was offered. This article describes the course and the study analysis, and offers the pedagogical experience as transformative. A link to a sample syllabus is included in the article. The results encourage educators of doctoral nursing students to focus educational practice on multiple methodological perspectives.

  12. The Child Diary as a Research Tool

    ERIC Educational Resources Information Center

    Lamsa, Tiina; Ronka, Anna; Poikonen, Pirjo-Liisa; Malinen, Kaisa

    2012-01-01

    The aim of this article is to introduce the use of the child diary as a method in daily diary research. By describing the research process and detailing its structure, a child diary, a structured booklet in which children's parents and day-care personnel (N = 54 children) reported their observations, was evaluated. The participants reported the…

  13. Equity Audit: A Teacher Leadership Tool for Nurturing Teacher Research

    ERIC Educational Resources Information Center

    View, Jenice L.; DeMulder, Elizabeth; Stribling, Stacia; Dodman, Stephanie; Ra, Sophia; Hall, Beth; Swalwell, Katy

    2016-01-01

    This is a three-part essay featuring six teacher educators and one classroom teacher researcher. Part one describes faculty efforts to build curriculum for teacher research, scaffold the research process, and analyze outcomes. Part two shares one teacher researcher's experience using an equity audit tool in several contexts: her teaching practice,…

  14. Multiple Theoretical Lenses as an Analytical Strategy in Researching Group Discussions

    ERIC Educational Resources Information Center

    Berge, Maria; Ingerman, Åke

    2017-01-01

    Background: In science education today, there is an emerging focus on what is happening in situ, making use of an array of analytical traditions. Common practice is to use one specific analytical framing within a research project, but there are projects that make use of multiple analytical framings to further the understanding of the same data,…

  15. Database tools in genetic diseases research.

    PubMed

    Bianco, Anna Monica; Marcuzzi, Annalisa; Zanin, Valentina; Girardelli, Martina; Vuch, Josef; Crovella, Sergio

    2013-02-01

    The knowledge of the human genome is in continuous progression: a large number of databases have been developed to make meaningful connections among worldwide scientific discoveries. This paper reviews bioinformatics resources and database tools specialized in disseminating information regarding genetic disorders. The databases described are useful for managing sample sequences, gene expression and post-transcriptional regulation. In relation to data sets available from genome-wide association studies, we describe databases that could be the starting point for developing studies in the field of complex diseases, particularly those in which the causal genes are difficult to identify.

  16. Development of rocket electrophoresis technique as an analytical tool in preformulation study of tetanus vaccine formulation.

    PubMed

    Ahire, V J; Sawant, K K

    2006-08-01

    The Rocket Electrophoresis (RE) technique relies on the difference in charges of the antigen and antibodies at the selected pH. The present study involves optimization of RE run conditions for Tetanus Toxoid (TT). Agarose gel (1% w/v, 20 ml, pH 8.6), anti-TT IgG - 1 IU/ml, temperature 4-8 degrees C and a run duration of 18 h were found to be optimum. The height of the rocket-shaped precipitate was proportional to TT concentration. The RE method was found to be linear in the concentration range of 2.5 to 30 Lf/mL. The method was validated and found to be accurate, precise, and reproducible when analyzed statistically using Student's t-test. RE was used as an analytical method for analyzing TT content in plain and marketed formulations as well as in the preformulation study of vaccine formulation, where formulation additives were tested for compatibility with TT. The optimized RE method has several advantages: it uses safe materials, is inexpensive, and is easy to perform. RE results are less prone to operator bias as compared to the flocculation test and can be documented by taking photographs and scanned by densitometer; RE can be easily standardized for the required antigen concentration by changing the antitoxin concentration. It can be used as a very effective tool for qualitative and quantitative analysis and in preformulation studies of antigens.
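
    Because rocket height is linear in TT concentration over the validated 2.5-30 Lf/mL range, an unknown sample can be quantified by inverse prediction from a least-squares calibration line. The sketch below is a generic calibration workflow with hypothetical heights, not data from the study.

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    b = my - a * mx
    return a, b

# Hypothetical calibration standards over the validated 2.5-30 Lf/mL range
conc = [2.5, 5, 10, 20, 30]           # TT concentration, Lf/mL
height = [3.8, 7.6, 15.1, 29.8, 45.2]  # rocket height, mm (illustrative)

a, b = fit_line(conc, height)
# Inverse prediction: read an unknown's concentration off the line
unknown_height = 22.0
estimated_conc = (unknown_height - b) / a
```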

  17. Common plants as alternative analytical tools to monitor heavy metals in soil

    PubMed Central

    2012-01-01

    Background Herbaceous plants are common vegetal species generally exposed, for a limited period of time, to bioavailable environmental pollutants. Heavy metal contamination is the most common form of environmental pollution. Herbaceous plants have never been used as natural bioindicators of environmental pollution, in particular to monitor the amount of heavy metals in soil. In this study, we aimed to assess the usefulness of three herbaceous plants (Plantago major L., Taraxacum officinale L. and Urtica dioica L.) and one leguminous plant (Trifolium pratense L.) as alternative indicators to evaluate soil pollution by heavy metals. Results We employed Inductively Coupled Plasma Atomic Emission Spectroscopy (ICP-AES) to assess the concentration of selected heavy metals (Cu, Zn, Mn, Pb, Cr and Pd) in soil and plants, and we used statistical analyses to describe the linear correlation between the accumulation of some heavy metals and the selected vegetal species. We found that the leaves of Taraxacum officinale L. and Trifolium pratense L. can accumulate Cu in a linearly dependent manner, with Urtica dioica L. representing the vegetal species accumulating the highest fraction of Pb. Conclusions In this study we demonstrated that common plants can be used as an alternative analytical tool for monitoring selected heavy metals in soil. PMID:22594441
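
    The linear dependence reported between soil and leaf metal content is the kind of relationship a Pearson correlation coefficient quantifies. The following sketch uses invented Cu values purely to illustrate the statistic; it is not the study's data.

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical Cu concentrations (mg/kg): paired soil and leaf measurements
soil_cu = [10, 20, 30, 40, 50]
leaf_cu = [8, 15, 25, 33, 41]
r = pearson_r(soil_cu, leaf_cu)  # r near 1 indicates a strong linear relation
```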

  18. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    PubMed

    Offroy, Marc; Duponchel, Ludovic

    2016-03-03

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and we do not necessarily know which coordinates are the interesting ones. Big data in our labs of biology, analytical chemistry or physical chemistry is a future that might be closer than any of us supposes. It is in this sense that new tools have to be developed in order to explore and valorize such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (with high noise level, with/without spectral preprocessing, with wavelength shift, with different spectral resolutions, and with missing data).
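
    Full TDA pipelines rely on dedicated libraries, but the simplest topological summary, 0-dimensional persistent homology (how connected components of a point cloud merge as a distance threshold grows), can be computed with a union-find structure. The sketch below is a self-contained illustration with a hypothetical point cloud, not the paper's Raman data.

```python
import math
from itertools import combinations

def zero_dim_persistence(points):
    """0-dimensional persistent homology of a point cloud: every point is
    born at scale 0; a component dies at the distance threshold where it
    merges into another (single-linkage merges). Returns sorted death scales."""
    n = len(points)
    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in combinations(range(n), 2)
    )
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)  # one component dies at this scale
    return sorted(deaths)  # n-1 finite deaths; one component lives forever

# Two well-separated clusters: the largest death scale marks the cluster gap
pts = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1)]
deaths = zero_dim_persistence(pts)
```

A large gap between the small intra-cluster deaths and the single large inter-cluster death is exactly the kind of robust structural signal TDA extracts even under noise.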

  19. Introducing diffusing wave spectroscopy as a process analytical tool for pharmaceutical emulsion manufacturing.

    PubMed

    Reufer, Mathias; Machado, Alexandra H E; Niederquell, Andreas; Bohnenblust, Katharina; Müller, Beat; Völker, Andreas Charles; Kuentz, Martin

    2014-12-01

    Emulsions are widely used for pharmaceutical, food, and cosmetic applications. To guarantee that their critical quality attributes meet specifications, it is desirable to monitor the emulsion manufacturing process. However, finding a suitable process analyzer has so far remained challenging. This article introduces diffusing wave spectroscopy (DWS) as an at-line technique to follow the manufacturing process of a model oil-in-water pharmaceutical emulsion containing xanthan gum. The DWS results were complemented with mechanical rheology, microscopy analysis, and stability tests. DWS is an advanced light scattering technique that assesses microrheology and, more generally, provides information on the dynamics and statics of dispersions. The microrheology results showed good agreement with those obtained with bulk rheology. Although no notable changes in the rheological behavior of the model emulsions were observed during homogenization, the intensity correlation function provided qualitative information on the evolution of the emulsion dynamics. These data, together with static measurements of the transport mean free path (l*), correlated very well with the changes in droplet size distribution occurring during emulsion homogenization. This study shows that DWS is a promising process analytical technology tool for the development and manufacturing of pharmaceutical emulsions.

  20. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire Prevention Equivalent Level of Safety Analysis § 102-80.120 What analytical and empirical tools should be... used as part of an analysis, an assessment of the predictive capabilities of the fire models must...

  1. Visualization tools for comprehensive test ban treaty research

    SciTech Connect

    Edwards, T.L.; Harris, J.M.; Simons, R.W.

    1997-08-01

    This paper focuses on tools used in Data Visualization efforts at Sandia National Laboratories under the Department of Energy CTBT R&D program. These tools provide interactive techniques for the examination and interpretation of scientific data, and can be used for many types of CTBT research and development projects. We will discuss the benefits and drawbacks of using the tools to display and analyze CTBT scientific data. While the tools may be used for everyday applications, our discussion will focus on the use of these tools for visualization of data used in research and verification of new theories. Our examples focus on uses with seismic data, but the tools may also be used for other types of data sets. 5 refs., 6 figs., 1 tab.

  2. Nano-Scale Secondary Ion Mass Spectrometry - A new analytical tool in biogeochemistry and soil ecology

    SciTech Connect

    Herrmann, A M; Ritz, K; Nunan, N; Clode, P L; Pett-Ridge, J; Kilburn, M R; Murphy, D V; O'Donnell, A G; Stockdale, E A

    2006-10-18

    Soils are structurally heterogeneous across a wide range of spatio-temporal scales. Consequently, external environmental conditions do not have a uniform effect throughout the soil, resulting in a large diversity of micro-habitats. It has been suggested that soil function can be studied without explicit consideration of such fine detail, but recent research has indicated that the micro-scale distribution of organisms may be of importance for a mechanistic understanding of many soil functions. Due to a lack of techniques with adequate sensitivity for data collection at appropriate scales, the question 'How important are various soil processes acting at different scales for ecological function?' is challenging to answer. The nano-scale secondary ion mass spectrometer (NanoSIMS) represents the latest generation of ion microprobes, which link high-resolution microscopy with isotopic analysis. The main advantage of NanoSIMS over other secondary ion mass spectrometers is the ability to operate at high mass resolution whilst maintaining both excellent signal transmission and spatial resolution (~50 nm). NanoSIMS has been used previously in studies focusing on presolar materials from meteorites, in material science, biology, geology and mineralogy. Recently, the potential of NanoSIMS as a new tool in the study of biophysical interfaces in soils has been demonstrated. This paper describes the principles of NanoSIMS and discusses the potential of this tool to contribute to the field of biogeochemistry and soil ecology. Practical considerations (sample size and preparation, simultaneous collection of isotopes, mass resolution, isobaric interference and quantification of the isotopes of interest) are discussed. Adequate sample preparation to avoid biases in the interpretation of NanoSIMS data due to artifacts, and identification of regions of interest, are the main concerns in using NanoSIMS as a new tool in biogeochemistry and soil ecology. Finally, we review the areas of

  3. Research-Based Communication Tool Kit

    ERIC Educational Resources Information Center

    Brown, Sherry; Campbell-Zopf, Mary; Hooper, Jeffrey; Marshall, David; McLaughlin, Beck

    2007-01-01

    Significant research over the last decade has built a strong case for the value of arts learning. Major summaries, including "Schools, Communities, and the Arts" (1995); "Champions of Change" (2000); "The Arts in Education: Evaluating the Evidence for a Causal Link" (2000); "Critical Links" (2002); and now "Critical Evidence: How the Arts Benefit…

  4. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    PubMed

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects for health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most employed nonstarch carbohydrates added or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most employed analytical techniques for the characterization of those molecules. Mass spectrometry is also of great help, in particular matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and

  5. Sea otter research methods and tools

    USGS Publications Warehouse

    Bodkin, James L.; Maldini, Daniela; Calkins, Donald; Atkinson, Shannon; Meehan, Rosa

    2004-01-01

    Sea otters possess physical characteristics and life history attributes that provide both opportunity and constraint to their study. Because of their relatively limited diving ability they occur in nearshore marine habitats that are usually viewable from shore, allowing direct observation of most behaviors. Because sea otters live nearshore and forage on benthic invertebrates, foraging success and diet are easily measured. Because they rely almost exclusively on their pelage for insulation, which requires frequent grooming, successful application of external tags or instruments has been limited to attachments in the interdigital webbing of the hind flippers. Techniques to surgically implant instruments into the intraperitoneal cavity are well developed and routinely applied. Because they have relatively small home ranges and rest in predictable areas, they can be recaptured with some predictability using closed-circuit scuba diving technology. The purpose of this summary is to identify some of the approaches, methods, and tools that are currently engaged for the study of sea otters, and to suggest potential avenues for applying advancing technologies.

  6. Patenting genome research tools and the law.

    PubMed

    Eisenberg, Rebecca

    2003-01-01

    Patenting genes encoding therapeutic proteins was relatively uncontroversial in the early days of biotechnology. Controversy arose in the era of high-throughput DNA sequencing, when gene patents started to look less like patents on drugs and more like patents on scientific information. Evolving scientific and business strategies for exploiting genomic information raised concerns that patents might slow subsequent research. The trend towards stricter enforcement of the utility and disclosure requirements by the patent offices should help clarify the current confusion.

  7. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  8. Simulation tools for robotics research and assessment

    NASA Astrophysics Data System (ADS)

    Fields, MaryAnne; Brewer, Ralph; Edge, Harris L.; Pusey, Jason L.; Weller, Ed; Patel, Dilip G.; DiBerardino, Charles A.

    2016-05-01

    The Robotics Collaborative Technology Alliance (RCTA) program focuses on four overlapping technology areas: Perception, Intelligence, Human-Robot Interaction (HRI), and Dexterous Manipulation and Unique Mobility (DMUM). In addition, the RCTA program has a requirement to assess progress of this research in standalone as well as integrated form. Since the research is evolving and the robotic platforms with unique mobility and dexterous manipulation are in the early development stage and very expensive, an alternate approach is needed for efficient assessment. Simulation of robotic systems, platforms, sensors, and algorithms, is an attractive alternative to expensive field-based testing. Simulation can provide insight during development and debugging unavailable by many other means. This paper explores the maturity of robotic simulation systems for applications to real-world problems in robotic systems research. Open source (such as Gazebo and Moby), commercial (Simulink, Actin, LMS), government (ANVEL/VANE), and the RCTA-developed RIVET simulation environments are examined with respect to their application in the robotic research domains of Perception, Intelligence, HRI, and DMUM. Tradeoffs for applications to representative problems from each domain are presented, along with known deficiencies and disadvantages. In particular, no single robotic simulation environment adequately covers the needs of the robotic researcher in all of the domains. Simulation for DMUM poses unique constraints on the development of physics-based computational models of the robot, the environment and objects within the environment, and the interactions between them. Most current robot simulations focus on quasi-static systems, but dynamic robotic motion places an increased emphasis on the accuracy of the computational models. In order to understand the interaction of dynamic multi-body systems, such as limbed robots, with the environment, it may be necessary to build component

  9. The role of the automation development group in analytical research and development at Dupont Merck.

    PubMed

    Lynch, J C; Green, J S; Hovsepian, P K; Reilly, K L; Short, J A

    1994-01-01

    Laboratory robotics has been firmly established in many non-QC laboratories as a valuable tool for automating pharmaceutical dosage form analysis. Often a single project or product line is used to justify an initial robot purchase, thus introducing robotics to the laboratory for the first time. However, to gain widespread acceptance within the laboratory and to justify further investment in robotics, existing robots must be used to develop analyses for existing manual methods as well as new projects beyond the scope of the original purchase justification. The Automation Development Group in Analytical Research and Development is a team of analysts primarily devoted to developing new methods and adapting existing methods for the robot. This team approach developed the expertise and synergy necessary to significantly expand the contribution of robotics to automation in the authors' laboratory.

  10. Effect-Based Screening Methods for Water Quality Characterization Will Augment Conventional Analyte-by-Analyte Chemical Methods in Research As Well As Regulatory Monitoring

    EPA Science Inventory

    Conventional approaches to water quality characterization can provide data on individual chemical components of each water sample. This analyte-by-analyte approach currently serves many useful research and compliance monitoring needs. However these approaches, which require a ...

  11. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining a connection to, and communicating, the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated from sample density, sample variance, interpolation error, or multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap through the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region.
Finally, we will share our approach for implementation of data reduction and topology generation
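
The VGM-Hadoop implementation details are the team's own; purely to illustrate the core idea that grid cell size should track sample density (so that coarse cells directly flag higher interpolation uncertainty), here is a hypothetical quadtree-style sketch with arbitrary thresholds:

```python
def variable_grid(points, x0, y0, size, max_pts=4, max_depth=6, depth=0):
    """Recursively subdivide a square cell while it holds more than max_pts
    sample points. Returns (x, y, size, n_points) leaf cells; larger leaves
    mean sparser data, i.e. higher interpolation uncertainty."""
    inside = [(x, y) for x, y in points
              if x0 <= x < x0 + size and y0 <= y < y0 + size]
    if len(inside) <= max_pts or depth >= max_depth:
        return [(x0, y0, size, len(inside))]
    half = size / 2.0
    cells = []
    for dx in (0.0, half):
        for dy in (0.0, half):
            cells += variable_grid(inside, x0 + dx, y0 + dy, half,
                                   max_pts, max_depth, depth + 1)
    return cells

# Dense cluster of synthetic sample sites near the origin, sparse elsewhere:
# the grid refines over the cluster and stays coarse over the data gap.
samples = [(i * 0.1, j * 0.1) for i in range(8) for j in range(8)] + [(9.0, 9.0)]
cells = variable_grid(samples, 0.0, 0.0, 16.0)
```

Rendering each leaf shaded by its sample count gives the kind of simultaneous value-plus-uncertainty display the VGM abstract describes, though the real method also folds in variance, interpolation error, and simulation ensembles.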

  12. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    PubMed

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

    Lithic microwear is a research field of prehistoric stone tool (lithic) analysis developed with the aim of identifying how stone tools were used. Laser scanning confocal microscopy has been shown to have the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid, objective data analysis protocols. This study reports on the attempt to develop these areas further and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn-surface invasiveness, complicating the ability to investigate duration-related textural characterisation.

  13. Analytical Implications of Using Practice Theory in Workplace Information Literacy Research

    ERIC Educational Resources Information Center

    Moring, Camilla; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace…

  14. Aligning Web-Based Tools to the Research Process Cycle: A Resource for Collaborative Research Projects

    ERIC Educational Resources Information Center

    Price, Geoffrey P.; Wright, Vivian H.

    2012-01-01

    Using John Creswell's Research Process Cycle as a framework, this article describes various web-based collaborative technologies useful for enhancing the organization and efficiency of educational research. Visualization tools (Cacoo) assist researchers in identifying a research problem. Resource storage tools (Delicious, Mendeley, EasyBib)…

  15. Innovations in scholarly communication - global survey on research tool usage

    PubMed Central

    Kramer, Bianca; Bosman, Jeroen

    2016-01-01

    Many new websites and online tools have come into existence to support scholarly communication in all phases of the research workflow. To what extent researchers are using these and more traditional tools has been largely unknown. This 2015-2016 survey aimed to fill that gap. Its results may help decision making by stakeholders supporting researchers and may also help researchers wishing to reflect on their own online workflows. In addition, information on tools usage can inform studies of changing research workflows. The online survey employed an open, non-probability sample. A largely self-selected group of 20663 researchers, librarians, editors, publishers and other groups involved in research took the survey, which was available in seven languages. The survey was open from May 10, 2015 to February 10, 2016. It captured information on tool usage for 17 research activities, stance towards open access and open science, and expectations of the most important development in scholarly communication. Respondents’ demographics included research roles, country of affiliation, research discipline and year of first publication. PMID:27429740

  16. Innovations in scholarly communication - global survey on research tool usage.

    PubMed

    Kramer, Bianca; Bosman, Jeroen

    2016-01-01

    Many new websites and online tools have come into existence to support scholarly communication in all phases of the research workflow. To what extent researchers are using these and more traditional tools has been largely unknown. This 2015-2016 survey aimed to fill that gap. Its results may help decision making by stakeholders supporting researchers and may also help researchers wishing to reflect on their own online workflows. In addition, information on tools usage can inform studies of changing research workflows. The online survey employed an open, non-probability sample. A largely self-selected group of 20663 researchers, librarians, editors, publishers and other groups involved in research took the survey, which was available in seven languages. The survey was open from May 10, 2015 to February 10, 2016. It captured information on tool usage for 17 research activities, stance towards open access and open science, and expectations of the most important development in scholarly communication. Respondents' demographics included research roles, country of affiliation, research discipline and year of first publication.

  17. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    ERIC Educational Resources Information Center

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  18. The use of metacognitive tools in a multidimensional research program

    NASA Astrophysics Data System (ADS)

    Iuli, Richard John

    Metacognition may be thought of as "cognition about cognition", or "thinking about thinking." A number of strategies and tools have been developed to help individuals understand the nature of knowledge and to enhance their "thinking about thinking." Two metacognitive tools, concept maps and Gowin's Vee, were first developed for use in educational research. Subsequently, they were used successfully to help learners "learn how to learn." The success of metacognitive tools in educational settings suggests that they may help scientists understand the nature of knowledge production and organization, thereby facilitating their research activities and enhancing their understanding of the events and objects they study. In September 1993 I began an ethnographic, naturalistic study of the United States Department of Agriculture - Agricultural Research Service - Rhizobotany Project at Cornell University in Ithaca, NY. I spent the next two and one-half years as a participant observer with the Project. The focus of my research was to examine the application of metacognitive tools to an academic research setting. The knowledge claims that emerged from my research were: (1) Individual researchers tended to have narrow views of the Rhizobotany Project that centered on their individual areas of research; (2) The researchers worked in "conceptual isolation", failing to see the connections and interrelatedness of their own work with the work of the others; (3) For those researchers who constructed concept maps and Vee diagrams, these heuristics helped them build a deeper conceptual understanding of their own work; and (4) Half of the members of the research team did not find concept mapping and Vee diagramming useful. Their reluctance to use these tools was interpreted as an indication of epistemological confusion. The prevalence of conceptual isolation and epistemological confusion among members of the Rhizobotany Project parallels the results of previous studies that have

  19. Experimental and Analytical Research on Fracture Processes in Rock

    SciTech Connect

    Herbert H. Einstein; Jay Miller; Bruno Silva

    2009-02-27

    Experimental studies on fracture propagation and coalescence were conducted which, together with previous tests by this group on gypsum and marble, provide information on fracturing. Specifically, different fracture geometries were tested, which together with the different material properties will provide the basis for analytical/numerical modeling. Initial steps on the models were made, as were initial investigations on the effect of pressurized water on fracture coalescence.

  20. SAGE Research Methods Datasets: A Data Analysis Educational Tool.

    PubMed

    Vardell, Emily

    2016-01-01

    SAGE Research Methods Datasets (SRMD) is an educational tool designed to offer users the opportunity to obtain hands-on experience with data analysis. Users can search for and browse authentic datasets by method, discipline, and data type. Each of the datasets is supplemented with educational material on the research method and clear guidelines for how to approach data analysis.

  1. Practitioner-Oriented Research as a Tool for Professional Development

    ERIC Educational Resources Information Center

    Johansson, Inge; Sandberg, Anette; Vuorinen, Tuula

    2007-01-01

    The aim of this study was to analyse how a model for practitioner-oriented research can be used as a tool for professional development in the preschool. The focus of interest is the type of knowledge that is formed when researchers and preschool staff cooperate on local projects, and what this new knowledge means for the images of professional…

  2. The WWW Cabinet of Curiosities: A Serendipitous Research Tool

    ERIC Educational Resources Information Center

    Arnold, Josie

    2012-01-01

    This paper proposes that the WWW is able to be fruitfully understood as a research tool when we utilise the metaphor of the cabinet of curiosities, the wunderkammer. It unpeels some of the research attributes of the metaphor as it reveals the multiplicity of connectivity on the web that provides serendipitous interactions between unexpected…

  3. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    NASA Astrophysics Data System (ADS)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    Modelling the curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, has as its goal determining the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the generating geometrical error as a component of the total error. Modelling the generation process makes it possible to highlight potential errors of the generating tool, so that its profile can be corrected before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of “relative generating trajectories”. The analytical foundation is presented, as well as some applications for known models of rack-gear type tools used on Maag gear-cutting machines.

  4. Spec Tool; an online education and research resource

    NASA Astrophysics Data System (ADS)

    Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.

    2016-06-01

    Education and public outreach (EPO) activities related to remote sensing, space, planetary and geophysics sciences have been developed widely at the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geoscientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. Because suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better understanding of the theory of spectroscopy and imaging spectroscopy in a 'hands-on' activity. Available online, the tool provides spectra visualization and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. It enables visualization of spectral signatures from the USGS spectral library as well as additional spectra collected at the EPIF, such as those of dunes in southern Israel and from Turkmenistan. For researchers and educators, the tool also allows loading locally collected samples for further analysis.
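
Of the analysis algorithms mentioned, spectral angle mapping is the easiest to state precisely: two spectra are treated as vectors and compared by the arccosine of their normalized dot product. The sketch below is a generic illustration with invented reflectance values, not the Spec tool's code:

```python
import math

def spectral_angle(a, b):
    """Spectral angle (radians) between two reflectance spectra: arccos of
    their normalized dot product. 0 means identical spectral shape; the
    measure is insensitive to overall brightness (illumination) scaling."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    # Clamp to guard against floating-point drift just outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

# A uniformly brighter copy of the same spectrum has angle ~0, while a
# spectrum of a different material (hypothetical values) is far away.
ref    = [0.12, 0.25, 0.40, 0.38, 0.30]
scaled = [2.0 * v for v in ref]
other  = [0.40, 0.30, 0.20, 0.15, 0.10]
```

Because the angle ignores vector magnitude, a target spectrum and a brighter copy of it classify as the same material, which is exactly why SAM is popular for imaging spectroscopy under variable illumination.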

  5. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    PubMed Central

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from “data-to-knowledge-to-innovation,” a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including the GeneCards® human gene database, the MalaCards human diseases database, and the PathCards biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery® embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene–tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell “cards” in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics
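
The statistical core of gene set analysis tools of this kind is typically a hypergeometric (one-sided Fisher) overlap test; whether GeneAnalytics' proprietary scoring reduces to exactly this is not stated, so the following is a generic stdlib-only sketch with invented numbers:

```python
from math import comb

def enrichment_p(N, K, n, k):
    """Upper-tail hypergeometric probability: the chance of seeing at least
    k genes from a K-member gene set when sampling n genes out of N.
    N: genome size, K: gene-set size, n: query-list size, k: overlap."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Hypothetical query: 50 differentially expressed genes out of 20000,
# of which 12 fall inside a 300-gene pathway. Expected overlap by chance
# is 50 * 300 / 20000 = 0.75, so 12 is a strong enrichment signal.
p = enrichment_p(20000, 300, 50, 12)
```

A tiny p-value here says the overlap is far larger than chance, which is the evidence an enrichment tool converts into a pathway or tissue "match" score.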

  6. Finding Collaborators: Toward Interactive Discovery Tools for Research Network Systems

    PubMed Central

    Schleyer, Titus K; Becich, Michael J; Hochheiser, Harry

    2014-01-01

    Background Research networking systems hold great promise for helping biomedical scientists identify collaborators with the expertise needed to build interdisciplinary teams. Although efforts to date have focused primarily on collecting and aggregating information, less attention has been paid to the design of end-user tools for using these collections to identify collaborators. To be effective, collaborator search tools must provide researchers with easy access to information relevant to their collaboration needs. Objective The aim was to study user requirements and preferences for research networking system collaborator search tools and to design and evaluate a functional prototype. Methods Paper prototypes exploring possible interface designs were presented to 18 participants in semistructured interviews aimed at eliciting collaborator search needs. Interview data were coded and analyzed to identify recurrent themes and related software requirements. Analysis results and elements from paper prototypes were used to design a Web-based prototype using the D3 JavaScript library and VIVO data. Preliminary usability studies asked 20 participants to use the tool and to provide feedback through semistructured interviews and completion of the System Usability Scale (SUS). Results Initial interviews identified consensus regarding several novel requirements for collaborator search tools, including chronological display of publication and research funding information, the need for conjunctive keyword searches, and tools for tracking candidate collaborators. Participant responses were positive (SUS score: mean 76.4%, SD 13.9). Opportunities for improving the interface design were identified. Conclusions Interactive, timeline-based displays that support comparison of researcher productivity in funding and publication have the potential to effectively support searching for collaborators. Further refinement and longitudinal studies may be needed to better understand the
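One of the requirements identified above, conjunctive (AND-semantics) keyword search over researcher profiles, can be sketched in a few lines; the profile records and field names below are hypothetical, not drawn from VIVO or the study's prototype:

```python
# Conjunctive keyword search: a candidate collaborator matches only if
# ALL query keywords appear in their profile. Records are invented.
researchers = [
    {"name": "A. Rivera", "keywords": {"machine learning", "oncology", "imaging"}},
    {"name": "B. Chen", "keywords": {"oncology", "biostatistics"}},
    {"name": "C. Okafor", "keywords": {"machine learning", "genomics"}},
]

def conjunctive_search(profiles, query_terms):
    """Return profiles containing every query term (AND semantics)."""
    wanted = {t.lower() for t in query_terms}
    return [p for p in profiles
            if wanted <= {k.lower() for k in p["keywords"]}]

matches = conjunctive_search(researchers, ["Oncology", "machine learning"])
```

An OR-semantics variant would replace the subset test with a non-empty intersection check; the interviews summarized above specifically called out the conjunctive form.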

  7. Analytics for Cyber Network Defense

    SciTech Connect

    Plantenga, Todd; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  8. Identifying and Tracing Persistent Identifiers of Research Resources : Automation, Metrics and Analytics

    NASA Astrophysics Data System (ADS)

    Maull, K. E.; Hart, D.; Mayernik, M. S.

    2015-12-01

    Formal and informal citations and acknowledgements for research infrastructures, such as data collections, software packages, and facilities, are an increasingly important function of attribution in scholarly literature. While such citations link resources, even if informally, to their origins, they are often made inconsistently, which makes them hard to analyze. While significant progress has been made in the past few years in the development of recommendations, policies, and procedures for creating and promoting citable identifiers, progress has been mixed in tracking how data sets and other digital infrastructures have actually been identified and cited in the literature. Understanding the full extent and value of research infrastructures through the lens of scholarly literature requires significant resources, and thus, we argue, must rely on automated approaches that mine and track persistent identifiers for scientific resources. Such automated approaches, however, face a number of unique challenges, from the inconsistent and informal referencing practices of authors, to unavailable, embargoed or hard-to-obtain full-text resources for text analytics, to inconsistent and capricious impact metrics. This presentation will discuss work to develop and evaluate tools for automating the tracing of research resource identification and referencing in the research literature via persistent citable identifiers. Despite the impediments, automated processes are of considerable importance in enabling these traceability efforts to scale, as the number of identifiers being created for unique scientific resources continues to grow rapidly. Such efforts, if successful, should improve the ability to answer meaningful questions about research resources as they continue to grow as a target of advanced analyses in research metrics.
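A minimal sketch of the kind of automated identifier mining discussed above, here matching DOIs in free text with a regular expression. Real pipelines must also handle other identifier schemes and far messier referencing practices; the sample strings and the DOI values in them are invented for illustration:

```python
import re

# DOIs start with a "10." prefix followed by a registrant code and a
# suffix. The character class is permissive, so common trailing
# punctuation picked up from prose is stripped afterwards.
DOI_PATTERN = re.compile(r"\b10\.\d{4,9}/[-._;()/:A-Za-z0-9]+")

def extract_dois(text):
    """Return DOIs found in free text, minus trailing punctuation."""
    return [m.rstrip(".,;)") for m in DOI_PATTERN.findall(text)]

sample = ("Data are archived at https://doi.org/10.5065/D6WD3XH5; "
          "see also doi:10.1000/xyz123.")
found = extract_dois(sample)
```

Matched identifiers would then be resolved and aggregated to build the citation-tracing metrics the abstract describes.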

  9. The Global War on Terrorism: Analytical Support, Tools and Metrics of Assessment. MORS Workshop

    DTIC Science & Technology

    2005-08-11

    Metrics of Assessment (Working Group 3): The accompanying Excel workbook contains two worksheets. The first is a Tools versus Questions worksheet and the ... emphasis on transnational actors ... have similar missions with respect to cri- ... describing the success or failure to support ... Academics, US government ... sheet tools, GIS; Microsoft Project show great promise ... • Encourage MORS Sponsors to contact the various agencies to find out what tools and ...

  10. Dataset-Driven Research to Support Learning and Knowledge Analytics

    ERIC Educational Resources Information Center

    Verbert, Katrien; Manouselis, Nikos; Drachsler, Hendrik; Duval, Erik

    2012-01-01

    In various research areas, the availability of open datasets is considered as key for research and application purposes. These datasets are used as benchmarks to develop new algorithms and to compare them to other algorithms in given settings. Finding such available datasets for experimentation can be a challenging task in technology enhanced…

  11. The efficacy of violence prediction: a meta-analytic comparison of nine risk assessment tools.

    PubMed

    Yang, Min; Wong, Stephen C P; Coid, Jeremy

    2010-09-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their predictive efficacies for violence. The effect sizes were extracted from 28 original reports published between 1999 and 2008, which assessed the predictive accuracy of more than one tool. We used a within-subject design to improve statistical power and multilevel regression models to disentangle random effects of variation between studies and tools and to adjust for study features. All 9 tools and their subscales predicted violence at about the same moderate level of predictive efficacy with the exception of Psychopathy Checklist--Revised (PCL-R) Factor 1, which predicted violence only at chance level among men. Approximately 25% of the total variance was due to differences between tools, whereas approximately 85% of heterogeneity between studies was explained by methodological features (age, length of follow-up, different types of violent outcome, sex, and sex-related interactions). Sex-differentiated efficacy was found for a small number of the tools. If the intention is only to predict future violence, then the 9 tools are essentially interchangeable; the selection of which tool to use in practice should depend on what other functions the tool can perform rather than on its efficacy in predicting violence. The moderate level of predictive accuracy of these tools suggests that they should not be used solely for some criminal justice decision making that requires a very high level of accuracy such as preventive detention.
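The pooling step described above, combining per-study effect sizes under a random-effects model, can be sketched with the standard DerSimonian-Laird estimator. The effect sizes and variances below are invented for illustration, not data from this meta-analysis, and the multilevel adjustments the authors used are omitted:

```python
import math

# DerSimonian-Laird random-effects pooling of per-study effect sizes.
def dersimonian_laird(yi, vi):
    """yi: study effect sizes; vi: their sampling variances."""
    wi = [1.0 / v for v in vi]
    fixed = sum(w * y for w, y in zip(wi, yi)) / sum(wi)
    q = sum(w * (y - fixed) ** 2 for w, y in zip(wi, yi))  # Cochran's Q
    df = len(yi) - 1
    c = sum(wi) - sum(w * w for w in wi) / sum(wi)
    tau2 = max(0.0, (q - df) / c)          # between-study variance
    wr = [1.0 / (v + tau2) for v in vi]    # random-effects weights
    pooled = sum(w * y for w, y in zip(wr, yi)) / sum(wr)
    se = math.sqrt(1.0 / sum(wr))
    return pooled, se, tau2

effects = [0.55, 0.62, 0.48, 0.70]     # hypothetical standardized effects
variances = [0.01, 0.02, 0.015, 0.03]  # hypothetical sampling variances
pooled, se, tau2 = dersimonian_laird(effects, variances)
```

When Q falls below its degrees of freedom, as in this toy data, the between-study variance estimate is truncated to zero and the result coincides with the fixed-effect pool.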

  12. Complex source beam: A tool to describe highly focused vector beams analytically

    SciTech Connect

    Orlov, S.; Peschel, U.

    2010-12-15

    The scalar-complex-source model is used to develop an accurate description of highly focused radially, azimuthally, linearly, and circularly polarized monochromatic vector beams. We investigate the power and full beam widths at half maximum of vigorous Maxwell equation solutions. The analytical expressions are employed to compare the vector complex source beams with the real beams produced by various high-numerical-aperture (NA) focusing systems. We find a parameter set for which the spatial extents of the analytical beams are the same as those of experimentally realized ones. We ensure the same shape of the considered beams by investigating an overlap of the complex source beams with high-NA beams. We demonstrate that the analytical expressions are good approximations for realistic highly focused beams.
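The scalar complex-source construction that underlies these vector beams can be stated compactly (our notation, summarizing the standard scalar model rather than reproducing the authors' formulas): displacing a point source to an imaginary axial position turns a spherical wave into an exact Helmholtz solution that reduces to a Gaussian beam in the paraxial limit.

```latex
% Scalar complex source beam: the spherical wave
%   U(\mathbf{r}) = e^{ikR}/R
% with the source shifted to the imaginary location z \to z - i z_R,
U(\mathbf{r}) = \frac{e^{ikR}}{R},
\qquad
R = \sqrt{x^{2} + y^{2} + \left(z - i z_{R}\right)^{2}},
% is an exact solution of the Helmholtz equation. Expanding R near the
% axis, R \approx (z - i z_R) + \frac{x^{2}+y^{2}}{2(z - i z_R)},
% recovers the paraxial Gaussian beam with Rayleigh range z_R and
% waist w_0 = \sqrt{2 z_R / k}.
```

The parameter set mentioned in the abstract amounts to choosing $z_R$ (and the vector polarization structure) so that the widths of these analytical solutions match those of the measured high-NA beams.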

  13. The Blackbird Whistling or Just After? Vygotsky's Tool and Sign as an Analytic for Writing

    ERIC Educational Resources Information Center

    Imbrenda, Jon-Philip

    2016-01-01

    Based on Vygotsky's theory of the interplay of the tool and sign functions of language, this study presents a textual analysis of a corpus of student-authored texts to illuminate aspects of development evidenced through the dialectical tension of tool and sign. Data were drawn from a series of reflective memos I authored during a seminar for new…

  14. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    ERIC Educational Resources Information Center

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…

  15. A web service based tool to plan atmospheric research flights

    NASA Astrophysics Data System (ADS)

    Rautenhaus, M.; Bauer, G.; Dörnbrack, A.

    2011-09-01

    We present a web service based tool for the planning of atmospheric research flights. The tool provides online access to horizontal maps and vertical cross-sections of numerical weather prediction data and in particular allows the interactive design of a flight route in direct relation to the predictions. It thereby fills a crucial gap in the set of currently available tools for using data from numerical atmospheric models for research flight planning. A distinct feature of the tool is its lightweight, web service based architecture, requiring only commodity hardware and a basic Internet connection for deployment. Access to visualisations of prediction data is achieved by using an extended version of the Open Geospatial Consortium Web Map Service (WMS) standard, a technology that has gained increased attention in meteorology in recent years. With the WMS approach, we avoid the transfer of large forecast model output datasets while enabling on-demand generated visualisations of the predictions at campaign sites with limited Internet bandwidth. Usage of the Web Map Service standard also enables access to third-party sources of georeferenced data. We have implemented the software using the open-source programming language Python. In the present article, we describe the architecture of the tool. As an example application, we discuss a case study research flight planned for the scenario of the 2010 Eyjafjalla volcano eruption. Usage and implementation details are provided as Supplement.
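The WMS interaction the abstract describes can be sketched as follows: instead of downloading forecast model output, the client requests a rendered map image through a GetMap URL. The endpoint, layer name, and bounding box below are illustrative assumptions, not values from the paper:

```python
from urllib.parse import urlencode

# Build a WMS 1.3.0 GetMap request: the server renders the forecast
# layer to an image, so only a small PNG crosses the network.
def build_getmap_url(base, layer, bbox, width=800, height=600):
    """bbox: four numbers in the axis order of the layer's CRS."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = build_getmap_url("https://example.org/wms", "ecmwf_t2m",
                       (30.0, -20.0, 70.0, 40.0))
```

A flight-planning client would issue such requests on demand as the user pans, zooms, or steps through forecast lead times, which is what keeps the bandwidth requirement low at remote campaign sites.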

  16. A web service based tool to plan atmospheric research flights

    NASA Astrophysics Data System (ADS)

    Rautenhaus, M.; Bauer, G.; Dörnbrack, A.

    2012-01-01

    We present a web service based tool for the planning of atmospheric research flights. The tool provides online access to horizontal maps and vertical cross-sections of numerical weather prediction data and in particular allows the interactive design of a flight route in direct relation to the predictions. It thereby fills a crucial gap in the set of currently available tools for using data from numerical atmospheric models for research flight planning. A distinct feature of the tool is its lightweight, web service based architecture, requiring only commodity hardware and a basic Internet connection for deployment. Access to visualisations of prediction data is achieved by using an extended version of the Open Geospatial Consortium Web Map Service (WMS) standard, a technology that has gained increased attention in meteorology in recent years. With the WMS approach, we avoid the transfer of large forecast model output datasets while enabling on-demand generated visualisations of the predictions at campaign sites with limited Internet bandwidth. Usage of the Web Map Service standard also enables access to third-party sources of georeferenced data. We have implemented the software using the open-source programming language Python. In the present article, we describe the architecture of the tool. As an example application, we discuss a case study research flight planned for the scenario of the 2010 Eyjafjalla volcano eruption. Usage and implementation details are provided as Supplement.

  17. Managing complex research datasets using electronic tools: a meta-analysis exemplar.

    PubMed

    Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L

    2013-06-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.

  18. Tracking and Visualizing Student Effort: Evolution of a Practical Analytics Tool for Staff and Student Engagement

    ERIC Educational Resources Information Center

    Nagy, Robin

    2016-01-01

    There is an urgent need for our educational system to shift assessment regimes from a narrow, high-stakes focus on grades, to more holistic definitions that value the qualities that lifelong learners will need. The challenge for learning analytics in this context is to deliver actionable assessments of these hard-to-quantify qualities, valued by…

  19. Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools

    DTIC Science & Technology

    2014-01-14

    in database and data warehousing, data mining and machine learning, risk analysis and optimization, as well as applied analytics. Practitioners ... analyzing historical time series data to provide insights regarding future decisions. • Data mining - which involves mining transactional databases ...

  20. Evaluating the Performance of Calculus Classes Using Operational Research Tools.

    ERIC Educational Resources Information Center

    Soares de Mello, Joao Carlos C. B.; Lins, Marcos P. E.; Soares de Mello, Maria Helena C.; Gomes, Eliane G.

    2002-01-01

    Compares the efficiency of calculus classes and evaluates two kinds of classes: traditional and others that use computational methods in teaching. Applies quantitative evaluation methods using two operational research tools: multicriteria decision aid methods (mainly the MACBETH approach) and data envelopment analysis. (Author/YDS)
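For readers unfamiliar with data envelopment analysis (DEA): the general model solves a linear program per decision-making unit, but in the single-input, single-output special case, efficiency reduces to each unit's output/input ratio scaled by the best observed ratio. A minimal sketch of that special case with invented class data:

```python
# Single-input, single-output special case of DEA (CCR model):
# efficiency = (output/input ratio) / (best output/input ratio).
# The general multi-input/output model requires one LP per unit.
# Class names and numbers below are illustrative only.
classes = {
    "traditional_A": {"input": 60.0, "output": 42.0},  # e.g. contact hours, pass rate
    "traditional_B": {"input": 60.0, "output": 36.0},
    "computational": {"input": 50.0, "output": 40.0},
}

ratios = {k: v["output"] / v["input"] for k, v in classes.items()}
best = max(ratios.values())
efficiency = {k: r / best for k, r in ratios.items()}
```

Units with efficiency 1.0 lie on the efficient frontier; the others are measured against it, which is exactly the kind of comparison the study draws between traditional and computer-supported classes.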

  1. Measurement and Research Tools. Symposium 37. [AHRD Conference, 2001].

    ERIC Educational Resources Information Center

    2001

    This symposium on measurement and research tools consists of three presentations. "An Examination of the Multiple Intelligences Developmental Assessment Scales (MIDAS)" (Albert Wiswell et al.) explores MIDAS's psychometric saliency. Findings indicate this instrument represents an incomplete attempt to develop a valid assessment of…

  2. Analyzing Online Teacher Networks: Cyber Networks Require Cyber Research Tools

    ERIC Educational Resources Information Center

    Schlager, Mark S.; Farooq, Umer; Fusco, Judith; Schank, Patricia; Dwyer, Nathan

    2009-01-01

    The authors argue that conceptual and methodological limitations in existing research approaches severely hamper theory building and empirical exploration of teacher learning and collaboration through cyber-enabled networks. They conclude that new frameworks, tools, and techniques are needed to understand and maximize the benefits of teacher…

  3. Narrative Inquiry: Research Tool and Medium for Professional Development.

    ERIC Educational Resources Information Center

    Conle, Carola

    2000-01-01

    Describes the development of narrative inquiry, highlighting one institutional setting, and discussing how narrative inquiry moved from being a research tool to a vehicle for curriculum within both graduate and preservice teacher development. After discussing theoretical resources for narrative inquiry, the paper examines criteria and terms…

  4. A clinical research analytics toolkit for cohort study.

    PubMed

    Yu, Yiqin; Zhu, Yu; Sun, Xingzhi; Tao, Ying; Zhang, Shuo; Xu, Linhao; Pan, Yue

    2012-01-01

    This paper presents a clinical informatics toolkit that can assist physicians to conduct cohort studies effectively and efficiently. The toolkit has three key features: 1) support of procedures defined in epidemiology, 2) recommendation of statistical methods in data analysis, and 3) automatic generation of research reports. On one hand, our system can help physicians control research quality by leveraging the integrated knowledge of epidemiology and medical statistics; on the other hand, it can improve productivity by reducing the complexities for physicians during their cohort studies.

  5. Research for research: tools for knowledge discovery and visualization.

    PubMed Central

    Van Mulligen, Erik M.; Van Der Eijk, Christiaan; Kors, Jan A.; Schijvenaars, Bob J. A.; Mons, Barend

    2002-01-01

    This paper describes a method to construct from a set of documents a spatial representation that can be used for information retrieval and knowledge discovery. The proposed method has been implemented in a prototype system and allows the researcher to browse interactively and in real-time a network of relationships obtained from a set of full text articles. These relationships are combined with the potential relationships between concepts as defined in the UMLS semantic network. The browser allows the user to select a seed term and find all related concepts, to find a path between concepts (hypothesis testing), and to retrieve the references to documents or database entries that support the relationship between concepts. PMID:12463942
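The "find a path between concepts" (hypothesis-testing) operation described above reduces to shortest-path search over the concept-relationship network. A breadth-first sketch over an invented toy graph (the concepts and edges are illustrative, not UMLS content):

```python
from collections import deque

# Toy concept network: undirected relationships between concepts.
graph = {
    "aspirin": {"COX-1", "inflammation"},
    "COX-1": {"aspirin", "prostaglandins"},
    "prostaglandins": {"COX-1", "fever"},
    "inflammation": {"aspirin", "fever"},
    "fever": {"prostaglandins", "inflammation"},
}

def concept_path(start, goal):
    """Breadth-first search: a shortest chain of related concepts."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]] - seen:
            seen.add(nxt)
            queue.append(path + [nxt])
    return None

path = concept_path("aspirin", "fever")
```

In the system described above, each edge on the returned chain would additionally carry the document or database references supporting that relationship, so the path doubles as an evidence trail.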

  6. Adsorptive micro-extraction techniques--novel analytical tools for trace levels of polar solutes in aqueous media.

    PubMed

    Neng, N R; Silva, A R M; Nogueira, J M F

    2010-11-19

    A novel enrichment technique, adsorptive μ-extraction (AμE), is proposed for trace analysis of polar solutes in aqueous media. The preparation, stability testing and development of the analytical devices in two geometrical configurations, i.e. bar adsorptive μ-extraction (BAμE) and multi-sphere adsorptive μ-extraction (MSAμE), are fully discussed. Of the several sorbent materials tested, activated carbons and polystyrene-divinylbenzene phases showed the best stability and robustness and proved the most suitable for analytical purposes. Both BAμE and MSAμE devices performed remarkably well in the determination of trace levels of polar solutes and metabolites (e.g. pesticides, disinfection by-products, drugs of abuse and pharmaceuticals) in water matrices and biological fluids. Compared with stir bar sorptive extraction based on a polydimethylsiloxane phase, the AμE techniques are highly effective and overcome the limitations of that enrichment approach for the more polar solutes. Furthermore, AμE offers convenient sensitivity and selectivity, since the great advantage of this new analytical technology is the possibility of choosing the sorbent best suited to each particular application. The proposed enrichment techniques are cost-effective, easy to prepare and work up, robust, and a remarkable analytical tool for trace analysis of priority solutes in areas of recognized importance such as the environment, forensics and other related life sciences.

  7. Single Subject Research: A Synthesis of Analytic Methods

    ERIC Educational Resources Information Center

    Alresheed, Fahad; Hott, Brittany L.; Bano, Carmen

    2013-01-01

    Historically, the synthesis of single subject design has employed visual inspection to yield significance of results. However, current research is supporting different techniques that will facilitate the interpretation of these intervention outcomes. These methods can provide more reliable data than employing visual inspection in isolation. This…

  8. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: an anticipated analytical tool for food safety.

    PubMed

    Hervás, Miriam; López, Miguel Angel; Escarpa, Alberto

    2009-10-27

    In this work, electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme has been based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. Analytical performance of the electrochemical immunoassay has been evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 microg L(-1) and EC(50) 0.079 microg L(-1) were obtained allowing the assessment of the detection of zearalenone mycotoxin. In addition, an excellent accuracy with a high recovery yield ranging between 95 and 108% has been obtained. The analytical features have shown the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.
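In a direct competitive immunoassay of this kind, the signal falls as analyte concentration rises, and calibration is commonly described by a four-parameter logistic (4PL) curve. A sketch of reading a concentration back off such a curve; the parameter values are hypothetical, with only the EC50 echoing the order of magnitude reported above:

```python
# Four-parameter logistic (4PL) calibration for a competitive assay.
def four_pl(conc, top, bottom, ec50, slope):
    """Signal predicted by the 4PL model at a given concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** slope)

def invert_4pl(signal, top, bottom, ec50, slope):
    """Concentration that would produce the given signal."""
    return ec50 * ((top - signal) / (signal - bottom)) ** (1.0 / slope)

# Hypothetical fitted parameters (normalized signal, conc in ug/L).
top, bottom, ec50, slope = 1.00, 0.05, 0.079, 1.3

signal = four_pl(0.079, top, bottom, ec50, slope)   # response at EC50
conc = invert_4pl(signal, top, bottom, ec50, slope) # round-trip check
```

By construction the signal at the EC50 sits halfway between the asymptotes, and inverting the fitted curve is how unknown sample concentrations are reported.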

  9. Redundancy in neutron activation analysis: A valuable tool in assuring analytical quality

    SciTech Connect

    Greenberg, R.R.

    1996-12-31

    Neutron activation analysis (NAA) has become widely used and is extremely valuable for the certification of standard reference materials (SRMs) at the National Institute of Standards and Technology (NIST). This is due to a number of reasons. First, NAA has essentially no significant sources of error in common with the other analytical techniques used at NIST to measure inorganic concentrations. This is important because most certified elemental concentrations are derived from the data determined by two (and occasionally more) independent analytical techniques. Two or more techniques are used for SRM certification because, although each technique has previously been evaluated and shown to be accurate, unexpected problems can arise, especially when analyzing new matrices. Another reason for the use of NAA for SRM certification is the potential of this technique for accuracy. The SRM measurements with estimated accuracies of 1 to 2% (at essentially 95% confidence intervals) are routinely made at NIST using NAA.

  10. Echocardiography as a Research and Clinical Tool in Veterinary Medicine

    PubMed Central

    Allen, D. G.

    1982-01-01

    Echocardiography is the accepted term for the study of cardiac ultrasound. Although a relatively new tool for the study of the heart in man it has already found wide acceptance in the area of cardiac research and in the study of clinical cardiac disease. Animals had often been used in the early experiments with cardiac ultrasound, but only recently has echocardiography been used as a research and clinical tool in veterinary medicine. In this report echocardiography is used in the research of anesthetic effects on ventricular function and clinically in the diagnosis of congestive cardiomyopathy in a cat, ventricular septal defect in a calf, and pericardial effusion in a dog. Echocardiography is now an important adjunct to the field of veterinary cardiology. PMID:17422196

  11. Analytical and scale model research aimed at improved hangglider design

    NASA Technical Reports Server (NTRS)

    Kroo, I.; Chang, L. S.

    1979-01-01

    The research consisted of a theoretical analysis that attempted to predict aerodynamic characteristics using lifting-surface theory and finite-element structural analysis, as well as an experimental investigation using 1/5-scale elastically similar models in the NASA Ames 2m x 3m (7' x 10') wind tunnel. Experimental data were compared with theoretical results in the development of a computer program that may be used in the design and evaluation of ultralight gliders.

  12. DNA-only cascade: a universal tool for signal amplification, enhancing the detection of target analytes.

    PubMed

    Bone, Simon M; Hasick, Nicole J; Lima, Nicole E; Erskine, Simon M; Mokany, Elisa; Todd, Alison V

    2014-09-16

    Diagnostic tests performed in the field or at the site of patient care would benefit from using a combination of inexpensive, stable chemical reagents and simple instrumentation. Here, we have developed a universal "DNA-only Cascade" (DoC) to quantitatively detect target analytes with increased speed. The DoC utilizes quasi-circular structures consisting of temporarily inactivated deoxyribozymes (DNAzymes). The catalytic activity of the DNAzymes is restored in a universal manner in response to a broad range of environmental and biological targets. The present study demonstrates DNAzyme activation in the presence of metal ions (Pb(2+)), small molecules (deoxyadenosine triphosphate) and nucleic acids homologous to genes from Meningitis-causing bacteria. Furthermore, DoC efficiently discriminates nucleic acid targets differing by a single nucleotide. When detection of analytes is orchestrated by functional nucleic acids, the inclusion of DoC reagents substantially decreases time for detection and allows analyte quantification. The detection of nucleic acids using DoC was further characterized for its capability to be multiplexed and retain its functionality following long-term exposure to ambient temperatures and in a background of complex medium (human serum).

  13. Technical phosphoproteomic and bioinformatic tools useful in cancer research

    PubMed Central

    2011-01-01

    Reversible protein phosphorylation is one of the most important forms of cellular regulation. Thus, phosphoproteomic analysis of protein phosphorylation in cells is a powerful tool to evaluate cell functional status. The importance of protein kinase-regulated signal transduction pathways in human cancer has led to the development of drugs that inhibit protein kinases at the apex or intermediary levels of these pathways. Phosphoproteomic analysis of these signalling pathways will provide important insights for operation and connectivity of these pathways to facilitate identification of the best targets for cancer therapies. Enrichment of phosphorylated proteins or peptides from tissue or bodily fluid samples is required. The application of technologies such as phosphoenrichments, mass spectrometry (MS) coupled to bioinformatics tools is crucial for the identification and quantification of protein phosphorylation sites for advancing in such relevant clinical research. A combination of different phosphopeptide enrichments, quantitative techniques and bioinformatic tools is necessary to achieve good phospho-regulation data and good structural analysis of protein studies. The current and most useful proteomics and bioinformatics techniques will be explained with research examples. Our aim in this article is to be helpful for cancer research via detailing proteomics and bioinformatic tools. PMID:21967744

  14. Infrared Spectroscopy as a Versatile Analytical Tool for the Quantitative Determination of Antioxidants in Agricultural Products, Foods and Plants

    PubMed Central

    Cozzolino, Daniel

    2015-01-01

    Spectroscopic methods provide very useful qualitative and quantitative information about the biochemistry and chemistry of antioxidants. Near infrared (NIR) and mid infrared (MIR) spectroscopy are powerful, fast, accurate and non-destructive analytical tools that can replace traditional chemical analysis. In recent years, several reports can be found in the literature demonstrating the usefulness of these methods in the analysis of antioxidants in different organic matrices. This article reviews recent applications of infrared (NIR and MIR) spectroscopy in the analysis of antioxidant compounds in a wide range of samples such as agricultural products, foods and plants. PMID:26783838
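As a toy illustration of quantitative determination from spectra: in the simplest univariate (Beer-Lambert) setting, absorbance at a single band is linear in concentration, so a least-squares line serves as the calibration. Practical NIR/MIR work uses multivariate models such as PLS, and all numbers below are synthetic:

```python
# Univariate least-squares calibration: absorbance -> concentration.
def fit_line(x, y):
    """Ordinary least squares; returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

absorbance = [0.10, 0.21, 0.39, 0.62, 0.80]   # synthetic peak heights
concentration = [1.0, 2.0, 4.0, 6.0, 8.0]     # reference values (mg/L)

slope, intercept = fit_line(absorbance, concentration)
predicted = slope * 0.50 + intercept          # unknown sample at A = 0.50
```

Replacing the single absorbance value with a full spectrum and the line with a latent-variable regression is what turns this sketch into the NIR/MIR chemometric workflows the review surveys.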

  15. The use of Permeation Liquid Membrane (PLM) as an analytical tool for trace metal speciation studies in natural waters

    NASA Astrophysics Data System (ADS)

    Parthasarathy, N.; Pelletier, M.; Buffle, J.

    2003-05-01

    Permeation liquid membrane (PLM), based on liquid-liquid extraction principles, is an emerging analytical tool for making in situ trace metal speciation measurements. A PLM comprising didecyl-1,10-diaza crown ether/lauric acid in phenylhexane/toluene has been developed for measuring free metal ion (e.g. Cu, Pb, Cd and Zn) concentrations under natural water conditions. The capability of PLM for speciation studies has been demonstrated using synthetic and natural ligands. Applications of in situ preconcentration of trace metals in diverse waters using specially designed hollow-fibre PLMs are reported.

  16. The use of analytical surface tools in the fundamental study of wear. [atomic nature of wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1977-01-01

    Various techniques and surface tools available for the study of the atomic nature of the wear of materials are reviewed. These include chemical etching, x-ray diffraction, electron diffraction, scanning electron microscopy, low-energy electron diffraction, Auger emission spectroscopy analysis, electron spectroscopy for chemical analysis, field ion microscopy, and the atom probe. Properties of the surface and wear surface regions which affect wear, such as surface energy, crystal structure, crystallographic orientation, mode of dislocation behavior, and cohesive binding, are discussed. A number of mechanisms involved in the generation of wear particles are identified with the aid of the aforementioned tools.

  17. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    PubMed

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time involved have caused some resistance to QbD implementation. To demonstrate a possible solution, this work proposed a rapid process development method that used direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic process of Ginkgo biloba L. as an example. The breakthrough curves were rapidly determined at-line by DART-MS. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and by HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly: the adsorption capacity decreased as the flow rate increased. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development.
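
    The agreement between at-line DART-MS readings and HPLC reference values is quantified by a correlation coefficient. As a minimal sketch (using hypothetical paired concentration data, not the study's measurements), the Pearson coefficient can be computed as:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired measurement series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired ginkgolide A concentrations (arbitrary units)
dart_ms = [0.12, 0.25, 0.41, 0.58, 0.70]
hplc    = [0.10, 0.27, 0.43, 0.55, 0.73]
r = pearson_r(dart_ms, hplc)
```

    A coefficient close to 1, as reported in the abstract, indicates that the fast at-line method tracks the reference method well.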

  18. INTRODUCTION TO ATTILA (ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS): AN ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, wat...

  19. Microfluidics as a tool for C. elegans research.

    PubMed

    San-Miguel, Adriana; Lu, Hang

    2013-09-24

    Microfluidics has emerged as a set of powerful tools that have greatly advanced some areas of biological research, including research using C. elegans. The use of microfluidics has enabled many experiments that are otherwise impossible with conventional methods. Today there are many examples that demonstrate the main advantages of using microfluidics for C. elegans research: achieving precise environmental conditions and facilitating worm handling. Examples range from behavioral analysis under precise chemical or odor stimulation and locomotion studies in well-defined structural surroundings to long-term culture on chip. Moreover, microfluidics has enabled coupling worm handling and imaging, thus facilitating genetic screens, optogenetic studies, and laser ablation experiments. In this article, we review some of the applications of microfluidics for C. elegans research and provide guides for the design, fabrication, and use of microfluidic devices for C. elegans research studies.

  20. Petasites hybridus: a tool for interdisciplinary research in phytotherapy.

    PubMed

    Debrunner, B; Meier, B

    1998-02-01

    The 3rd Petasites gathering took place in Romanshorn, Switzerland on March 29, 1996 and gave 16 European scientists the opportunity to present their latest discoveries to interested researchers working in different scientific disciplines such as pharmacognosy, botany, chemistry, pharmacology, medicine and clinical pharmacy. The newest findings on Petasites hybridus as a significant plant drug showed very promising aspects of therapeutic utility. Great progress has been made in chemical analytical methods and the determination of pharmacological activities. Substantial advances have also occurred in the production of bioassay procedures and plant materials, particularly utilizing cell- and tissue-culture techniques.

  1. Analytic model for academic research productivity having factors, interactions and implications

    PubMed Central

    2011-01-01

    Financial support is dear in academia and will tighten further. How can the research mission be accomplished within new restraints? A model is presented for evaluating source components of academic research productivity. It comprises six factors: funding; investigator quality; efficiency of the research institution; the research mix of novelty, incremental advancement, and confirmatory studies; analytic accuracy; and passion. Their interactions produce output and patterned influences between factors. Strategies for optimizing output are enabled. PMID:22130145

  2. ICL-Based OF-CEAS: A Sensitive Tool for Analytical Chemistry.

    PubMed

    Manfred, Katherine M; Hunter, Katharine M; Ciaffoni, Luca; Ritchie, Grant A D

    2017-01-03

    Optical-feedback cavity-enhanced absorption spectroscopy (OF-CEAS) using mid-infrared interband cascade lasers (ICLs) is a sensitive technique for trace gas sensing. The setup of a V-shaped optical cavity operating with a 3.29 μm cw ICL is detailed, and a quantitative characterization of the injection efficiency, locking stability, mode matching, and detection sensitivity is presented. The experimental data are supported by a model to show how optical feedback affects the laser frequency as it is scanned across several longitudinal modes of the optical cavity. The model predicts that feedback enhancement effects under strongly absorbing conditions can cause underestimations in the measured absorption, and these predictions are verified experimentally. The technique is then applied to the detection of nitrous oxide as an exemplar of its utility for analytical gas-phase spectroscopy. The analytical performance of the spectrometer, expressed as a noise-equivalent absorption coefficient, was estimated as 4.9 × 10^-9 cm^-1 Hz^-1/2, which compares well with recently reported values.
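
    A noise-equivalent absorption figure quoted in cm^-1 Hz^-1/2 converts to a minimum detectable absorption coefficient by multiplying by the square root of the detection bandwidth. A small sketch of that standard conversion, using the reported figure (the bandwidth values are illustrative):

```python
def min_detectable_absorption(nea_cm_hz, bandwidth_hz):
    """Minimum detectable absorption coefficient (cm^-1) for a given
    detection bandwidth, from a noise-equivalent absorption figure
    quoted in cm^-1 Hz^-1/2."""
    return nea_cm_hz * bandwidth_hz ** 0.5

# Reported figure for the ICL-based OF-CEAS spectrometer
nea = 4.9e-9  # cm^-1 Hz^-1/2

alpha_min_1s = min_detectable_absorption(nea, 1.0)      # 1 Hz bandwidth (~1 s averaging)
alpha_min_fast = min_detectable_absorption(nea, 100.0)  # 100 Hz bandwidth
```

    The trade-off is the usual one: faster measurements (wider bandwidth) raise the detection floor by the square root of the bandwidth ratio.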

  3. Electrochemical treatment of olive mill wastewater: treatment extent and effluent phenolic compounds monitoring using some uncommon analytical tools.

    PubMed

    Belaid, Chokri; Khadraoui, Moncef; Mseddii, Salma; Kallel, Monem; Elleuch, Boubaker; Fauvarque, Jean-François

    2013-01-01

    Problems related to industrial effluents can be divided into two parts: (1) their toxicity, associated with their chemical content, which should be removed before discharging the wastewater into the receiving media; and (2) the difficulty of characterising and monitoring the pollution, caused by the complexity of these matrices. This investigation deals with both aspects: an electrochemical treatment method for olive mill wastewater (OMW) using platinized expanded titanium electrodes in a modified Grignard reactor for toxicity removal, and the exploration of some specific analytical tools to monitor the elimination of effluent phenolic compounds. The results showed that electrochemical oxidation is able to remove/mitigate the OMW pollution. Indeed, 87% of OMW color was removed and all aromatic compounds disappeared from the solution by anodic oxidation. Moreover, the chemical oxygen demand (COD) and the total organic carbon (TOC) were both reduced by 55%. On the other hand, UV-Visible spectrophotometry, gas chromatography/mass spectrometry, cyclic voltammetry and 13C nuclear magnetic resonance (NMR) showed that the treatment efficiently eliminates phenolic compounds from OMW. It was concluded that electrochemical oxidation in a modified Grignard reactor is a promising process for the destruction of all phenolic compounds present in OMW. Among the monitoring analytical tools applied, cyclic voltammetry and 13C NMR are introduced for the first time to follow the progress of OMW treatment, and they gave close insight into the disappearance of polyphenols.

  4. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based Graduate-Level Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Toh, Chee-Seng

    2007-01-01

    A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.

  5. [Study monitoring: a useful tool for quality health research].

    PubMed

    Arias Valencia, Samuel Andrés; Hernández Pinzón, Giovanna

    2009-05-01

    As well as protecting the rights of participants, a study's ethics must encompass the quality of its execution. As such, international standards have been established for studies involving human subjects. The objective of this review is to evaluate the usefulness of the Guide to Good Clinical Practice and of "study monitoring" as tools for producing quality research. The Guide provides scientific ethics and quality standards for designing, conducting, registering, and notifying studies involving human subjects. By implementing specific processes and procedures, study monitoring seeks to ensure that research is followed and evaluated from inception, through execution and closure, thus producing studies with high quality standards.

  6. ELISA and GC-MS as Teaching Tools in the Undergraduate Environmental Analytical Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Wilson, Ruth I.; Mathers, Dan T.; Mabury, Scott A.; Jorgensen, Greg M.

    2000-12-01

    An undergraduate experiment for the analysis of potential water pollutants is described. Students are exposed to two complementary techniques, ELISA and GC-MS, for the analysis of a water sample containing atrazine, desethylatrazine, and simazine. Atrazine was chosen as the target analyte because of its wide usage in North America and its utility for students to predict environmental degradation products. The water sample is concentrated using solid-phase extraction for GC-MS, or diluted and analyzed using a competitive ELISA test kit for atrazine. The nature of the water sample is such that students generally find that ELISA gives an artificially high value for the concentration of atrazine. Students gain an appreciation for problems associated with measuring pollutants in the aqueous environment: sensitivity, accuracy, precision, and ease of analysis. This undergraduate laboratory provides an opportunity for students to learn several new analysis and sample preparation techniques and to critically evaluate these methods in terms of when they are most useful.

  7. Can the analyte-triggered asymmetric autocatalytic Soai reaction serve as a universal analytical tool for measuring enantiopurity and assigning absolute configuration?

    PubMed

    Welch, Christopher J; Zawatzky, Kerstin; Makarov, Alexey A; Fujiwara, Satoshi; Matsumoto, Arimasa; Soai, Kenso

    2016-12-20

    An investigation is reported on the use of the autocatalytic enantioselective Soai reaction, known to be influenced by the presence of a wide variety of chiral materials, as a generic tool for measuring the enantiopurity and absolute configuration of any substance. Good generality for the reaction across a small group of test analytes was observed, consistent with literature reports suggesting a diversity of compound types that can influence the stereochemical outcome of this reaction. Some trends in the absolute sense of stereochemical enrichment were noted, suggesting the possible utility of the approach for assigning absolute configuration to unknown compounds, by analogy to closely related species with known outcomes. Considerable variation was observed in the triggering strength of different enantiopure materials, an undesirable characteristic when dealing with mixtures containing minor impurities with strong triggering strength in the presence of major components with weak triggering strength. A strong tendency of the reaction toward an 'all or none' type of behavior makes the reaction most sensitive for detecting enantioenrichment close to zero. Consequently, the ability to discern modest from excellent enantioselectivity was relatively poor. While these properties limit the ability to obtain precise enantiopurity measurements in a simple single addition experiment, prospects may exist for more complex experimental setups that may potentially offer improved performance.

  8. Tools for Large-Scale Data Analytic Examination of Relational and Epistemic Networks in Engineering Education

    ERIC Educational Resources Information Center

    Madhavan, Krishna; Johri, Aditya; Xian, Hanjun; Wang, G. Alan; Liu, Xiaomo

    2014-01-01

    The proliferation of digital information technologies and related infrastructure has given rise to novel ways of capturing, storing and analyzing data. In this paper, we describe the research and development of an information system called Interactive Knowledge Networks for Engineering Education Research (iKNEER). This system utilizes a framework…

  9. FOSS Tools for Research Infrastructures - A Success Story?

    NASA Astrophysics Data System (ADS)

    Stender, V.; Schroeder, M.; Wächter, J.

    2015-12-01

    Established initiatives and mandated organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. The basic idea behind these infrastructures is the provision of services supporting scientists to search, visualize and access data, to collaborate and exchange information, as well as to publish data and other results. Especially the management of research data is gaining more and more importance. In geosciences these developments have to be merged with the enhanced data management approaches of Spatial Data Infrastructures (SDI). The Centre for GeoInformationTechnology (CeGIT) at the GFZ German Research Centre for Geosciences has the objective to establish concepts and standards of SDIs as an integral part of research infrastructure architectures. In different projects, solutions to manage research data for land- and water management or environmental monitoring have been developed based on a framework consisting of Free and Open Source Software (FOSS) components. The framework provides basic components supporting the import and storage of data, discovery and visualization as well as data documentation (metadata). In our contribution, we present our data management solutions developed in three projects, Central Asian Water (CAWa), Sustainable Management of River Oases (SuMaRiO) and Terrestrial Environmental Observatories (TERENO), where FOSS components build the backbone of the data management platform. The multiple use and validation of tools helped to establish a standardized architectural blueprint serving as a contribution to Research Infrastructures. We examine the question of whether FOSS tools are really a sustainable choice and whether the increased efforts of maintenance are justified. Finally, it should help answer the question of whether the use of FOSS for Research Infrastructures is a

  10. A web service based tool to plan atmospheric research flights

    NASA Astrophysics Data System (ADS)

    Rautenhaus, M.; Dörnbrack, A.

    2012-04-01

    We present a web service based tool for the planning of atmospheric research flights. The tool, which we call the "Mission Support System" (MSS), provides online access to horizontal maps and vertical cross-sections of numerical weather prediction data and in particular allows the interactive design of a flight route in direct relation to the predictions. It thereby fills a crucial gap in the set of currently available tools for using data from numerical atmospheric models for research flight planning. A distinct feature of the tool is its lightweight, web service based architecture, requiring only commodity hardware and a basic Internet connection for deployment. Access to visualisations of prediction data is achieved by using an extended version of the Open Geospatial Consortium Web Map Service (WMS) standard. With the WMS approach, we avoid the transfer of large forecast model output datasets while enabling on-demand generated visualisations of the predictions at campaign sites with limited Internet bandwidth. Usage of the Web Map Service standard also enables access to third-party sources of georeferenced data. The MSS is focused on the primary needs of mission scientists responsible for planning a research flight, addressing in particular the following requirements: (1) interactive exploration of available atmospheric forecasts, (2) interactive flight planning in relation to these forecasts, (3) computation of expected flight performance to assess the technical feasibility (in terms of total distance and vertical profile) of a flight, (4) no transfer of large forecast data files to the campaign site to allow deployment at remote locations and (5) low demand on hardware resources. We have implemented the software using the open-source programming language Python.
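
    A WMS-based client like the one described retrieves map images through GetMap requests. A minimal sketch of how such a request URL could be assembled (the endpoint and layer name are hypothetical, not the MSS's actual ones):

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height,
                   crs="EPSG:4326", fmt="image/png", version="1.3.0"):
    """Build an OGC WMS 1.3.0 GetMap request URL.

    bbox is (min_lat, min_lon, max_lat, max_lon) for EPSG:4326 in WMS 1.3.0.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical server and forecast layer
url = wms_getmap_url("http://example.org/mss/wms", "temperature_850hPa",
                     (40.0, -10.0, 60.0, 20.0), 800, 600)
```

    Because only the rendered image is transferred, the client never needs the full forecast dataset, which is the bandwidth advantage the abstract describes.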

  11. Web-based analytical tools for the exploration of spatial data

    NASA Astrophysics Data System (ADS)

    Anselin, Luc; Kim, Yong Wook; Syabri, Ibnu

    This paper deals with the extension of internet-based geographic information systems with functionality for exploratory spatial data analysis (esda). The specific focus is on methods to identify and visualize outliers in maps for rates or proportions. Three sets of methods are included: extreme value maps, smoothed rate maps and the Moran scatterplot. The implementation is carried out by means of a collection of Java classes to extend the Geotools open source mapping software toolkit. The web based spatial analysis tools are illustrated with applications to the study of homicide rates and cancer rates in U.S. counties.
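
    The Moran scatterplot mentioned above plots each standardized observation against the average of its neighbours (its spatial lag); points in the off-diagonal quadrants are candidate spatial outliers. A toy sketch with a hypothetical contiguity structure:

```python
def moran_scatterplot(values, weights):
    """Compute Moran scatterplot coordinates: standardized values z and
    their row-standardized spatial lags. weights[i] lists the neighbour
    indices of observation i (binary contiguity, hypothetical)."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    z = [(v - mean) / sd for v in values]
    lag = [sum(z[j] for j in nbrs) / len(nbrs) for nbrs in weights]
    return z, lag

# Toy 4-observation example: rates on a ring, one high-rate outlier
rates = [10.0, 12.0, 30.0, 8.0]
neighbours = [[1, 3], [0, 2], [1, 3], [0, 2]]
z, lag = moran_scatterplot(rates, neighbours)
# For row-standardized weights, the slope of lag on z is Moran's I
```

    Observation 2 here sits in the high-low quadrant (high value, low-valued neighbours), the kind of spatial outlier the paper's tools are designed to flag on a map.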

  12. Simulated Interactive Research Experiments as Educational Tools for Advanced Science.

    PubMed

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M; Hopf, Martin; Arndt, Markus

    2015-09-15

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  13. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    PubMed Central

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-01-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields. PMID:26370627

  14. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    PubMed

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for development and production of pharmaceutical suspensions in the future.
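
    The abstract's characteristic homogenization times come from fitting the authors' own heuristic model, which is not given here. As an illustrative stand-in only, a characteristic time can be extracted from sound-velocity readings by assuming a simple first-order approach to a plateau (all data below are synthetic):

```python
import math

def characteristic_time(times, velocities):
    """Estimate a characteristic homogenization time tau by fitting
    v(t) = v_inf + (v0 - v_inf) * exp(-t / tau) via log-linearisation.
    Illustrative first-order kinetics, not the paper's heuristic model."""
    v0, v_inf = velocities[0], velocities[-1]
    # log of the normalised distance from the plateau, skipping endpoints
    pts = [(t, math.log((v - v_inf) / (v0 - v_inf)))
           for t, v in zip(times[1:-1], velocities[1:-1])]
    # least-squares slope through the origin: log-decay = -t / tau
    slope = sum(t * y for t, y in pts) / sum(t * t for t, _ in pts)
    return -1.0 / slope

# Synthetic sound-velocity readings decaying with tau = 5 min to 1500 m/s
tau_true = 5.0
times = [0, 2, 4, 6, 8, 40]
velocities = [1500 + 20 * math.exp(-t / tau_true) for t in times]
tau_est = characteristic_time(times, velocities)
```

    Once the velocity trace flattens to within the noise of the fitted plateau, the batch can be considered homogenized, which is the decision the in-line analyzer supports.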

  15. [EpiInfo as a research and teaching tool in epidemiology and statistics: strengths and weaknesses].

    PubMed

    Mannocci, Alice; Bontempi, Claudio; Giraldi, Guglielmo; Chiaradia, Giacomina; de Waure, Chiara; Sferrazza, Antonella; Ricciardi, Walter; Boccia, Antonio; La Torre, Giuseppe

    2012-01-01

    EpiInfo is free software developed in 1988 by the Centers for Disease Control and Prevention (CDC) in Atlanta to facilitate field epidemiological investigations and statistical analysis. The aim of this study was to assess whether the software represents, in the Italian biomedical field, an effective analytical research tool and a practical, simple teaching tool for epidemiology and biostatistics. A questionnaire consisting of 20 multiple-choice and open questions was administered to 300 healthcare workers, including doctors, biologists, nurses, medical students and interns, at the end of a CME course in epidemiology and biostatistics. Sixty-four percent of participants were aged between 26 and 45 years, 52% were women and 73% were unmarried. Results show that women are more likely to utilize EpiInfo in their research activities with respect to men (p = 0.023), as are individuals aged 26-45 years with respect to the older and younger age groups (p = 0.023) and unmarried participants with respect to those married (p = 0.010). Thirty-one percent of respondents consider EpiInfo to be more than adequate for analysis of their research data and 52% consider it to be sufficiently so. The inclusion of an EpiInfo course in statistics and epidemiology modules facilitates the understanding of theoretical concepts and allows researchers to more easily perform some of the clinical/epidemiological research activities.

  16. Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning

    ERIC Educational Resources Information Center

    Kelly, Nick; Thompson, Kate; Yeoman, Pippa

    2015-01-01

    This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…

  17. Recognizing ancient papyri by a combination of spectroscopic, diffractional and chromatographic analytical tools.

    PubMed

    Łojewska, J; Rabin, I; Pawcenis, D; Bagniuk, J; Aksamit-Koperska, M A; Sitarz, M; Missori, M; Krutzsch, M

    2017-04-06

    Ancient papyri are a written heritage of culture that flourished more than 3000 years ago in Egypt. One of the most significant collections in the world is housed in the Egyptian Museum and Papyrus Collection in Berlin, from where the samples for our investigation come. The papyrologists, curators and conservators of such collections search intensely for the analytical detail that would allow ancient papyri to be distinguished from modern fabrications, in order to detect possible forgeries, assess papyrus deterioration state, and improve the design of storage conditions and conservation methods. This has become the aim of our investigation. The samples were studied by a number of methods, including spectroscopic (FTIR, fluorescent-FS, Raman) diffractional (XRD) and chromatographic (size exclusion chromatography-SEC), selected in order to determine degradation parameters: overall oxidation of lignocellulosic material, degree of polymerization and crystallinity of cellulose. The results were correlated with those obtained from carefully selected model samples including modern papyri and paper of different composition aged at elevated temperature in humid air. The methods were classified in the order SEC > FS > FTIR > XRD, based on their effectiveness in discriminating the state of papyri degradation. However, the most trustworthy evaluation of the age of papyri samples should rely on several methods.

  18. Magnetic optical sensor particles: a flexible analytical tool for microfluidic devices.

    PubMed

    Ungerböck, Birgit; Fellinger, Siegfried; Sulzer, Philipp; Abel, Tobias; Mayr, Torsten

    2014-05-21

    In this study we evaluate magnetic optical sensor particles (MOSePs) with incorporated sensing functionalities regarding their applicability in microfluidic devices. MOSePs can be separated from the surrounding solution to form in situ sensor spots within microfluidic channels, while read-out is accomplished outside the chip. These magnetic sensor spots exhibit benefits of sensor layers (high brightness and convenient usage) combined with the advantages of dispersed sensor particles (ease of integration). The accumulation characteristics of MOSePs with different diameters were investigated as well as the in situ sensor spot stability at varying flow rates. Magnetic sensor spots were stable at flow rates specific to microfluidic applications. Furthermore, MOSePs were optimized regarding fiber optic and imaging read-out systems, and different referencing schemes were critically discussed on the example of oxygen sensors. While the fiber optic sensing system delivered precise and accurate results for measurement in microfluidic channels, limitations due to analyte consumption were found for microscopic oxygen imaging. A compensation strategy is provided, which utilizes simple pre-conditioning by exposure to light. Finally, new application possibilities were addressed, being enabled by the use of MOSePs. They can be used for microscopic oxygen imaging in any chip with optically transparent covers, can serve as flexible sensor spots to monitor enzymatic activity or can be applied to form fixed sensor spots inside microfluidic structures, which would be inaccessible to integration of sensor layers.

  19. Molecularly imprinted polymers: an analytical tool for the determination of benzimidazole compounds in water samples.

    PubMed

    Cacho, Carmen; Turiel, Esther; Pérez-Conde, Concepción

    2009-05-15

    Molecularly imprinted polymers (MIPs) for benzimidazole compounds have been synthesized by precipitation polymerization using thiabendazole (TBZ) as template, methacrylic acid as functional monomer, ethyleneglycol dimethacrylate (EDMA) and divinylbenzene (DVB) as cross-linkers and a mixture of acetonitrile and toluene as porogen. The experiments carried out by molecularly imprinted solid phase extraction (MISPE) in cartridges demonstrated the imprint effect in both imprinted polymers. MIP-DVB enabled a much higher breakthrough volume than MIP-EDMA, and thus was selected for further experiments. The ability of this MIP for the selective recognition of other benzimidazole compounds (albendazole, benomyl, carbendazim, fenbendazole, flubendazole and fuberidazole) was evaluated. The obtained results revealed the high selectivity of the imprinted polymer towards all the selected benzimidazole compounds. An off-line analytical methodology based on a MISPE procedure has been developed for the determination of benzimidazole compounds in tap, river and well water samples at concentration levels below the legislated maximum concentration levels (MCLs) with quantitative recoveries. Additionally, an on-line preconcentration procedure based on the use of a molecularly imprinted polymer as selective stationary phase in HPLC is proposed as a fast screening method for the evaluation of the presence of benzimidazole compounds in water samples.

  20. Analytical tools employed to determine pharmaceutical compounds in wastewaters after application of advanced oxidation processes.

    PubMed

    Afonso-Olivares, Cristina; Montesdeoca-Esponda, Sarah; Sosa-Ferrera, Zoraida; Santana-Rodríguez, José Juan

    2016-12-01

    Today, the presence of contaminants in the environment is a topic of interest for society in general and for the scientific community in particular. A very large amount of different chemical substances reaches the environment after passing through wastewater treatment plants without being eliminated. This is due to the inefficiency of conventional removal processes and the lack of government regulations. The list of compounds entering treatment plants is gradually becoming longer and more varied because most of these compounds come from pharmaceuticals, hormones or personal care products, which are increasingly used by modern society. As a result of this increase in compound variety, to address these emerging pollutants, the development of new and more efficient removal technologies is needed. Different advanced oxidation processes (AOPs), especially photochemical AOPs, have been proposed as supplements to traditional treatments for the elimination of pollutants, showing significant advantages over the use of conventional methods alone. This work aims to review the analytical methodologies employed for the analysis of pharmaceutical compounds from wastewater in studies in which advanced oxidation processes are applied. Due to the low concentrations of these substances in wastewater, mass spectrometry detectors are usually chosen to meet the low detection limits and identification power required. Specifically, time-of-flight detectors are required to analyse the by-products.

  1. Recognizing ancient papyri by a combination of spectroscopic, diffractional and chromatographic analytical tools

    PubMed Central

    Łojewska, J.; Rabin, I.; Pawcenis, D.; Bagniuk, J.; Aksamit-Koperska, M. A.; Sitarz, M.; Missori, M.; Krutzsch, M.

    2017-01-01

    Ancient papyri are a written heritage of culture that flourished more than 3000 years ago in Egypt. One of the most significant collections in the world is housed in the Egyptian Museum and Papyrus Collection in Berlin, from where the samples for our investigation come. The papyrologists, curators and conservators of such collections search intensely for the analytical detail that would allow ancient papyri to be distinguished from modern fabrications, in order to detect possible forgeries, assess papyrus deterioration state, and improve the design of storage conditions and conservation methods. This has become the aim of our investigation. The samples were studied by a number of methods, including spectroscopic (FTIR, fluorescent-FS, Raman) diffractional (XRD) and chromatographic (size exclusion chromatography-SEC), selected in order to determine degradation parameters: overall oxidation of lignocellulosic material, degree of polymerization and crystallinity of cellulose. The results were correlated with those obtained from carefully selected model samples including modern papyri and paper of different composition aged at elevated temperature in humid air. The methods were classified in the order SEC > FS > FTIR > XRD, based on their effectiveness in discriminating the state of papyri degradation. However, the most trustworthy evaluation of the age of papyri samples should rely on several methods. PMID:28382971

  2. Ethics: the risk management tool in clinical research.

    PubMed

    Wadlund, Jill; Platt, Leslie A

    2002-01-01

    Scientific discovery and knowledge expansion in the post-genome era hold great promise for new medical technologies and cellular-based therapies with multiple applications that will save and enhance lives. While human beings have long hoped to unlock the mysteries of the molecular basis of life, our society is now on the verge of doing so. But new scientific and technological breakthroughs often come with some risks attached. Research--especially clinical trials and research involving human participants--must be conducted in accordance with the highest ethical and scientific principles. Yet, as the number and complexity of clinical trials increase, so do pressures for new revenue sources and shorter product development cycles, which could have an adverse impact on patient safety. This article explores the use of risk management tools in clinical research.

  3. Puzzle test: A tool for non-analytical clinical reasoning assessment

    PubMed Central

    Monajemi, Alireza; Yaghmaei, Minoo

    2016-01-01

    Most contemporary clinical reasoning tests typically assess non-automatic thinking. Therefore, a test is needed to measure automatic reasoning or pattern recognition, which has been largely neglected in clinical reasoning tests. The Puzzle Test (PT) is dedicated to assessing automatic clinical reasoning in routine situations. The test was first introduced in 2009 by Monajemi et al. in the Olympiad for Medical Sciences Students. PT is an item format that has gained acceptance in medical education, but no detailed guidelines exist for this test’s format, construction and scoring. In this article, a format is described and the steps to prepare and administer valid and reliable PTs are presented. PT examines a specific clinical reasoning task: pattern recognition. PT does not replace other clinical reasoning assessment tools. However, it complements them in strategies for assessing comprehensive clinical reasoning. PMID:28210603

  4. Puzzle test: A tool for non-analytical clinical reasoning assessment.

    PubMed

    Monajemi, Alireza; Yaghmaei, Minoo

    2016-01-01

    Most contemporary clinical reasoning tests typically assess non-automatic thinking. Therefore, a test is needed to measure automatic reasoning or pattern recognition, which has been largely neglected in clinical reasoning tests. The Puzzle Test (PT) is dedicated to assessing automatic clinical reasoning in routine situations. The test was first introduced in 2009 by Monajemi et al. in the Olympiad for Medical Sciences Students. PT is an item format that has gained acceptance in medical education, but no detailed guidelines exist for this test's format, construction and scoring. In this article, a format is described and the steps to prepare and administer valid and reliable PTs are presented. PT examines a specific clinical reasoning task: pattern recognition. PT does not replace other clinical reasoning assessment tools. However, it complements them in strategies for assessing comprehensive clinical reasoning.

  5. Scientific research tools as an aid to Antarctic logistics

    NASA Astrophysics Data System (ADS)

    Dinn, Michael; Rose, Mike; Smith, Andrew; Fleming, Andrew; Garrod, Simon

    2013-04-01

    Logistics have always been a vital part of polar exploration and research. The more efficient those logistics can be made, the greater the likelihood that research programmes will be delivered on time, safely and to maximum scientific effectiveness. Over the last decade, the potential for symbiosis between logistics and some of the scientific research methods themselves has increased remarkably; suites of scientific tools can help to optimise logistic efforts, thereby enhancing the effectiveness of further scientific activity. We present one recent example of input to logistics from scientific activities, in support of the NERC iSTAR Programme, a major ice sheet research effort in West Antarctica. We used data output from a number of research tools, spanning a range of techniques and international agencies, to support the deployment of a tractor-traverse system into a remote area of mainland Antarctica. The tractor system was deployed from RRS Ernest Shackleton onto the Abbot Ice Shelf, then driven inland to the research area in Pine Island Glacier. Data from NASA ICEBRIDGE were used to determine the ice-front freeboard and surface gradients for the traverse route off the ice shelf and onwards into the continent. Quickbird high resolution satellite imagery provided clear images of route track and some insight into snow surface roughness. Polarview satellite data gave sea ice information in the Amundsen Sea, both multi-annual historical characteristics and real-time information during deployment. Likewise, meteorological data contributed historical information and was used during deployment. Finally, during the tractors' inland journey, ground-based high frequency radar was used to determine a safe, crevasse-free route.

  6. The Research Tools of the Virtual Astronomical Observatory

    NASA Astrophysics Data System (ADS)

    Hanisch, Robert J.; Berriman, G. B.; Lazio, T. J.; Project, VAO

    2013-01-01

    Astronomy is being transformed by the vast quantities of data, models, and simulations that are becoming available to astronomers at an ever-accelerating rate. The U.S. Virtual Astronomical Observatory (VAO) has been funded to provide an operational facility that is intended to be a resource for discovery and access of data, and to provide science services that use these data. Over the course of the past year, the VAO has been developing and releasing for community use five science tools: 1) "Iris", for dynamically building and analyzing spectral energy distributions, 2) a web-based data discovery tool that allows astronomers to identify and retrieve catalog, image, and spectral data on sources of interest, 3) a scalable cross-comparison service that allows astronomers to conduct pair-wise positional matches between very large catalogs stored remotely as well as between remote and local catalogs, 4) time series tools that allow astronomers to compute periodograms of the public data held at the NASA Star and Exoplanet Database (NStED) and the Harvard Time Series Center, and 5) a VO-aware release of the Image Reduction and Analysis Facility (IRAF) that provides transparent access to VO-available data collections and is SAMP-enabled, so that IRAF users can easily use tools such as Aladin and Topcat in conjunction with IRAF tasks. Additional VAO services will be built to make it easy for researchers to provide access to their data in VO-compliant ways, to build VO-enabled custom applications in Python, and to respond generally to the growing size and complexity of astronomy data. Acknowledgements: The Virtual Astronomical Observatory (VAO) is managed by the VAO, LLC, a non-profit company established as a partnership of the Associated Universities, Inc. and the Association of Universities for Research in Astronomy, Inc. The VAO is sponsored by the National Science Foundation and the National Aeronautics and Space Administration.

  7. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    NASA Astrophysics Data System (ADS)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ2 with respect to α , and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
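The α-selection idea in point (2) can be illustrated with a toy calculation. The sketch below is not the paper's algorithm: it assumes a hypothetical χ²(α) curve and simply picks α at the point of maximum curvature of log χ² versus log α, i.e. the crossover between the flat noise-fitting regime and the rapidly rising information-fitting regime.

```python
import math

def optimal_alpha(alphas, chi2s):
    """Pick the entropy weight at the noise-/information-fitting crossover:
    the point of maximum discrete curvature of log(chi2) vs log(alpha).
    Assumes alphas are sorted and (approximately) log-uniformly spaced."""
    y = [math.log(c) for c in chi2s]
    # second difference as a curvature proxy on the log-log curve
    curv = [y[i - 1] - 2.0 * y[i] + y[i + 1] for i in range(1, len(y) - 1)]
    i_best = max(range(len(curv)), key=lambda i: curv[i]) + 1
    return alphas[i_best]
```

With a synthetic χ²(α) that is flat for small α and grows like α² afterwards, the crossover is recovered at the knee of the curve.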

  8. Big Data & Learning Analytics: A Potential Way to Optimize eLearning Technological Tools

    ERIC Educational Resources Information Center

    García, Olga Arranz; Secades, Vidal Alonso

    2013-01-01

    In the information age, one of the most influential institutions is education. The recent emergence of MOOCS [Massively Open Online Courses] is a sample of the new expectations that are offered to university students. Basing decisions on data and evidence seems obvious, and indeed, research indicates that data-driven decision-making improves…

  9. Critical Race Theory and Interest Convergence as Analytic Tools in Teacher Education Policies and Practices

    ERIC Educational Resources Information Center

    Milner, H. Richard, IV

    2008-01-01

    In "The Report of the AERA Panel on Research and Teacher Education," Cochran-Smith and Zeichner's (2005) review of studies in the field of teacher education revealed that many studies lacked theoretical and conceptual grounding. The author argues that Derrick Bell's (1980) interest convergence, a principle of critical race theory, can be used as…

  10. New Tools for New Research in Psychiatry: A Scalable and Customizable Platform to Empower Data Driven Smartphone Research

    PubMed Central

    Torous, John; Kiang, Mathew V; Lorme, Jeanette

    2016-01-01

    Background A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-quality data. Objective Our aim is not the creation of yet another app per se but rather the establishment of a platform to collect research-quality smartphone raw sensor and usage pattern data. Our ultimate goal is to develop statistical, mathematical, and computational methodology to enable us and others to extract biomedical and clinical insights from smartphone data. Methods We report on the development and early testing of Beiwe, a research platform featuring a study portal, smartphone app, database, and data modeling and analysis tools designed and developed specifically for transparent, customizable, and reproducible biomedical research use, in particular for the study of psychiatric and neurological disorders. We also outline a proposed study using the platform for patients with schizophrenia. Results We demonstrate the passive data capabilities of the Beiwe platform and early results of its analytical capabilities. Conclusions Smartphone sensors and phone usage patterns, when coupled with appropriate statistical learning tools, are able to capture various social and behavioral manifestations of illnesses, in naturalistic settings, as lived and experienced by patients. The ubiquity of smartphones makes this type of moment-by-moment quantification of disease phenotypes highly scalable and, when integrated within a transparent research platform, presents tremendous opportunities for research, discovery, and patient health. PMID:27150677

  11. Designing and implementing full immersion simulation as a research tool.

    PubMed

    Munroe, Belinda; Buckley, Thomas; Curtis, Kate; Morris, Richard

    2016-05-01

    Simulation is a valuable research tool used to evaluate the clinical performance of devices, people and systems. The simulated setting may address concerns unique to complex clinical environments such as the Emergency Department, which make the conduct of research challenging. There is limited evidence available to inform the development of simulated clinical scenarios for the purpose of evaluating practice in research studies, with the majority of literature focused on designing simulated clinical scenarios for education and training. Distinct differences exist in scenario design when implemented in education compared with use in clinical research studies. Simulated scenarios used to assess practice in clinical research must not include any purposeful or planned teaching and must be developed with a high degree of validity and reliability. A new scenario design template was devised to develop two standardised simulated clinical scenarios for the evaluation of a new assessment framework for emergency nurses. The scenario development and validation processes undertaken are described and provide an evidence-informed guide to scenario development for future clinical research studies.

  12. Vaccinia Virus: A Tool for Research and Vaccine Development

    NASA Astrophysics Data System (ADS)

    Moss, Bernard

    1991-06-01

    Vaccinia virus is no longer needed for smallpox immunization, but now serves as a useful vector for expressing genes within the cytoplasm of eukaryotic cells. As a research tool, recombinant vaccinia viruses are used to synthesize biologically active proteins and analyze structure-function relations, determine the targets of humoral- and cell-mediated immunity, and investigate the immune responses needed for protection against specific infectious diseases. When more data on safety and efficacy are available, recombinant vaccinia and related poxviruses may be candidates for live vaccines and for cancer immunotherapy.

  13. Tissue fluid pressures - From basic research tools to clinical applications

    NASA Technical Reports Server (NTRS)

    Hargens, Alan R.; Akeson, Wayne H.; Mubarak, Scott J.; Owen, Charles A.; Gershuni, David H.

    1989-01-01

    This paper describes clinical applications of two basic research tools developed and refined in the past 20 years: the wick catheter (for measuring tissue fluid pressure) and the colloid osmometer (for measuring osmotic pressure). Applications of the osmometer include estimations of the reduced osmotic pressure of sickle-cell hemoglobin with deoxygenation, and of reduced swelling pressure of human nucleus pulposus with hydration or upon action of certain enzymes. Clinical uses of the wick-catheter technique include an improvement of diagnosis and treatment of acute and chronic compartment syndromes, the elucidation of the tissue pressure thresholds for neuromuscular dysfunction, and the development of a better tourniquet for orthopedics.

  14. Development of proteasome inhibitors as research tools and cancer drugs

    PubMed Central

    2012-01-01

    The proteasome is the primary site for protein degradation in mammalian cells, and proteasome inhibitors have been invaluable tools in clarifying its cellular functions. The anticancer agent bortezomib inhibits the major peptidase sites in the proteasome’s 20S core particle. It is a “blockbuster drug” that has led to dramatic improvements in the treatment of multiple myeloma, a cancer of plasma cells. The development of proteasome inhibitors illustrates the unpredictability, frustrations, and potential rewards of drug development but also emphasizes the dependence of medical advances on basic biological research. PMID:23148232

  15. Exploring positioning as an analytical tool for understanding becoming mathematics teachers' identities

    NASA Astrophysics Data System (ADS)

    Skog, Kicki; Andersson, Annica

    2015-03-01

    The aim of this article is to explore how a sociopolitical analysis can contribute to a deeper understanding of critical aspects for becoming primary mathematics teachers' identities during teacher education. The question we ask is the following: How may power relations in university settings affect becoming mathematics teachers' subject positioning? We elaborate on the elusive and interrelated concepts of identity, positioning and power, seen as dynamic and changeable. As these concepts represent three interconnected parts of research analysis in an on-going larger project, data from different sources will be used in this illustration. In this paper, we clarify the theoretical stance, ground the concepts historically and strive to connect them to research analysis. In this way, we show that power relations and subject positioning in social settings are critical aspects and need to be taken seriously into account if we aim at understanding becoming teachers' identities.

  16. HRMAS NMR spectroscopy combined with chemometrics as an alternative analytical tool to control cigarette authenticity.

    PubMed

    Shintu, Laetitia; Caldarelli, Stefano; Campredon, Mylène

    2013-11-01

    In this paper, we present for the first time the use of high-resolution magic angle spinning nuclear magnetic resonance (HRMAS NMR) spectroscopy combined with chemometrics as an alternative tool for the characterization of tobacco products from different commercial international brands as well as for the identification of counterfeits. Although cigarette filling is a very complex chemical mixture, we were able to discriminate between dark, bright, and additive-free cigarette blends belonging to six different commercially available filter-cigarette brands, using an approach for which no extraction procedure is required. Second, we focused our study on a specific worldwide-distributed brand for which established counterfeits were available. We discriminated those from their genuine counterparts with 100% accuracy using unsupervised multivariate statistical analysis. The counterfeits that we analyzed showed a higher amount of nicotine and solanesol and a lower content of sugars, all endogenous tobacco leaf metabolites. This preliminary study demonstrates the great potential of HRMAS NMR spectroscopy to help in controlling cigarette authenticity.

  17. Evaluation of Heat Flux Measurement as a New Process Analytical Technology Monitoring Tool in Freeze Drying.

    PubMed

    Vollrath, Ilona; Pauli, Victoria; Friess, Wolfgang; Freitag, Angelika; Hawe, Andrea; Winter, Gerhard

    2017-01-04

    This study investigates the suitability of heat flux measurement as a new technique for monitoring product temperature and critical end points during freeze drying. The heat flux sensor is tightly mounted on the shelf and measures non-invasively (no contact with the product) the heat transferred from shelf to vial. Heat flux data were compared to comparative pressure measurement, thermocouple readings, and Karl Fischer titration as current state-of-the-art monitoring techniques. The whole freeze drying process, including freezing (both by ramp freezing and controlled nucleation) and primary and secondary drying, was considered. We found that direct measurement of the transferred heat enables more insights into the thermodynamics of the freezing process. Furthermore, a vial heat transfer coefficient can be calculated from heat flux data, which ultimately provides a non-invasive method to monitor product temperature throughout primary drying. The end point of primary drying determined by heat flux measurements was in accordance with the one defined by thermocouples. During secondary drying, heat flux measurements could not indicate the progress of drying in the way that monitoring the residual moisture content can. In conclusion, heat flux measurements are a promising new non-invasive tool for lyophilization process monitoring and development using energy transfer as a control parameter.
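The vial heat transfer coefficient mentioned above follows from the steady-state balance Q = Kv · Av · (Tshelf − Tproduct); once Kv is calibrated, the same relation can be inverted to track product temperature from the measured heat flow. A minimal sketch under these assumptions (the single-vial lumped form and the variable names are illustrative, not taken from the paper):

```python
def vial_heat_transfer_coefficient(q_watts, area_m2, t_shelf_c, t_product_c):
    """Kv in W/(m^2*K) from measured heat flow and shelf/product temperatures."""
    return q_watts / (area_m2 * (t_shelf_c - t_product_c))

def product_temperature(q_watts, area_m2, kv, t_shelf_c):
    """Invert the heat balance to monitor product temperature non-invasively."""
    return t_shelf_c - q_watts / (area_m2 * kv)
```

For example, 0.2 W into a 10 cm² vial footprint with the shelf at −10 °C and the product at −30 °C gives Kv = 10 W/(m²·K).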

  18. Demonstration of FBRM as process analytical technology tool for dewatering processes via CST correlation.

    PubMed

    Cobbledick, Jeffrey; Nguyen, Alexander; Latulippe, David R

    2014-07-01

    The current challenges associated with the design and operation of net-energy positive wastewater treatment plants demand sophisticated approaches for the monitoring of polymer-induced flocculation. In anaerobic digestion (AD) processes, the dewaterability of the sludge is typically assessed from off-line lab-bench tests - the capillary suction time (CST) test is one of the most common. Focused beam reflectance measurement (FBRM) is a promising technique for real-time monitoring of critical performance attributes in large scale processes and is ideally suited for dewatering applications. The flocculation performance of twenty-four cationic polymers, that spanned a range of polymer size and charge properties, was measured using both the FBRM and CST tests. Analysis of the data revealed a decreasing monotonic trend; the samples that had the highest percent removal of particles less than 50 microns in size as determined by FBRM had the lowest CST values. A subset of the best performing polymers was used to evaluate the effects of dosage amount and digestate sources on dewatering performance. The results from this work show that FBRM is a powerful tool that can be used for optimization and on-line monitoring of dewatering processes.
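The decreasing monotonic trend reported above (higher FBRM fines removal, lower CST) can be quantified with a rank correlation. A sketch using Spearman's ρ without tie handling; the data values in the example are invented, not from the study:

```python
def spearman(xs, ys):
    """Spearman rank correlation for untied data: rho = 1 - 6*sum(d^2)/(n(n^2-1))."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))
```

A perfectly decreasing relationship between percent removal and CST yields ρ = −1.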

  19. Terahertz pulsed imaging, a novel process analytical tool to investigate the coating characteristics of push-pull osmotic systems.

    PubMed

    Malaterre, Vincent; Pedersen, Maireadh; Ogorka, Joerg; Gurny, Robert; Loggia, Nicoletta; Taday, Philip F

    2010-01-01

    The aim of this study was to investigate coating characteristics of push-pull osmotic systems (PPOS) using three-dimensional terahertz pulsed imaging (3D-TPI) and to detect physical alterations potentially impacting the drug release. The terahertz time-domain reflection signal was used to obtain information on both the spatial distribution of the coating thickness and the coating internal physical mapping. The results showed that (i) the thickness distribution of PPOS coating can be non-destructively analysed using 3D-TPI and (ii) internal physical alterations impacting the drug release kinetics were detectable by using the terahertz time-domain signal. Based on the results, the potential benefits of implementing 3D-TPI as a quality-control analytical tool were discussed.

  20. Development of a fast analytical tool to identify oil spillages employing infrared spectral indexes and pattern recognition techniques.

    PubMed

    Fresco-Rivera, P; Fernández-Varela, R; Gómez-Carracedo, M P; Ramírez-Villalobos, F; Prada, D; Muniategui, S; Andrade, J M

    2007-11-30

    A fast analytical tool based on attenuated total reflectance mid-IR spectrometry is presented to evaluate the origin of spilled hydrocarbons and to monitor their fate in the environment. Ten spectral band ratios are employed in univariate and multivariate studies (principal components analysis, cluster analysis, density functions - potential curves - and Kohonen self organizing maps). Two indexes monitor typical photooxidation processes, five are related to aromatic characteristics and three study aliphatic and branched chains. The case study considered here comprises 45 samples taken on beaches (from 2002 to 2005) after the Prestige carrier accident off the Galician coast and 104 samples corresponding to weathering studies deployed for the Prestige's fuel, four typical crude oils and a fuel oil. The univariate studies yield insightful views on the gross chemical evolution, whereas the multivariate studies allow for simple and straightforward elucidation of whether the unknown samples match the Prestige's fuel. In addition, a good differentiation of the weathering patterns of light and heavy products is obtained.
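Spectral band ratios of the kind described above are straightforward to compute from a baseline-corrected spectrum. In this sketch the band positions (a carbonyl band near 1700 cm⁻¹ ratioed against an aliphatic band near 1460 cm⁻¹) are illustrative assumptions, not the paper's ten index definitions:

```python
def band_ratio(spectrum, num_band, den_band, window=5.0):
    """Ratio of mean absorbances in a +/-window (cm^-1) region around each band.
    spectrum: dict mapping wavenumber (cm^-1) -> absorbance; both regions
    are assumed to contain at least one measured point."""
    def mean_near(center):
        vals = [a for w, a in spectrum.items() if abs(w - center) <= window]
        return sum(vals) / len(vals)
    return mean_near(num_band) / mean_near(den_band)
```

A high carbonyl/aliphatic ratio would then flag photooxidized (weathered) residues relative to fresh product.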

  1. Analytical aerodynamic model of a high alpha research vehicle wind-tunnel model

    NASA Technical Reports Server (NTRS)

    Cao, Jichang; Garrett, Frederick, Jr.; Hoffman, Eric; Stalford, Harold

    1990-01-01

    A 6 DOF analytical aerodynamic model of a high alpha research vehicle is derived. The derivation is based on wind-tunnel model data valid in the altitude-Mach flight envelope centered at 15,000 ft altitude and 0.6 Mach number with Mach range between 0.3 and 0.9. The analytical models of the aerodynamic coefficients are nonlinear functions of alpha with all control variables and other states fixed. Interpolation is required between the parameterized nonlinear functions. The lift and pitching moment coefficients have unsteady flow parts due to the time rate of change of angle of attack (alpha dot). The analytical models are plotted and compared with their corresponding wind-tunnel data. Piloted simulated maneuvers of the wind-tunnel model are used to evaluate the analytical model. The maneuvers considered are pitch-ups, 360 degree loaded and unloaded rolls, turn reversals, split S's, and level turns. The evaluation finds that (1) the analytical model is a good representation at Mach 0.6, (2) the longitudinal part is good for the Mach range 0.3 to 0.9, and (3) the lateral part is good for Mach numbers between 0.6 and 0.9. The computer simulations show that the storage requirement of the analytical model is about one tenth that of the wind-tunnel model and it runs twice as fast.
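The interpolation between the parameterized nonlinear functions can be sketched as a blend, in Mach number, between two tabulated coefficient curves in alpha. The linear blend and the function signature below are illustrative assumptions, not the paper's scheme:

```python
def interp_coeff(alpha_deg, mach, mach_lo, mach_hi, coeff_lo, coeff_hi):
    """Blend two Mach-indexed aerodynamic coefficient functions of alpha
    linearly in Mach. coeff_lo/coeff_hi: callables alpha -> coefficient,
    valid at mach_lo and mach_hi respectively."""
    t = (mach - mach_lo) / (mach_hi - mach_lo)
    return (1.0 - t) * coeff_lo(alpha_deg) + t * coeff_hi(alpha_deg)
```

Evaluating midway between two Mach breakpoints returns the average of the two curves at the given alpha.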

  2. Sharing Data and Analytical Resources Securely in a Biomedical Research Grid Environment

    PubMed Central

    Langella, Stephen; Hastings, Shannon; Oster, Scott; Pan, Tony; Sharma, Ashish; Permar, Justin; Ervin, David; Cambazoglu, B. Barla; Kurc, Tahsin; Saltz, Joel

    2008-01-01

    Objectives To develop a security infrastructure to support controlled and secure access to data and analytical resources in a biomedical research Grid environment, while facilitating resource sharing among collaborators. Design A Grid security infrastructure, called Grid Authentication and Authorization with Reliably Distributed Services (GAARDS), is developed as a key architecture component of the NCI-funded cancer Biomedical Informatics Grid (caBIG™). GAARDS is designed to support, in a distributed environment: 1) efficient provisioning and federation of user identities and credentials; 2) group-based access control support, with which resource providers can enforce policies based on community-accepted groups and local groups; and 3) management of a trust fabric so that policies can be enforced based on required levels of assurance. Measurements GAARDS is implemented as a suite of Grid services and administrative tools. It provides three core services: Dorian for management and federation of user identities, Grid Trust Service for maintaining and provisioning a federated trust fabric within the Grid environment, and Grid Grouper for enforcing authorization policies based on both local and Grid-level groups. Results The GAARDS infrastructure is available as a stand-alone system and as a component of the caGrid infrastructure. More information about GAARDS can be accessed at http://www.cagrid.org. Conclusions GAARDS provides a comprehensive system to address the security challenges associated with environments in which resources may be located at different sites, requests to access the resources may cross institutional boundaries, and user credentials are created, managed, and revoked dynamically in a decentralized manner. PMID:18308979
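Group-based authorization of the kind Grid Grouper enforces reduces, at its core, to checking the requester's group memberships (local or Grid-level) against the set of groups a resource's policy accepts. A toy sketch of that check, not the GAARDS API:

```python
def authorized(user_groups, policy_groups):
    """Grant access if the user belongs to at least one group accepted by
    the resource policy. Group names here are invented examples."""
    return not set(user_groups).isdisjoint(policy_groups)
```

In a real deployment the membership sets would come from the group service and be evaluated together with the trust-fabric assurance level; this sketch shows only the set-intersection decision.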

  3. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool at the analytical laboratory since they allow sensitive measurements of pollutants to be carried out by means of easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or even none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful both for inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  4. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay.

    PubMed

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of a biochemical marker, butyrylcholinesterase (BChE). The work should demonstrate the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. A RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated using Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests; some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the results' relevance.
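The RGB analysis step in the assay above amounts to averaging each color channel over the photographed strip region and relating the change in one channel to enzyme activity via a calibration line. A minimal sketch; the pixel values and the linear calibration below are invented for illustration, not the paper's calibration:

```python
def channel_means(pixels):
    """Mean (R, G, B) over an iterable of 8-bit pixel tuples from the strip photo."""
    n, sums = 0, [0, 0, 0]
    for r, g, b in pixels:
        sums[0] += r
        sums[1] += g
        sums[2] += b
        n += 1
    return tuple(s / n for s in sums)

def activity_from_blue(blue_mean, blank_blue, slope):
    """Hypothetical linear calibration: deeper indigo coloration lowers the
    measured blue-channel value relative to a blank strip."""
    return slope * (blank_blue - blue_mean)
```

In practice the pixel tuples would come from the camera image of the strip region; the calibration slope and blank value would be fitted against the reference assay.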

  5. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay

    PubMed Central

    Pohanka, Miroslav

    2015-01-01

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of a biochemical marker, butyrylcholinesterase (BChE). The work should demonstrate the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman’s assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone’s integrated camera. A RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated using Ellman’s assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests; some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the results’ relevance. PMID:26110404

  6. Island Explorations: Discovering Effects of Environmental Research-Based Lab Activities on Analytical Chemistry Students

    ERIC Educational Resources Information Center

    Tomasik, Janice Hall; LeCaptain, Dale; Murphy, Sarah; Martin, Mary; Knight, Rachel M.; Harke, Maureen A.; Burke, Ryan; Beck, Kara; Acevedo-Polakovich, I. David

    2014-01-01

    Motivating students in analytical chemistry can be challenging, in part because of the complexity and breadth of topics involved. Some methods that help encourage students and convey real-world relevancy of the material include incorporating environmental issues, research-based lab experiments, and service learning projects. In this paper, we…

  7. The Effects of Incentives on Workplace Performance: A Meta-Analytic Review of Research Studies

    ERIC Educational Resources Information Center

    Condly, Steven J.; Clark, Richard E.; Stolovitch, Harold D.

    2003-01-01

    A meta-analytic review of all adequately designed field and laboratory research on the use of incentives to motivate performance is reported. Of approximately 600 studies, 45 qualified. The overall average effect of all incentive programs in all work settings and on all work tasks was a 22% gain in performance. Team-directed incentives had a…

  8. The Nature and Effects of Transformational School Leadership: A Meta-Analytic Review of Unpublished Research

    ERIC Educational Resources Information Center

    Leithwood, Kenneth; Sun, Jingping

    2012-01-01

    Background: Using meta-analytic review techniques, this study synthesized the results of 79 unpublished studies about the nature of transformational school leadership (TSL) and its impact on the school organization, teachers, and students. This corpus of research associates TSL with 11 specific leadership practices. These practices, as a whole,…

  9. Development of an analytical tool to study power quality of AC power systems for large spacecraft

    NASA Technical Reports Server (NTRS)

    Kraft, L. Alan; Kankam, M. David

    1991-01-01

    A harmonic power flow program applicable to space power systems with sources of harmonic distortion is described. The algorithm is a modification of the Electric Power Research Institute's HARMFLO program which assumes a three phase, balanced, AC system with loads of harmonic distortion. The modified power flow program can be used with single phase, AC systems. Early results indicate that the required modifications and the models developed are quite adequate for the analysis of a 20 kHz testbed built by General Dynamics Corporation. This is demonstrated by the acceptable correlation of present results with published data. Although the results are not exact, the discrepancies are relatively small.

  10. Development of an analytical tool to study power quality of ac power systems for large spacecraft

    NASA Technical Reports Server (NTRS)

    Kraft, L. A.; Kankam, M. D.

    1991-01-01

    A harmonic power flow program applicable to space power systems with sources of harmonic distortion is described. The algorithm is a modification of Electric Power Research Institute's HARMFLO program which assumes a three-phase, balanced, ac system with loads of harmonic distortion. The modified power flow program can be used with single phase, ac systems. Early results indicate that the required modifications and the models developed are quite adequate for the analysis of a 20-kHz testbed built by General Dynamics Corporation. This is demonstrated by the acceptable correlation of the present results with published data. Although the results are not exact, the discrepancies are relatively small.
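    As a hedged aside, one standard power-quality figure that harmonic power flow studies of this kind report is total harmonic distortion (THD). The Python sketch below (invented magnitudes; not taken from the HARMFLO program) computes it from per-harmonic RMS voltages:

```python
import math

# Illustrative sketch: total harmonic distortion (THD) of a voltage waveform,
# THD = sqrt(sum of squared harmonic magnitudes) / fundamental magnitude.
def thd(harmonic_rms):
    """harmonic_rms: RMS magnitudes [V1, V2, V3, ...] starting at the fundamental."""
    fundamental, harmonics = harmonic_rms[0], harmonic_rms[1:]
    return math.sqrt(sum(v * v for v in harmonics)) / fundamental

# A fundamental of 120 V with small 3rd and 5th harmonics:
print(round(thd([120.0, 0.0, 6.0, 0.0, 3.6]), 4))  # 0.0583
```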

  11. An evaluation tool for collaborative clinical research centers.

    PubMed

    Tragus, Robin; Cody, Jannine D

    2013-06-01

    There is a need for metrics that describe the full range of services provided by a clinical research unit, given that services have expanded to include such things as investigator training, regulatory compliance monitoring, and budget negotiations. We developed a tool and methodology that allow tracking of these expanded services. This not only allowed us to describe the work of the research unit staff more accurately, but also to monitor the status of a study across its entire lifespan, from idea to publication. In addition to measuring work, the tool allows us to anticipate future needs in clinical staff and expertise because we are involved very early in study planning. We also expect that by analyzing these data from many studies over time, we will identify process barriers that will direct future program improvement.

  12. Multicenter patient records research: security policies and tools.

    PubMed

    Behlen, F M; Johnson, S B

    1999-01-01

    The expanding health information infrastructure offers the promise of new medical knowledge drawn from patient records. Such promise will never be fulfilled, however, unless researchers first address policy issues regarding the rights and interests of both the patients and the institutions who hold their records. In this article, the authors analyze the interests of patients and institutions in light of public policy and institutional needs. They conclude that the multicenter study, with Institutional Review Board approval of each study at each site, protects the interests of both. "Anonymity" is no panacea, since patient records are so rich in information that they can never be truly anonymous. Researchers must earn and respect the trust of the public, as responsible stewards of facts about patients' lives. The authors find that computer security tools are needed to administer multicenter patient records studies and describe simple approaches that can be implemented using commercial database products.

  13. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    PubMed

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One advantage of planar chromatography over its column counterpart is that each TLC run can be performed on a fresh, previously unused stationary phase; it is therefore possible to fractionate or separate complex samples with heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples such as fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of the Middle Pomerania region in northern Poland can easily be observed on the resulting micro-TLC chromatograms. This approach can serve as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that the micro-TLC-based analytical approach can be applied as an effective method for the search for internal standard (IS) substances. 
Generally, described methodology can be applied for fast fractionation or screening of the

  14. ARM Climate Research Facility: Outreach Tools and Strategies

    NASA Astrophysics Data System (ADS)

    Roeder, L.; Jundt, R.

    2009-12-01

    Sponsored by the Department of Energy, the ARM Climate Research Facility is a global scientific user facility for the study of climate change. To publicize progress and achievements and to reach new users, the ACRF uses a variety of Web 2.0 tools and strategies that build off of the program's comprehensive and well-established News Center (www.arm.gov/news). These strategies include: an RSS subscription service for specific news categories; an email "newsletter" distribution to the user community that compiles the latest News Center updates into a short summary with links; and a Facebook page that pulls information from the News Center and links to relevant information in other online venues, including those of our collaborators. The ACRF also interacts with users through field campaign blogs, like Discovery Channel's EarthLive, to share research experiences from the field. Increasingly, field campaign wikis are established to help ACRF researchers collaborate during the planning and implementation phases of their field studies; these include easy-to-use logs and image libraries to help record the campaigns. This vital reference information is used in developing outreach material that is shared in highlights, news, and Facebook. Other Web 2.0 tools that ACRF uses include Google Maps to help users visualize facility locations and aircraft flight patterns. Easy-to-use comment boxes are also available on many of the data-related web pages on www.arm.gov to encourage feedback. To provide additional opportunities for increased interaction with the public and user community, future Web 2.0 plans under consideration for ACRF include: evaluating field campaigns for Twitter and microblogging opportunities, adding public discussion forums to research highlight web pages, moving existing photos into albums on Flickr or Facebook, and building online video archives through YouTube.
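    The RSS subscription service mentioned above is a plain XML format that is easy to consume programmatically. The Python sketch below (feed content invented; not the ARM feed itself) pulls item titles from an RSS 2.0 document using the standard library:

```python
import xml.etree.ElementTree as ET

# Illustrative sketch: extracting item titles from an RSS 2.0 feed,
# the format behind news-subscription services like the one described above.
feed = """<rss version="2.0"><channel>
  <title>News Center</title>
  <item><title>New field campaign begins</title></item>
  <item><title>Data release announced</title></item>
</channel></rss>"""

root = ET.fromstring(feed)
titles = [item.findtext("title") for item in root.iter("item")]
print(titles)  # ['New field campaign begins', 'Data release announced']
```

    A real subscriber would fetch the feed over HTTP on a schedule; only the parsing step is shown here.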

  15. Near-infrared spectroscopy as a tool for driving research.

    PubMed

    Liu, Tao; Pelowski, Matthew; Pang, Changle; Zhou, Yuanji; Cai, Jianfeng

    2016-03-01

    Driving a motor vehicle requires various cognitive functions to process surrounding information, to guide appropriate actions, and especially to respond to or integrate with numerous contextual and perceptual hindrances or risks. It is, thus, imperative to examine driving performance and road safety from a perspective of cognitive neuroscience, which considers both the behaviour and the functioning of the brain. However, because of technical limitations of current brain imaging approaches, studies have primarily adopted driving games or simulators to present participants with simulated driving environments that may have less ecological validity. Near-infrared spectroscopy (NIRS) is a relatively new, non-invasive brain-imaging technique allowing measurement of brain activations in more realistic settings, even within real motor vehicles. This study reviews current NIRS driving research and explores NIRS' potential as a new tool to examine driving behaviour, along with various risk factors in natural situations, promoting our understanding about neural mechanisms of driving safety. Practitioner Summary: Driving a vehicle is dependent on a range of neurocognitive processing abilities. Near-infrared spectroscopy (NIRS) is a non-invasive brain-imaging technique allowing measurement of brain activation even in on-road studies within real motor vehicles. This study reviews current NIRS driving research and explores the potential of NIRS as a new tool to examine driving behaviour.

  16. Modeling as a research tool in poultry science.

    PubMed

    Gous, R M

    2014-01-01

    The World's Poultry Science Association (WPSA) is a long-established and unique organization that strives to advance knowledge and understanding of all aspects of poultry science and the poultry industry. Its 3 main aims are education, organization, and research. The WPSA Keynote Lecture, titled "Modeling as a research tool in poultry science," addresses 2 of these aims, namely, the value of modeling in research and education. The role of scientists is to put forward and then to test theories. These theories, or models, may be simple or highly complex, but they are aimed at improving our understanding of a system or the interaction between systems. In developing a model, the scientist must take into account existing knowledge, and in this process gaps in our knowledge of a system are identified. Useful ideas for research are generated in this way, and experiments may be designed specifically to address these issues. The resultant models become more accurate and more useful, and can be used in education and extension as a means of explaining many of the complex issues that arise in poultry science.

  17. Ultrasonic wavefield imaging: Research tool or emerging NDE method?

    NASA Astrophysics Data System (ADS)

    Michaels, Jennifer E.

    2017-02-01

    Ultrasonic wavefield imaging refers to acquiring full waveform data over a region of interest for waves generated by a stationary source. Although various implementations of wavefield imaging have existed for many years, the widespread availability of laser Doppler vibrometers that can acquire signals in the high kHz and low MHz range has resulted in a rapid expansion of fundamental research utilizing full wavefield data. In addition, inspection methods based upon wavefield imaging have been proposed for standalone nondestructive evaluation (NDE) with most of these methods coming from the structural health monitoring (SHM) community and based upon guided waves. If transducers are already embedded in or mounted on the structure as part of an SHM system, then a wavefield-based inspection can potentially take place with very little required disassembly. A frequently-proposed paradigm for wavefield NDE is its application as a follow-up inspection method using embedded SHM transducers as guided wave sources if the in situ SHM system generates an alarm. Discussed here is the broad role of wavefield imaging as it relates to ultrasonic NDE, both as a research tool and as an emerging NDE method. Examples of current research are presented based upon both guided and bulk wavefield imaging in metals and composites, drawing primarily from the author's work. Progress towards wavefield NDE is discussed in the context of defect detection and characterization capabilities, scan times, data quality, and required data analysis. Recent research efforts are summarized that can potentially enable wavefield NDE.

  18. NASA Human Research Wiki - An Online Collaboration Tool

    NASA Technical Reports Server (NTRS)

    Barr, Y. R.; Rasbury, J.; Johnson, J.; Barsten, K.; Saile, L.; Watkins, S. D.

    2011-01-01

    In preparation for exploration-class missions, the Exploration Medical Capability (ExMC) element of NASA's Human Research Program (HRP) has compiled a large evidence base, which previously was available only to persons within the NASA community. The evidence base is comprised of several types of data, for example: information on more than 80 medical conditions which could occur during space flight, derived from several sources (including data on incidence and potential outcomes of these medical conditions, as captured in the Integrated Medical Model's Clinical Finding Forms). In addition, approximately 35 gap reports are included in the evidence base, identifying current understanding of the medical challenges for exploration, as well as any gaps in knowledge and/or technology that would need to be addressed in order to provide adequate medical support for these novel missions. In an effort to make the ExMC information available to the general public and increase collaboration with subject matter experts within and outside of NASA, ExMC has developed an online collaboration tool, very similar to a wiki, titled the NASA Human Research Wiki. The platform chosen for this data sharing, and the potential collaboration it could generate, is a MediaWiki-based application that would house the evidence, allow "read only" access to all visitors to the website, and editorial access to credentialed subject matter experts who have been approved by the Wiki's editorial board. Although traditional wikis allow users to edit information in real time, the NASA Human Research Wiki includes a peer review process to ensure quality and validity of information. The wiki is also intended to be a pathfinder project for other HRP elements that may want to use this type of web-based tool. The wiki website will be released with a subset of the data described and will continue to be populated throughout the year.

  19. Proteomic analysis of synovial fluid as an analytical tool to detect candidate biomarkers for knee osteoarthritis.

    PubMed

    Liao, Weixiong; Li, Zhongli; Zhang, Hao; Li, Ji; Wang, Ketao; Yang, Yimeng

    2015-01-01

    We conducted research to detect the proteomic profiles in synovial fluid (SF) from knee osteoarthritis (OA) patients to better understand the pathogenesis and aetiology of OA. Our long-term goal is to identify reliable candidate biomarkers for OA in SF. The SF proteins obtained from 10 knee OA patients and 10 non-OA patients (9 of whom were patients with a meniscus injury in the knee; 1 had a discoid meniscus in the knee, and all exhibited intact articular cartilage) were separated by two-dimensional electrophoresis (2-DE). The repeatability of the obtained protein spots with regard to their intensity was tested via triplicate 2-DE of selected samples. The observed protein expression patterns were subjected to statistical analysis, and differentially expressed protein spots were identified via matrix-assisted laser desorption/ionisation-time of flight/time of flight mass spectrometry (MALDI-TOF/TOF MS). Our analyses showed low intrasample variability and clear intersample variation. Among the protein spots observed on the gels, there were 29 significant differences, of which 22 corresponded to upregulation and 7 to downregulation in the OA group. One of the upregulated protein spots was confirmed to be haptoglobin by mass spectrometry, and the levels of haptoglobin in SF are positively correlated with the severity of OA (r = 0.89, P < 0.001). This study showed that 2-DE could be used under standard conditions to screen SF samples and identify a small subset of proteins in SF that are potential markers associated with OA. Spots of interest identified by mass spectrometry, such as haptoglobin, may be associated with OA severity.
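    The reported association (r = 0.89) is a Pearson correlation coefficient. As a minimal sketch of that statistic, the Python code below computes Pearson's r from scratch on invented values (not the study's data):

```python
import math

# Illustrative sketch: Pearson correlation coefficient, the statistic behind
# the reported association between SF haptoglobin level and OA severity.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented example values (not the study's measurements):
severity = [1, 2, 3, 4]
haptoglobin = [0.9, 2.1, 2.9, 4.2]
print(round(pearson_r(severity, haptoglobin), 3))  # 0.996
```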

  20. High-resolution entrainment mapping of gastric pacing: a new analytical tool.

    PubMed

    O'Grady, Gregory; Du, Peng; Lammers, Wim J E P; Egbuji, John U; Mithraratne, Pulasthi; Chen, Jiande D Z; Cheng, Leo K; Windsor, John A; Pullan, Andrew J

    2010-02-01

    Gastric pacing has been investigated as a potential treatment for gastroparesis. New pacing protocols are required to improve symptom and motility outcomes; however, research progress has been constrained by a limited understanding of the effects of electrical stimulation on slow-wave activity. This study introduces high-resolution (HR) "entrainment mapping" for the analysis of gastric pacing and presents four demonstrations. Gastric pacing was initiated in a porcine model (typical amplitude 4 mA, pulse width 400 ms, period 17 s). Entrainment mapping was performed using flexible multielectrode arrays (

  1. Game analytics for game user research, part 1: a workshop review and case study.

    PubMed

    El-Nasr, Magy Seif; Desurvire, Heather; Aghabeigi, Bardia; Drachen, Anders

    2013-01-01

    The emerging field of game user research (GUR) investigates interaction between players and games and the surrounding context of play. Game user researchers have explored methods from, for example, human-computer interaction, psychology, interaction design, media studies, and the social sciences. They've extended and modified these methods for different types of digital games, such as social games, casual games, and serious games. This article focuses on quantitative analytics of in-game behavioral user data and its emergent use by the GUR community. The article outlines open problems emerging from several GUR workshops. In addition, a case study of a current collaboration between researchers and a game company demonstrates game analytics' use and benefits.
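    As a hedged illustration of the quantitative analytics of in-game behavioral data this abstract refers to, the Python sketch below (event log and field names invented) aggregates telemetry events into a per-level completion rate, a basic GUR-style metric:

```python
from collections import Counter

# Illustrative sketch: aggregating in-game behavioral telemetry into a
# per-level completion rate. The event log below is invented.
events = [
    {"player": "p1", "level": 1, "outcome": "complete"},
    {"player": "p2", "level": 1, "outcome": "quit"},
    {"player": "p1", "level": 2, "outcome": "complete"},
    {"player": "p2", "level": 1, "outcome": "complete"},
]

completions = Counter(e["level"] for e in events if e["outcome"] == "complete")
attempts = Counter(e["level"] for e in events)
rates = {lvl: completions[lvl] / attempts[lvl] for lvl in attempts}
print(rates)  # level 1: 2 of 3 attempts completed; level 2: 1 of 1
```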

  2. Review and evaluation of electronic health records-driven phenotype algorithm authoring tools for clinical and translational research

    PubMed Central

    Rasmussen, Luke V; Shaw, Pamela L; Jiang, Guoqian; Kiefer, Richard C; Mo, Huan; Pacheco, Jennifer A; Speltz, Peter; Zhu, Qian; Denny, Joshua C; Pathak, Jyotishman; Thompson, William K; Montague, Enid

    2015-01-01

    Objective To review and evaluate available software tools for electronic health record–driven phenotype authoring in order to identify gaps and needs for future development. Materials and Methods Candidate phenotype authoring tools were identified through (1) literature search in four publication databases (PubMed, Embase, Web of Science, and Scopus) and (2) a web search. A collection of tools was compiled and reviewed after the searches. A survey was designed and distributed to the developers of the reviewed tools to discover their functionalities and features. Results Twenty-four different phenotype authoring tools were identified and reviewed. Developers of 16 of these identified tools completed the evaluation survey (67% response rate). The surveyed tools showed commonalities but also varied in their capabilities in algorithm representation, logic functions, data support and software extensibility, search functions, user interface, and data outputs. Discussion Positive trends identified in the evaluation included: algorithms can be represented in both computable and human-readable formats; and most tools offer a web interface for easy access. However, issues were also identified: many tools were lacking advanced logic functions for authoring complex algorithms; the ability to construct queries that leveraged unstructured data was not widely implemented; and many tools had limited support for plug-ins or external analytic software. Conclusions Existing phenotype authoring tools could enable clinical researchers to work with electronic health record data more efficiently, but gaps still exist in terms of the functionalities of such tools. The present work can serve as a reference point for the future development of similar tools. PMID:26224336

  3. Effective Tooling for Linked Data Publishing in Scientific Research

    SciTech Connect

    Purohit, Sumit; Smith, William P.; Chappell, Alan R.; West, Patrick; Lee, Benno; Stephan, Eric G.; Fox, Peter

    2016-02-05

    Challenges that make it difficult to find, share, and combine published data, such as data heterogeneity and resource discovery, have led to increased adoption of semantic data standards and data publishing technologies. To make data more accessible, interconnected, and discoverable, some domains are being encouraged to publish their data as Linked Data. Consequently, this trend greatly increases the amount of data that semantic web tools are required to process, store, and interconnect. In attempting to process and manipulate large data sets, tools ranging from simple text editors to modern triplestores eventually break down upon reaching undefined thresholds. This paper offers a systematic approach that data publishers can use to categorize suitable tools to meet their data publishing needs. We present a real-world use case, the Resource Discovery for Extreme Scale Collaboration (RDESC), which features a scientific dataset (maximum size of 1.4 billion triples) used to evaluate a toolbox for data publishing in climate research. This paper also introduces a semantic data publishing software suite developed for the RDESC project.

  4. Direct writing of metal nanostructures: lithographic tools for nanoplasmonics research.

    PubMed

    Leggett, Graham J

    2011-03-22

    Continued progress in the fast-growing field of nanoplasmonics will require the development of new methods for the fabrication of metal nanostructures. Optical lithography provides a continually expanding tool box. Two-photon processes, as demonstrated by Shukla et al. (doi: 10.1021/nn103015g), enable the fabrication of gold nanostructures encapsulated in dielectric material in a simple, direct process and offer the prospect of three-dimensional fabrication. At higher resolution, scanning probe techniques enable nanoparticle placement by localized oxidation, and near-field sintering of nanoparticulate films enables direct writing of nanowires. Direct laser "printing" of single gold nanoparticles offers a remarkable capability for the controlled fabrication of model structures for fundamental studies, particle-by-particle. Optical methods continue to provide a powerful support for research into metamaterials.

  5. Conservation of Mass: An Important Tool in Renal Research.

    PubMed

    Sargent, John A

    2016-05-01

    The dialytic treatment of end-stage renal disease (ESRD) patients is based on control of solute concentrations and management of fluid volume. The application of the principle of conservation of mass, or mass balance, is fundamental to the study of such treatment and can be extended to chronic kidney disease (CKD) in general. This review discusses the development and use of mass conservation and transport concepts, incorporated into mathematical models. These concepts, which can be applied to a wide range of solutes of interest, represent a powerful tool for quantitatively guided studies of dialysis issues currently and into the future. Incorporating these quantitative concepts in future investigations is key to achieving positive control of known solutes, and in the analysis of such studies; to relate future research to known results of prior studies; and to help in the understanding of the obligatory physiological perturbations that result from dialysis therapy.
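    One textbook application of mass balance in dialysis is the single-compartment (single-pool) model V dC/dt = -K C, whose solution is C(t) = C0 exp(-Kt/V). The Python sketch below uses this classic form with invented parameter values; it is a minimal illustration, not any specific model from this review:

```python
import math

# Illustrative sketch: single-pool mass-balance model of solute removal.
# V * dC/dt = -K * C  =>  C(t) = C0 * exp(-K * t / V)
def concentration(c0, clearance, volume, t):
    """c0: initial concentration; clearance K and volume V in consistent units."""
    return c0 * math.exp(-clearance * t / volume)

# Invented illustrative numbers: K = 0.2 L/min, V = 35 L, 240-minute treatment.
c_end = concentration(100.0, 0.2, 35.0, 240)
print(round(c_end, 1))  # 25.4
```

    Real kinetic modeling adds solute generation, fluid shifts, and multiple compartments; the exponential decay above is only the simplest mass-balance case.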

  6. Interactive Publication: The document as a research tool

    PubMed Central

    Thoma, George R.; Ford, Glenn; Antani, Sameer; Demner-Fushman, Dina; Chung, Michael; Simpson, Matthew

    2010-01-01

    The increasing prevalence of multimedia and research data generated by scientific work affords an opportunity to reformulate the idea of a scientific article from the traditional static document, or even one with links to supplemental material in remote databases, to a self-contained, multimedia-rich interactive publication. This paper describes our concept of such a document, and the design of tools for authoring (Forge) and visualization/analysis (Panorama). They are platform-independent applications written in Java, developed in Eclipse using its Rich Client Platform (RCP) framework. Both applications operate on PDF files with links to XML files that define the media type, location, and action to be performed. We also briefly cite the challenges posed by the potentially large size of interactive publications, the need for evaluating their value to improved comprehension and learning, and the need for their long-term preservation by the National Library of Medicine and other libraries. PMID:20657757

  7. Rethinking the Role of Information Technology-Based Research Tools in Students' Development of Scientific Literacy

    NASA Astrophysics Data System (ADS)

    van Eijck, Michiel; Roth, Wolff-Michael

    2007-06-01

    Given the central place IT-based research tools take in scientific research, the marginal role such tools currently play in science curricula is dissatisfying from the perspective of making students scientifically literate. To appropriately frame the role of IT-based research tools in science curricula, we propose a framework developed to understand the use of tools in human activity, namely cultural-historical activity theory (CHAT). Accordingly, IT-based research tools constitute central moments of scientific research activity and can be understood neither apart from the objectives of that activity nor apart from the cultural-historically determined forms of activity (praxis) in which human subjects participate. Based on empirical data involving students participating in research activity, we point out how an appropriate account of IT-based research tools involves subjects' use of tools with respect to the objectives of research activity and the contribution to the praxis of research. We propose to reconceptualize the role of IT-based research tools as contributing to scientific literacy when students apply these tools with respect to the objectives of the research activity and contribute to the praxis of research by evaluating and modifying the application of these tools. We conclude this paper by sketching the educational implications of this reconceptualized role of IT-based research tools.

  8. [Analytics of ambiguity: methodological strategy to the phenomenological research in health].

    PubMed

    Sena, Edite Lago da Silva; Gonçalves, Lucia Hisako Takase; Granzotto, Marcos José Müller; Carvalho, Patricia Anjos Lima; Reis, Helca Franciolli Teixeira

    2010-12-01

    The strategy presented in this paper, called the Analytics of ambiguity, responds to the need to understand findings in research based on Merleau-Ponty's phenomenology. It was developed through a study of descriptions of lived experiences from ten family members of a Mutual Help Group for caregivers of Alzheimer's patients, conducted at a university in Florianopolis, Santa Catarina, Brazil. The descriptions were obtained through interviews based on intercorporeal experience, during the writing of a Doctoral Dissertation in Nursing. The application of the Analytics of ambiguity in this study is consistent with other similar studies and opens up possibilities for understanding findings in phenomenological research, specifically research based on the experiential ontology of Merleau-Ponty, for it enables us to recognize consciousness as something non-perceptible and perception as an always ambiguous process.

  9. Research Tensions with the Use of Timed Numeracy Fluency Assessments as a Research Tool

    ERIC Educational Resources Information Center

    Stott, Debbie; Graven, Mellony

    2013-01-01

    In this paper, we describe how we came to use timed fluency activities, along with personal learner reflections on those activities, in our after-school maths club as a complementary research and development tool for assessing the changing levels of learners' mathematical proficiency over time. We use data from one case-study after-school maths…

  10. Nutriproteomics: a promising tool to link diet and diseases in nutritional research.

    PubMed

    Ganesh, Vijayalakshmi; Hettiarachchy, Navam S

    2012-10-01

    Nutriproteomics is a nascent research arena, exploiting the dynamics of proteomic tools to characterize molecular and cellular changes in protein expression and function on a global level as well as judging the interaction of proteins with food nutrients. As nutrients are present in complex mixtures, the bioavailability and functions of each nutrient can be influenced by the presence of other nutrients/compounds and interactions. The first half of this review focuses on the techniques used as nutriproteomic tools for identification, quantification, characterization and analyses of proteins including, two-dimensional polyacrylamide electrophoresis, chromatography, mass spectrometry, microarray and other emerging technologies involving visual proteomics. The second half narrates the potential of nutriproteomics in medical and nutritional research for revolutionizing biomarker and drug development, nutraceutical discovery, biological process modeling, preclinical nutrition linking diet and diseases and structuring ways to a personalized nutrition. Though several challenges such as protein dynamics, analytical complexity, cost and resolution still exist, the scope of applying proteomics to nutrition is rapidly expanding and promising as more holistic strategies are emerging.

  11. Alerting strategies in computerized physician order entry: a novel use of a dashboard-style analytics tool in a children's hospital.

    PubMed

    Reynolds, George; Boyer, Dean; Mackey, Kevin; Povondra, Lynne; Cummings, Allana

    2008-11-06

    Utilizing a commercially available business analytics tool offering dashboard-style graphical indicators and a data warehouse strategy, we have developed an interactive, web-based platform that allows near-real-time analysis of CPOE adoption by hospital area and practitioner specialty. Clinical Decision Support (CDS) metrics include the percentage of alerts that result in a change in clinician decision-making. This tool facilitates adjustments in alert limits in order to reduce alert fatigue.

  12. Models - Another tool for use in global change research

    SciTech Connect

    Wullschleger, S.D.; Baldocchi, D.D.; King, A.W.; Post, W.M.

    1994-06-01

    Models are increasingly being used in the plant sciences to integrate and extrapolate information derived from laboratory and field investigations. To illustrate the utility of models in global change research, a series of leaf, canopy, ecosystem, and global-scale models are used to explore the response of trees to atmospheric CO2 enrichment. A biochemical model highlights the effects of elevated CO2 and temperature on photosynthesis, the consequences of Rubisco down-regulation for leaf and canopy carbon gain, and the relationships among stomatal conductance, transpiration, leaf area, and canopy energy balance. A forest succession model examines the effects of CO2 on species composition and forest productivity, while a model of the global carbon cycle illustrates the effects of rising CO2 on terrestrial carbon storage and the interaction of this effect with temperature. We conclude that models are appropriate tools for use both in guiding existing studies and in identifying new hypotheses for future research.

  13. The GATO gene annotation tool for research laboratories.

    PubMed

    Fujita, A; Massirer, K B; Durham, A M; Ferreira, C E; Sogayar, M C

    2005-11-01

    Large-scale genome projects have generated a rapidly increasing number of DNA sequences. Therefore, development of computational methods to rapidly analyze these sequences is essential for progress in genomic research. Here we present an automatic annotation system for preliminary analysis of DNA sequences. The gene annotation tool (GATO) is a Bioinformatics pipeline designed to facilitate routine functional annotation and easy access to annotated genes. It was designed in view of the frequent need of genomic researchers to access data pertaining to a common set of genes. In the GATO system, annotation is generated by querying some of the Web-accessible resources and the information is stored in a local database, which keeps a record of all previous annotation results. GATO may be accessed from anywhere via the Internet, or may be run locally if a large number of sequences are to be annotated. It is implemented in PHP and Perl and may be run on any suitable Web server. Usually, installation and application of annotation systems require experience and are time consuming, but GATO is simple and practical, allowing anyone with basic skills in informatics to use it without any special training. GATO can be downloaded at [http://mariwork.iq.usp.br/gato/]. Minimum computer free space required is 2 MB.

  14. GPCR-targeting nanobodies: attractive research tools, diagnostics, and therapeutics.

    PubMed

    Mujić-Delić, Azra; de Wit, Raymond H; Verkaar, Folkert; Smit, Martine J

    2014-05-01

    G-protein-coupled receptors (GPCRs) represent a major therapeutic target class. A large proportion of marketed drugs exert their effect through modulation of GPCR function, and GPCRs have been successfully targeted with small molecules. Yet, the number of small new molecular entities targeting GPCRs that have been approved as therapeutics in the past decade has been limited. With new and improved immunization-related technologies and advances in GPCR purification and expression techniques, antibody-based targeting of GPCRs has gained attention. The serendipitous discovery of a unique class of heavy chain antibodies (hcAbs) in the sera of camelids may provide novel GPCR-directed therapies. Antigen-binding fragments of hcAbs, also referred to as nanobodies, combine the advantages of both small molecules (e.g., molecular cavity binding, low production costs) and monoclonal antibodies (e.g., high affinity and specificity). Nanobodies are gaining ground as therapeutics and are also starting to find application as diagnostics and as high-quality tools in GPCR research. Herein, we review recent advances in the use of nanobodies in GPCR research.

  15. VISAD: an interactive and visual analytical tool for the detection of behavioral anomalies in maritime traffic data

    NASA Astrophysics Data System (ADS)

    Riveiro, Maria; Falkman, Göran; Ziemke, Tom; Warston, Håkan

    2009-05-01

    Monitoring the surveillance of large sea areas normally involves the analysis of huge quantities of heterogeneous data from multiple sources (radars, cameras, automatic identification systems, reports, etc.). The rapid identification of anomalous behavior or any threat activity in the data is an important objective for enabling homeland security. While it is worth acknowledging that many existing mining applications support identification of anomalous behavior, autonomous anomaly detection systems are rarely used in the real world. There are two main reasons: (1) the detection of anomalous behavior is normally not a well-defined and structured problem and therefore, automatic data mining approaches do not work well and (2) the difficulties that these systems have regarding the representation and employment of the prior knowledge that the users bring to their tasks. In order to overcome these limitations, we believe that human involvement in the entire discovery process is crucial. Using a visual analytics process model as a framework, we present VISAD: an interactive, visual knowledge discovery tool for supporting the detection and identification of anomalous behavior in maritime traffic data. VISAD supports the insertion of human expert knowledge in (1) the preparation of the system, (2) the establishment of the normal picture and (3) in the actual detection of rare events. For each of these three modules, VISAD implements different layers of data mining, visualization and interaction techniques. Thus, the detection procedure becomes transparent to the user, which increases his/her confidence and trust in the system and overall, in the whole discovery process.

  16. Extensions of the Johnson-Neyman Technique to Linear Models With Curvilinear Effects: Derivations and Analytical Tools.

    PubMed

    Miller, Jason W; Stromeyer, William R; Schwieterman, Matthew A

    2013-03-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way interactions in several types of linear models, this method has not been extended to include quadratic terms or more complicated models involving quadratic terms and interactions. Curvilinear relations of this type are incorporated in several theories in the social sciences. This article extends the J-N method to such linear models along with presenting freely available online tools that implement this technique as well as the traditional pick-a-point approach. Algebraic and graphical representations of the proposed J-N extension are provided. An example is presented to illustrate the use of these tools and the interpretation of findings. Issues of reliability as well as "spurious moderator" effects are discussed along with recommendations for future research.
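
The computational core of the J-N technique described above is small enough to sketch directly. Setting the squared simple slope, (b1 + b3*z)^2, equal to t_crit^2 times its sampling variance, Var(b1) + 2z*Cov(b1,b3) + z^2*Var(b3), yields a quadratic in the moderator z whose roots bound the region of significance. The helper name `jn_region` and the numeric inputs below are invented for illustration; this is not the authors' online tool.

```python
import math

def jn_region(b1, b3, var_b1, var_b3, cov_b13, t_crit):
    """Solve for moderator values z where the simple slope b1 + b3*z
    sits exactly at the significance boundary |t| = t_crit.
    Returns the two real roots (sorted), or None if the boundary
    is never crossed."""
    # t^2 = (b1 + b3*z)^2 / (var_b1 + 2*z*cov_b13 + z^2*var_b3)
    # Setting t = t_crit gives the quadratic a*z^2 + b*z + c = 0:
    a = b3 ** 2 - t_crit ** 2 * var_b3
    b = 2 * (b1 * b3 - t_crit ** 2 * cov_b13)
    c = b1 ** 2 - t_crit ** 2 * var_b1
    disc = b ** 2 - 4 * a * c
    if a == 0 or disc < 0:
        return None
    r1 = (-b - math.sqrt(disc)) / (2 * a)
    r2 = (-b + math.sqrt(disc)) / (2 * a)
    return tuple(sorted((r1, r2)))

# Made-up coefficients and sampling (co)variances, for illustration only:
bounds = jn_region(b1=0.5, b3=0.3, var_b1=0.04, var_b3=0.01,
                   cov_b13=0.005, t_crit=2.0)
print(bounds)  # roughly (-4.83, -0.37)
```

With these inputs the leading coefficient a is positive, so the simple slope is nominally significant for moderator values outside the returned interval.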

  17. Episcopic 3D Imaging Methods: Tools for Researching Gene Function

    PubMed Central

    Weninger, Wolfgang J; Geyer, Stefan H

    2008-01-01

    This work aims at describing episcopic 3D imaging methods and at discussing how these methods can contribute to researching the genetic mechanisms driving embryogenesis and tissue remodelling, and the genesis of pathologies. Several episcopic 3D imaging methods exist. The most advanced are capable of generating high-resolution volume data (voxel sizes from 0.5x0.5x1 µm upwards) of small to large embryos of model organisms and tissue samples. Besides anatomy and tissue architecture, gene expression and gene product patterns can be analyzed three dimensionally in their precise anatomical and histological context with the aid of whole mount in situ hybridization or whole mount immunohistochemical staining techniques. Episcopic 3D imaging techniques were and are employed for analyzing the precise morphological phenotype of experimentally malformed, randomly produced, or genetically engineered embryos of biomedical model organisms. It has been shown that episcopic 3D imaging is also suited to describing the spatial distribution of genes and gene products during embryogenesis, and that it can be used for analyzing tissue samples of adult model animals and humans. The latter offers the possibility of using episcopic 3D imaging techniques for researching the causality and treatment of pathologies or for staging cancer. Such applications, however, are not yet routine and currently only preliminary results are available. We conclude that, although episcopic 3D imaging is in its very beginnings, it represents an upcoming methodology which, in the short term, will become an indispensable tool for researching the genetic regulation of embryo development as well as the genesis of malformations and diseases. PMID:19452045

  18. Enabling laboratory EUV research with a compact exposure tool

    NASA Astrophysics Data System (ADS)

    Brose, Sascha; Danylyuk, Serhiy; Tempeler, Jenny; Kim, Hyun-su; Loosen, Peter; Juschkin, Larissa

    2016-03-01

    In this work we present the capabilities of the extreme ultraviolet laboratory exposure tool (EUVLET) designed and realized at RWTH Aachen, Chair for the Technology of Optical Systems (TOS), in cooperation with the Fraunhofer Institute for Laser Technology (ILT) and Bruker ASC GmbH. The main purpose of this laboratory setup is direct application in research facilities and companies with small-batch production, where the fabrication of high-resolution periodic arrays over large areas is required. The setup can also be utilized for resist characterization and evaluation of pre- and post-exposure processing. The tool follows the Talbot lithography approach: it utilizes a partially coherent discharge-produced plasma (DPP) source and reduces the other critical components to a transmission grating, the photoresist-coated wafer, and the positioning system for wafer and grating. To identify the limits of this approach, each component is first analyzed and optimized separately, and the relations between these components are identified. The EUV source has been optimized to achieve the best values for spatial and temporal coherence. Phase-shifting and amplitude transmission gratings have been fabricated and exposed. Several commercially available electron beam resists and one EUV resist have been characterized by open-frame exposures to determine their contrast under EUV radiation. A cold development procedure has been performed to further increase the resist contrast. Analysis of the exposure results demonstrates that with amplitude masks only a 1:1 copy of the mask structure can be fully resolved. The phase-shift masks offer higher first-order diffraction efficiency and allow a demagnification of the mask structure in the achromatic Talbot plane.

  19. Haystack, a web-based tool for metabolomics research

    PubMed Central

    2014-01-01

    Background Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. Results To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown at two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Conclusion Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data is described. 
It offers users a range of data visualization
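
Haystack's core binning step, converting the chromatogram's mass dimension into fixed-width interval variables, can be approximated in a few lines. This is a hypothetical sketch, not Haystack's actual implementation: `bin_spectrum`, the bin width and the m/z range are all illustrative choices.

```python
def bin_spectrum(peaks, mz_min=100.0, mz_max=1000.0, width=1.0):
    """Collapse a list of (m/z, intensity) peaks into fixed-width bins,
    giving one interval variable per bin (a 'fingerprint' vector)."""
    n_bins = int((mz_max - mz_min) / width)
    vector = [0.0] * n_bins
    for mz, intensity in peaks:
        if mz_min <= mz < mz_max:
            vector[int((mz - mz_min) / width)] += intensity
    return vector

# Two toy samples; peaks falling in the same 1-Da bin are summed.
sample_a = bin_spectrum([(150.2, 40.0), (150.7, 10.0), (423.1, 5.0)])
sample_b = bin_spectrum([(150.4, 35.0), (610.9, 8.0)])
print(sample_a[50], sample_b[50])  # → 50.0 35.0
```

Each sample then becomes one fixed-length vector, so a matrix of samples can be fed directly into PCA or cluster analysis, as the abstract describes.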

  20. The relevance of attachment research to psychoanalysis and analytic social psychology.

    PubMed

    Bacciagaluppi, M

    1994-01-01

    The extensive empirical research generated by attachment theory is briefly reviewed, with special reference to transgenerational transmission of attachment patterns, internal working models, cross-cultural, and longitudinal studies. It is claimed that attachment theory and research support the alternative psychoanalytic approach initiated by Ferenczi, especially as regards the re-evaluation of real-life traumatic events, the occurrence of personality splits after childhood trauma, and the aggravation of trauma due to its denial by adults. The concepts of transgenerational transmission and of alternative developmental pathways are further contributions to an alternative psychoanalytic framework. Finally, attention is called to the relevance of the cross-cultural studies to Fromm's analytic social psychology.

  1. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    ERIC Educational Resources Information Center

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  2. Informetric Theories and Methods for Exploring the Internet: An Analytical Survey of Recent Research Literature.

    ERIC Educational Resources Information Center

    Bar-Ilan, Judit; Peritz, Bluma C.

    2002-01-01

    Presents a selective review of research based on the Internet, using bibliometric and informetric methods and tools. Highlights include data collection methods on the Internet, including surveys, logging, and search engines; and informetric analysis, including citation analysis and content analysis. (Contains 78 references.) (Author/LRW)

  3. IT Tools for Teachers and Scientists, Created by Undergraduate Researchers

    NASA Astrophysics Data System (ADS)

    Millar, A. Z.; Perry, S.

    2007-12-01

    Interns in the Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) program conduct computer science research for the benefit of earthquake scientists and have created products in growing use within the SCEC education and research communities. SCEC/UseIT comprises some twenty undergraduates who combine their varied talents and academic backgrounds to achieve a Grand Challenge that is formulated around the needs of SCEC scientists and educators and that reflects the value SCEC places on the integration of computer science and the geosciences. In meeting the challenge, students learn to work on multidisciplinary teams and to tackle complex problems with no guaranteed solutions. Meanwhile, their efforts bring fresh perspectives and insight to the professionals with whom they collaborate, and consistently produce innovative, useful tools for research and education. The 2007 Grand Challenge was to design and prototype serious games to communicate important earthquake science concepts. Interns broke themselves into four game teams, the Educational Game, the Training Game, the Mitigation Game and the Decision-Making Game, and created four diverse games with topics from elementary plate tectonics to earthquake risk mitigation, with intended players ranging from elementary students to city planners. The games were designed to be versatile, to accommodate variation in the knowledge base of the player; and extensible, to accommodate future additions. The games are played on a web browser or from within SCEC-VDO (Virtual Display of Objects). SCEC-VDO, also engineered by UseIT interns, is 4D, interactive visualization software that enables integration and exploration of datasets and models such as faults, earthquake hypocenters and ruptures, digital elevation models, satellite imagery, global isochrons, and earthquake prediction schemes.
SCEC-VDO enables the user to create animated movies during a session, and is now part

  4. Prompt nuclear analytical techniques for material research in accelerator driven transmutation technologies: Prospects and quantitative analyses

    NASA Astrophysics Data System (ADS)

    Vacík, J.; Hnatowicz, V.; Červená, J.; Peřina, V.; Mach, R.; Peka, I.

    1998-04-01

    Accelerator driven transmutation technology (ADTT) is a promising route toward the disposal of spent nuclear fuel, nuclear waste and weapons-grade Pu. The ADTT facility comprises a high-current (proton) accelerator supplying a subcritical reactor assembly with spallation neutrons. The reactor part is supposed to be cooled by molten fluorides or metals which serve, at the same time, as a carrier of nuclear fuel. The assumed high working temperature (400-600°C) and high radiation load in the subcritical reactor and spallation neutron source raise the problem of the optimal choice of ADTT construction materials, especially regarding their radiation and corrosion resistance when in contact with liquid working media. The use of prompt nuclear analytical techniques in ADTT-related material research is considered, and examples of preliminary analytical results obtained using the neutron depth profiling method are shown for illustration.

  5. The capsicum transcriptome DB: a "hot" tool for genomic research.

    PubMed

    Góngora-Castillo, Elsa; Fajardo-Jaime, Rubén; Fernández-Cortes, Araceli; Jofre-Garfias, Alba E; Lozoya-Gloria, Edmundo; Martínez, Octavio; Ochoa-Alejo, Neftalí; Rivera-Bustamante, Rafael

    2012-01-01

    Chili pepper (Capsicum annuum) is an economically important crop with no available public genome sequence. We describe a genomic resource to facilitate Capsicum annuum research. A collection of Expressed Sequence Tags (ESTs) derived from five C. annuum organs (root, stem, leaf, flower and fruit) was sequenced using the Sanger method, and multiple leaf transcriptomes were deeply sampled using GS-pyrosequencing. A hybrid assembly of 1,324,516 raw reads yielded 32,314 high quality contigs as validated by coverage and identity analysis with existing pepper sequences. Overall, 75.5% of the contigs had significant sequence similarity to entries in nucleic acid and protein databases; 23% of the sequences have not been previously reported for C. annuum and expand sequence resources for this species. A MySQL database and a user-friendly Web interface were constructed with search tools that permit queries of the ESTs including sequence, functional annotation, Gene Ontology classification, metabolic pathways, and assembly information. The Capsicum Transcriptome DB is freely available at http://www.bioingenios.ira.cinvestav.mx:81/Joomla/

  6. Microgravity as a research tool to improve US agriculture

    NASA Astrophysics Data System (ADS)

    Bula, R. J.; Stankovic, Bratislav

    2000-01-01

    Crop production and utilization are undergoing significant modifications and improvements that emanate from adaptation of recently developed plant biotechnologies. Several innovative technologies will impact US agriculture in the next century. One of these is the transfer of desirable genes from organisms to economically important crop species in a way that cannot be accomplished with traditional plant breeding techniques. Such plant genetic engineering offers opportunities to improve crop species for a number of characteristics as well as use as source materials for specific medical and industrial applications. Although plant genetic engineering is having an impact on development of new crop cultivars, several major constraints limit the application of this technology to selected crop species and genotypes. Consequently, gene transfer systems that overcome these constraints would greatly enhance development of new crop materials. If results of a recent gene transfer experiment conducted in microgravity during a Space Shuttle mission are confirmed, and with the availability of the International Space Station as a permanent space facility, commercial plant transformation activity in microgravity could become a new research tool to improve US agriculture.

  7. Genetic research in schizophrenia: new tools and future perspectives.

    PubMed

    Bertram, Lars

    2008-09-01

    Genetically, schizophrenia is a complex disease whose pathogenesis is likely governed by a number of different risk factors. While substantial efforts have been made to identify the underlying susceptibility alleles over the past 2 decades, they have met with only limited success. Each year, the field is enriched with nearly 150 additional genetic association studies, each of which either proposes or refutes the existence of certain schizophrenia genes. To facilitate the evaluation and interpretation of these findings, we have recently created a database for genetic association studies in schizophrenia ("SzGene"; available at http://www.szgene.org). In addition to systematically screening the scientific literature for eligible studies, SzGene also reports the results of allele-based meta-analyses for polymorphisms with sufficient genotype data. Currently, these meta-analyses highlight not only over 20 different potential schizophrenia genes, many of which represent the "usual suspects" (eg, various dopamine receptors and neuregulin 1), but also several that were never meta-analyzed previously. All the highlighted loci contain at least one variant showing modest (summary odds ratios approximately 1.20 [range 1.06-1.45]) but nominally significant risk effects. This review discusses some of the strengths and limitations of the SzGene database, which could become a useful bioinformatics tool within the schizophrenia research community.
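
An allele-based meta-analysis of the kind SzGene reports is, at its core, an inverse-variance pooling of per-study log odds ratios. The sketch below assumes a fixed-effect model and invented 2x2 allele counts; the function name is hypothetical, and SzGene's actual pipeline differs in detail.

```python
import math

def pooled_or(studies, z=1.96):
    """Fixed-effect (inverse-variance) pooled odds ratio from 2x2 tables.
    Each study is (a, b, c, d): case risk-allele, case other-allele,
    control risk-allele, control other-allele counts."""
    num = den = 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # variance of the log OR
        weight = 1.0 / var
        num += weight * log_or
        den += weight
    pooled = num / den
    se = math.sqrt(1.0 / den)
    return (math.exp(pooled),                 # summary odds ratio
            math.exp(pooled - z * se),        # lower 95% CI bound
            math.exp(pooled + z * se))        # upper 95% CI bound

# Two hypothetical association studies of the same marker:
summary_or, ci_lo, ci_hi = pooled_or([(20, 80, 10, 90), (25, 75, 12, 88)])
print(round(summary_or, 2), round(ci_lo, 2), round(ci_hi, 2))
```

Pooling identical studies returns each study's own odds ratio with a narrower confidence interval, which is a quick sanity check on the weighting.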

  8. Databases and registers: useful tools for research, no studies.

    PubMed

    Curbelo, Rafael J; Loza, Estíbaliz; de Yébenes, Maria Jesús García; Carmona, Loreto

    2014-04-01

    There are many misunderstandings about databases. Database is a commonly misused term in reference to any set of data entered into a computer. However, true databases serve a main purpose, organising data. They do so by establishing several layers of relationships; databases are hierarchical. Databases commonly organise data over different levels and over time, where time can be measured as the time between visits, or between treatments, or adverse events, etc. In this sense, medical databases are closely related to longitudinal observational studies, as databases allow the introduction of data on the same patient over time. Basically, we could establish four types of databases in medicine, depending on their purpose: (1) administrative databases, (2) clinical databases, (3) registers, and (4) study-oriented databases. But a database is a useful tool for a large variety of studies, not a type of study itself. Different types of databases serve very different purposes, and a clear understanding of the different research designs mentioned in this paper would prevent many of the databases we launch from being just a lot of work and very little science.

  9. Conducting qualitative research in the British Armed Forces: theoretical, analytical and ethical implications.

    PubMed

    Finnegan, Alan

    2014-06-01

    The aim of qualitative research is to produce empirical evidence with data collected through means such as interviews and observation. Qualitative research encourages diversity in the way of thinking and the methods used. Good studies produce a richness of data to provide new knowledge or address extant problems. However, qualitative research resulting in peer review publications within the Defence Medical Services (DMS) is a rarity. This article aims to help redress this balance by offering direction regarding qualitative research in the DMS with a focus on choosing a theoretical framework, analysing the data and ethical approval. Qualitative researchers need an understanding of the paradigms and theories that underpin methodological frameworks, and this article includes an overview of common theories in phenomenology, ethnography and grounded theory, and their application within the military. It explains qualitative coding: the process used to analyse data and shape the analytical framework. A popular four phase approach with examples from an operational nursing research study is presented. Finally, it tackles the issue of ethical approval for qualitative studies and offers direction regarding the research proposal and participant consent. The few qualitative research studies undertaken in the DMS have offered innovative insights into defence healthcare providing information to inform and change educational programmes and clinical practice. This article provides an extra resource for clinicians to encourage studies that will improve the operational capability of the British Armed Forces. It is anticipated that these guidelines are transferable to research in other Armed Forces and the military Veterans population.

  10. Data-infilling in daily mean river flow records: first results using a visual analytics tool (gapIT)

    NASA Astrophysics Data System (ADS)

    Giustarini, Laura; Parisot, Olivier; Ghoniem, Mohammad; Trebs, Ivonne; Médoc, Nicolas; Faber, Olivier; Hostache, Renaud; Matgen, Patrick; Otjacques, Benoît

    2015-04-01

    Missing data in river flow records represent a loss of information and a serious drawback in water management. An incomplete time series prevents the computation of hydrological statistics and indicators. Also, records with data gaps are not suitable as input or validation data for hydrological or hydrodynamic modelling. In this work we present a visual analytics tool (gapIT), which supports experts in finding the most adequate data-infilling technique for daily mean river flow records. The tool performs an automated calculation of river flow estimates using different data-infilling techniques. Donor station(s) are automatically selected based on Dynamic Time Warping, geographical proximity and upstream/downstream relationships. For each gap the tool computes several flow estimates through various data-infilling techniques, including interpolation, multiple regression, regression trees and neural networks. The visual application allows the user to select donor station(s) different from those automatically selected. The gapIT software was applied to 24 daily time series of river discharge recorded in Luxembourg over the period 01/01/2007 - 31/12/2013. The method was validated by randomly creating artificial gaps of different lengths and positions along the entire records. Using the RMSE and the Nash-Sutcliffe (NS) coefficient as performance measures, the method is evaluated based on a comparison with the actual measured discharge values. The application of the gapIT software to artificial gaps led to satisfactory results in terms of performance indicators (NS>0.8 for more than half of the artificial gaps). A case-by-case analysis revealed that the limited number of reconstructed record gaps characterized by high RMSE values (NS<0.8) were caused by the temporary unavailability of the most appropriate donor station.
On the other hand, some of the gaps characterized by a high accuracy of the reconstructed record were filled by using the data from
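
The donor-selection step based on Dynamic Time Warping can be illustrated with a textbook DTW implementation. The series and station names are made up, and gapIT additionally weighs geographical proximity and upstream/downstream relationships, which this sketch omits.

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two flow series
    (absolute-difference cost), used here to rank candidate donors."""
    n, m = len(a), len(b)
    INF = float("inf")
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest warping path: match, insert, or delete.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# Rank made-up donor stations against the gappy target record:
target = [1.0, 2.0, 3.0, 2.0]
donors = {"A": [1.0, 2.0, 3.0, 2.0], "B": [5.0, 5.0, 5.0, 5.0]}
best = min(donors, key=lambda k: dtw_distance(target, donors[k]))
print(best)  # → A
```

The donor with the smallest DTW distance follows the target's flow dynamics most closely even when the two series are slightly shifted in time, which is why DTW is preferred over a plain pointwise distance here.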

  11. Concept Maps as a Research and Evaluation Tool To Assess Conceptual Change in Quantum Physics.

    ERIC Educational Resources Information Center

    Sen, Ahmet Ilhan

    2002-01-01

    Informs teachers about using concept maps as learning and alternative assessment tools in education. Presents research results on how students might use concept maps to communicate their cognitive structure. (Author/KHR)

  12. Tracing the sources of refractory dissolved organic matter in a large artificial lake using multiple analytical tools.

    PubMed

    Nguyen, Hang Vo-Minh; Hur, Jin

    2011-10-01

    Structural and chemical characteristics of refractory dissolved organic matter (RDOM) from seven different sources (algae, leaf litter, reed, compost, field soil, paddy water, treated sewage) were examined using multiple analytical tools, and they were compared with those of RDOM in a large artificial lake (Lake Paldang, Korea). Treated sewage, paddy water, and field soil were distinguished from the other sources investigated by their relatively low specific UV absorbance (SUVA) values and more pronounced fulvic-like versus humic-like fluorescence of the RDOM samples. Microbial derived RDOM from algae and treated sewage showed relatively low apparent molecular weight and a higher fraction of hydrophilic bases relative to the total hydrophilic fraction. For the biopolymer types, the presence of polyhydroxy aromatics with the high abundance of proteins was observed only for vascular plant-based RDOM (i.e., leaf litter and reed). Molecular weight values exhibited positive correlations with the SUVA and the hydrophobic content among the different RDOM, suggesting that hydrophobic and condensed aromatic structures may be the main components of high molecular weight RDOM. Principal component analysis revealed that approximately 77% of the variance in the RDOM characteristics might be explained by the source difference (i.e., terrestrial and microbial derived) and a tendency of further microbial transformation. Combined results demonstrated that the properties of the lake RDOM were largely affected by the upstream sources of field soil, paddy water, and treated sewage, which are characterized by low molecular weight UV-absorbing and non-aromatic structures with relatively high resistance to further degradation.

  13. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    PubMed

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents methods for the definition of important analytical tools, such as the development of sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA), and their first application to evaluate the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanols content were investigated. The juices and wines produced using different protocols were examined. Moreover, wines aged in tanks for 1, 2 and 3 months were analysed. The high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need of exogenous antioxidants, was particularly interesting. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by winemaking and pressing conditions, which required fine tuning of pressing. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and changed according to the winemaking procedure applied. Interestingly, the evolution during the first three months also varied depending on the procedure adopted. Organic synthesis of cysteine and glutathione conjugates was carried out and juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure, but did not show any significant change during the first 3 months of post-bottling ageing. 

  14. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    NASA Astrophysics Data System (ADS)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action - to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how can the teaching of climate change be done in such a way as to ascribe agency - a willingness to act - to students? Framing - as both a theory and an analytic method - has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it is possibly problematic if it were the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse - global scale, scientific statistics and facts, and impact on the Earth's systems - are not likely to inspire action-taking. 

  15. Giving Raw Data a Chance to Talk: A Demonstration of Exploratory Visual Analytics with a Pediatric Research Database Using Microsoft Live Labs Pivot to Promote Cohort Discovery, Research, and Quality Assessment

    PubMed Central

    Viangteeravat, Teeradache; Nagisetty, Naga Satya V. Rao

    2014-01-01

    Secondary use of large and open data sets provides researchers with an opportunity to address high-impact questions that would otherwise be prohibitively expensive and time consuming to study. Despite the availability of data, generating hypotheses from huge data sets is often challenging, and the lack of complex analysis of data might lead to weak hypotheses. To overcome these issues and to assist researchers in building hypotheses from raw data, we are working on a visual and analytical platform called PRD Pivot. PRD Pivot is a de-identified pediatric research database designed to make secondary use of rich data sources, such as the electronic health record (EHR). The development of visual analytics using Microsoft Live Labs Pivot makes the process of data elaboration, information gathering, knowledge generation, and complex information exploration transparent to tool users and provides researchers with the ability to sort and filter by various criteria, which can lead to strong, novel hypotheses. PMID:24808811

  16. Giving raw data a chance to talk: a demonstration of exploratory visual analytics with a pediatric research database using Microsoft Live Labs Pivot to promote cohort discovery, research, and quality assessment.

    PubMed

    Viangteeravat, Teeradache; Nagisetty, Naga Satya V Rao

    2014-01-01

    Secondary use of large and open data sets provides researchers with an opportunity to address high-impact questions that would otherwise be prohibitively expensive and time consuming to study. Despite the availability of data, generating hypotheses from huge data sets is often challenging, and the lack of complex analysis of data might lead to weak hypotheses. To overcome these issues and to assist researchers in building hypotheses from raw data, we are working on a visual and analytical platform called PRD Pivot. PRD Pivot is a de-identified pediatric research database designed to make secondary use of rich data sources, such as the electronic health record (EHR). The development of visual analytics using Microsoft Live Labs Pivot makes the process of data elaboration, information gathering, knowledge generation, and complex information exploration transparent to tool users and provides researchers with the ability to sort and filter by various criteria, which can lead to strong, novel hypotheses.

  17. Role of nuclear analytical probe techniques in biological trace element research

    SciTech Connect

    Jones, K.W.; Pounds, J.G.

    1985-01-01

    Many biomedical experiments require the qualitative and quantitative localization of trace elements with high sensitivity and good spatial resolution. The feasibility of measuring the chemical form of the elements, the time course of trace elements metabolism, and of conducting experiments in living biological systems are also important requirements for biological trace element research. Nuclear analytical techniques that employ ion or photon beams have grown in importance in the past decade and have led to several new experimental approaches. Some of the important features of these methods are reviewed here along with their role in trace element research, and examples of their use are given to illustrate potential for new research directions. It is emphasized that the effective application of these methods necessitates a closely integrated multidisciplinary scientific team. 21 refs., 4 figs., 1 tab.

  18. Research subjects for analytical estimation of core degradation at Fukushima-Daiichi nuclear power plant

    SciTech Connect

    Nagase, F.; Ishikawa, J.; Kurata, M.; Yoshida, H.; Kaji, Y.; Shibamoto, Y.; Amaya, M.; Okumura, K.; Katsuyama, J.

    2013-07-01

    Estimation of the accident progression and of the conditions inside the reactor pressure vessels (RPV) and primary containment vessels (PCV) is required for the appropriate conduct of decommissioning at the Fukushima-Daiichi NPP. This requires additional experimental data and revised models so that estimates using computer codes can be made with increased accuracy. The Japan Atomic Energy Agency (JAEA) has selected phenomena to be reviewed and developed, considering previously obtained information, conditions specific to the Fukushima-Daiichi NPP accident, and recent progress in experimental and analytical technologies. As a result, research and development items have been identified in terms of thermal-hydraulic behavior in the RPV and PCV, progression of fuel bundle degradation, failure of the lower head of the RPV, and analysis of the accident. This paper introduces the selected phenomena to be reviewed and developed, the research plans, and recent results from JAEA's corresponding research programs. (authors)

  19. The MOOC and Learning Analytics Innovation Cycle (MOLAC): A Reflective Summary of Ongoing Research and Its Challenges

    ERIC Educational Resources Information Center

    Drachsler, H.; Kalz, M.

    2016-01-01

    The article deals with the interplay between learning analytics and massive open online courses (MOOCs) and provides a conceptual framework to situate ongoing research in the MOOC and learning analytics innovation cycle (MOLAC framework). The MOLAC framework is organized on three levels: On the micro-level, the data collection and analytics…

  20. SmartR: An open-source platform for interactive visual analytics for translational research data.

    PubMed

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa Da Silva, Adriano; Schneider, Reinhard

    2017-03-09

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types such as clinical, pre-clinical, or OMICS data, combined with strong visual analytical capabilities, will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that not only equips the platform with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements.

  1. Dynamic Visual Acuity: a Functionally Relevant Research Tool

    NASA Technical Reports Server (NTRS)

    Peters, Brian T.; Brady, Rachel A.; Miller, Chris A.; Mulavara, Ajitkumar P.; Wood, Scott J.; Cohen, Helen S.; Bloomberg, Jacob J.

    2010-01-01

    Coordinated movements between the eyes and head are required to maintain a stable retinal image during head and body motion. The vestibulo-ocular reflex (VOR) plays a significant role in this gaze control system that functions well for most daily activities. However, certain environmental conditions or interruptions in normal VOR function can lead to inadequate ocular compensation, resulting in oscillopsia, or blurred vision. It is therefore possible to use acuity to determine when the environmental conditions, VOR function, or the combination of the two is not conducive to maintaining clear vision. Over several years we have designed and evaluated several tests of dynamic visual acuity (DVA). Early tests used the difference between standing and walking acuity to assess decrements in the gaze stabilization system after spaceflight. Supporting ground-based studies measured the responses from patients with bilateral vestibular dysfunction and explored the effects of visual target viewing distance and gait cycle events on walking acuity. Results from these studies show that DVA is affected by spaceflight, is degraded in patients with vestibular dysfunction, changes with target distance, and is not consistent across the gait cycle. We have recently expanded our research to include studies in which seated subjects are translated or rotated passively. Preliminary results from this work indicate that gaze stabilization ability may differ between similar active and passive conditions, may change with age, and can be affected by the location of the visual target with respect to the axis of motion. Use of DVA as a diagnostic tool is becoming more popular, but the functional nature of the acuity outcome measure also makes it ideal for identifying conditions that could lead to degraded vision. By doing so, steps can be taken to alter the problematic environments to improve the man-machine interface and optimize performance.

  2. Research Tools, Tips, and Resources for Financial Aid Administrators. Monograph, A NASFAA Series.

    ERIC Educational Resources Information Center

    Mohning, David D.; Redd, Kenneth E.; Simmons, Barry W., Sr.

    This monograph provides research tools, tips, and resources to financial aid administrators who need to undertake research tasks. It answers: What is research? How can financial aid administrators get started on research projects? What resources are available to help answer research questions quickly and accurately? How can research efforts assist…

  3. Bioanalysis and Analytical Services Research Group at The Municipal Institute for Medical Research IMIM-Hospital del Mar, Spain.

    PubMed

    Segura, Jordi; Pascual, José A; Ventura, Rosa; Gutiérrez-Gallego, Ricardo

    2009-11-01

    Analytical laboratories involved in health-related research are becoming a fundamental part of the advancement of science in this field. Of particular interest to clinical, legal, toxicological, forensic and environmental matters is the analysis of drugs and medications present in biological fluids of consumers or exposed subjects. The established sensitive and reliable work of sports drug-testing laboratories represents an interesting example of a multidisciplinary approach toward widespread bioanalytical problems. The experiences reported in this article will be of general interest, especially for analysts studying the determination of substances in biological material.

  4. Researcher Effects on Mortality Salience Research: A Meta-Analytic Moderator Analysis

    ERIC Educational Resources Information Center

    Yen, Chih-Long; Cheng, Chung-Ping

    2013-01-01

    A recent meta-analysis of 164 terror management theory (TMT) papers indicated that mortality salience (MS) yields substantial effects (r = 0.35) on worldview and self-esteem-related dependent variables (B. L. Burke, A. Martens, & E. H. Faucher, 2010). This study reanalyzed the data to explore the researcher effects of TMT. By cluster-analyzing…

  5. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.

  6. MEETING TODAY'S EMERGING CONTAMINANTS WITH TOMORROW'S RESEARCH TOOL

    EPA Science Inventory

    This presentation will explore the many facets of research and development for emerging contaminants within the USEPA's National Exposure Research Laboratories (Athens, Cincinnati, Las Vegas, and Research Triangle Park).

  7. Applied analytical combustion/emissions research at the NASA Lewis Research Center

    NASA Astrophysics Data System (ADS)

    Deur, J. M.; Kundu, K. P.; Nguyen, H. L.

    1992-07-01

    Emissions of pollutants from future commercial transports are a significant concern. As a result, the Lewis Research Center (LeRC) is investigating various low emission combustor technologies. As part of this effort, a combustor analysis code development program was pursued to guide the combustor design process, to identify concepts having the greatest promise, and to optimize them at the lowest cost in the minimum time.

  8. Applied analytical combustion/emissions research at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Deur, J. M.; Kundu, K. P.; Nguyen, H. L.

    1992-01-01

    Emissions of pollutants from future commercial transports are a significant concern. As a result, the Lewis Research Center (LeRC) is investigating various low emission combustor technologies. As part of this effort, a combustor analysis code development program was pursued to guide the combustor design process, to identify concepts having the greatest promise, and to optimize them at the lowest cost in the minimum time.

  9. Analytical and physical modeling program for the NASA Lewis Research Center's Altitude Wind Tunnel (AWT)

    NASA Technical Reports Server (NTRS)

    Abbott, J. M.; Deidrich, J. H.; Groeneweg, J. F.; Povinelli, L. A.; Reid, L.; Reinmann, J. J.; Szuch, J. R.

    1985-01-01

    An effort is currently underway at the NASA Lewis Research Center to rehabilitate and extend the capabilities of the Altitude Wind Tunnel (AWT). This extended capability will include a maximum test section Mach number of about 0.9 at an altitude of 55,000 ft and a -20 F stagnation temperature (octagonal test section, 20 ft across the flats). In addition, the AWT will include an icing and acoustic research capability. In order to ensure a technically sound design, an AWT modeling program (both analytical and physical) was initiated to provide essential input to the AWT final design process. This paper describes the modeling program, including the rationale and criteria used in program definition, and presents some early program results.

  10. On-line near infrared spectroscopy as a Process Analytical Technology (PAT) tool to control an industrial seeded API crystallization.

    PubMed

    Schaefer, C; Lecomte, C; Clicq, D; Merschaert, A; Norrant, E; Fotiadu, F

    2013-09-01

    The final step of an active pharmaceutical ingredient (API) manufacturing synthesis process consists of a crystallization during which the API and residual solvent contents have to be quantified precisely in order to reach a predefined seeding point. A feasibility study was conducted to demonstrate the suitability of on-line NIR spectroscopy to control this step in line with the new version of the European Medicines Agency (EMA) guideline [1]. A quantitative method was developed at laboratory scale using statistical design of experiments (DOE) and multivariate data analysis such as principal component analysis (PCA) and partial least squares (PLS) regression. NIR models were built to quantify the API in the range of 9-12% (w/w) and to quantify the residual methanol in the range of 0-3% (w/w). To improve the predictive ability of the models, the development procedure encompassed outlier elimination, optimum model rank definition, and spectral range and spectral pre-treatment selection. Conventional criteria, such as the number of PLS factors, R(2), and the root mean square errors of calibration, cross-validation and prediction (RMSEC, RMSECV, RMSEP), enabled the selection of three model candidates. These models were tested in the industrial pilot plant during three technical campaigns. Results of the most suitable models were evaluated against the chromatographic reference methods. A maximum relative bias of 2.88% was obtained with respect to the target API content. Absolute biases of 0.01 and 0.02% (w/w), respectively, were achieved at methanol content levels of 0.10 and 0.13% (w/w). The repeatability was assessed as sufficient for the on-line monitoring of the two analytes. The present feasibility study confirmed the possibility of using on-line NIR spectroscopy as a PAT tool to monitor in real time both the API and the residual methanol contents, in order to control the seeding of an API crystallization at industrial scale.
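    A single-latent-variable PLS1 calibration of the kind described above can be sketched in a few lines; the spectra, concentration range, and noise level here are synthetic assumptions, not the study's NIR data:

    ```python
    # Minimal single-component PLS1 (NIPALS-style) calibration sketch for
    # quantifying an analyte from spectra; all data are synthetic.
    import numpy as np

    rng = np.random.default_rng(1)
    n, p = 20, 50
    y = rng.uniform(9.0, 12.0, size=n)             # analyte content, % w/w
    pure = rng.normal(size=p)                      # hypothetical pure-component spectrum
    X = np.outer(y, pure) + 0.01 * rng.normal(size=(n, p))   # Beer-Lambert-like mixtures

    Xc, yc = X - X.mean(axis=0), y - y.mean()      # mean-center
    w = Xc.T @ yc
    w /= np.linalg.norm(w)                         # PLS weight vector
    t = Xc @ w                                     # latent-variable scores
    b = (t @ yc) / (t @ t)                         # inner regression coefficient

    y_hat = t * b + y.mean()
    rmsec = np.sqrt(np.mean((y - y_hat) ** 2))     # root mean square error of calibration
    print(round(float(rmsec), 4))
    ```

    Model-rank selection (how many such components to keep) would then compare RMSECV across candidate ranks, as the abstract describes.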
Furthermore, the successful scale-up of the method demonstrated its capability.

  11. Variance decomposition: a tool enabling strategic improvement of the precision of analytical recovery and concentration estimates associated with microorganism enumeration methods.

    PubMed

    Schmidt, P J; Emelko, M B; Thompson, M E

    2014-05-15

    Concentrations of particular types of microorganisms are commonly measured in various waters, yet the accuracy and precision of reported microorganism concentration values are often questioned due to the imperfect analytical recovery of quantitative microbiological methods and the considerable variation among fully replicated measurements. The random error in analytical recovery estimates and unbiased concentration estimates may be attributable to several sources, and knowing the relative contribution from each source can facilitate strategic design of experiments to yield more precise data or provide an acceptable level of information with fewer data. Herein, variance decomposition using the law of total variance is applied to previously published probabilistic models to explore the relative contributions of various sources of random error and to develop tools to aid experimental design. This work focuses upon enumeration-based methods with imperfect analytical recovery (such as enumeration of Cryptosporidium oocysts), but the results also yield insights about plating methods and microbial methods in general. Using two hypothetical analytical recovery profiles, the variance decomposition method is used to explore 1) the design of an experiment to quantify variation in analytical recovery (including the size and precision of seeding suspensions and the number of samples), and 2) the design of an experiment to estimate a single microorganism concentration (including sample volume, effects of improving analytical recovery, and replication). In one illustrative example, a strategically designed analytical recovery experiment with 6 seeded samples would provide as much information as an alternative experiment with 15 seeded samples. Several examples of diminishing returns are illustrated to show that efforts to reduce error in analytical recovery and concentration estimates can have negligible effect if they are directed at trivial error sources.
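    The law of total variance underlying this decomposition can be illustrated with a Monte Carlo sketch; the Poisson/binomial count model and beta-distributed recovery below are illustrative assumptions in the spirit of the models discussed, not the authors' exact formulation:

    ```python
    # Monte Carlo sketch of variance decomposition by the law of total variance:
    # Var(X) = E[Var(X | R)] + Var(E[X | R]), where X is an observed count and
    # R the (random) analytical recovery. Parameter values are illustrative.
    import numpy as np

    rng = np.random.default_rng(2)
    lam = 100.0                                # expected organisms per sample
    n = 200_000
    R = rng.beta(8, 2, size=n)                 # variable analytical recovery
    N = rng.poisson(lam, size=n)               # true count per sample
    X = rng.binomial(N, R)                     # observed (recovered) count

    # Given R, the thinned count is X | R ~ Poisson(lam * R), so
    # E[X | R] = Var(X | R) = lam * R.
    within = float(np.mean(lam * R))           # E[Var(X | R)]: counting error
    between = float(np.var(lam * R))           # Var(E[X | R]): recovery variation

    print(round(float(np.var(X)), 1), round(within + between, 1))  # nearly agree
    ```

    Comparing the two terms shows which error source dominates, which is exactly the information the paper uses to direct experimental effort.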

  12. Accelerator mass spectrometry as a bioanalytical tool for nutritional research

    SciTech Connect

    Vogel, J.S.; Turteltaub, K.W.

    1997-09-01

    Accelerator Mass Spectrometry is a mass spectrometric method of detecting long-lived radioisotopes without regard to their decay products or half-life. The technique is normally applied to geochronology, but recently has been developed for bioanalytical tracing. AMS detects isotope concentrations to parts per quadrillion, quantifying labeled biochemicals to attomole levels in milligram-sized samples. Its advantages over non-isotopic and stable isotope labeling methods are reviewed, and examples of analytical integrity, sensitivity, specificity, and applicability are provided.

  13. Typology of Analytical Errors in Qualitative Educational Research: An Analysis of the 2003-2007 Education Science Dissertations in Turkey

    ERIC Educational Resources Information Center

    Karadag, Engin

    2010-01-01

    This research sought to assess the quality of the qualitative research designs used, and to identify the analytical errors made, in doctoral dissertations carried out in the field of education science in Turkey. A case study design was applied in the study, in which qualitative research techniques were used. The universe…

  14. "This Ain't the Projects": A Researcher's Reflections on the Local Appropriateness of Our Research Tools

    ERIC Educational Resources Information Center

    Martinez, Danny C.

    2016-01-01

    In this article I examine the ways in which Black and Latina/o urban high school youth pressed me to reflexively examine my positionality and that of my research tools during a year-long ethnographic study documenting their communicative repertoires. I reflect on youth comments on my researcher tools, as well as myself, in order to wrestle with…

  15. Improving Students' Understanding of Quantum Measurement. II. Development of Research-Based Learning Tools

    ERIC Educational Resources Information Center

    Zhu, Guangtian; Singh, Chandralekha

    2012-01-01

    We describe the development and implementation of research-based learning tools such as the Quantum Interactive Learning Tutorials and peer-instruction tools to reduce students' common difficulties with issues related to measurement in quantum mechanics. A preliminary evaluation shows that these learning tools are effective in improving students'…

  16. Isotope pattern deconvolution as rising tool for isotope tracer studies in environmental research

    NASA Astrophysics Data System (ADS)

    Irrgeher, Johanna; Zitek, Andreas; Prohaska, Thomas

    2014-05-01

    During the last decade stable isotope tracers have emerged as a versatile tool in ecological research. Besides 'intrinsic' isotope tracers arising from the natural variation of isotopes, the intentional introduction of 'extrinsic' enriched stable isotope tracers into biological systems has gained significant interest. The induced change in the natural isotopic composition of an element allows, among other things, for studying the fate and fluxes of metals, trace elements, and species in organisms, or provides an intrinsic marker or tag for particular biological samples. Given the broad potential of this methodology, the number of publications dealing with applications of isotope (double) spikes as tracers to address research questions in 'real world systems' is constantly increasing. However, some isotope systems, like the natural Sr isotopic system, although potentially very powerful for this type of application, are still rarely used, mainly because their adequate measurement/determination poses major analytical challenges, e.g. because Sr is present in significant amounts in natural samples. In addition, biological systems are subject to complex processes such as metabolism, adsorption/desorption, or oxidation/reduction. As a consequence, classic evaluation approaches such as the isotope dilution mass spectrometry equation are often not applicable because the amount of tracer finally present in the sample is unknown. Isotope pattern deconvolution (IPD), based on multiple linear regression, serves as a simplified alternative data processing strategy to double spike isotope dilution calculations. The outstanding advantage of this mathematical tool lies in the possibility of deconvolving the isotope pattern in a spiked sample without knowing the quantities of enriched isotope tracer incorporated into the natural sample matrix, as well as the degree of impurities and species-interconversion (e.g. from sample preparation). Here, the potential of IPD for environmental tracer
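    The core of IPD as a multiple linear regression can be sketched as follows; the isotope patterns and mixing fraction below are hypothetical values, not measured Sr data:

    ```python
    # Sketch of isotope pattern deconvolution (IPD): the measured isotope
    # pattern is modeled as a linear blend of a natural pattern and an
    # enriched-spike pattern, solved by least squares. Values are illustrative.
    import numpy as np

    # Hypothetical relative abundances of four isotopes for each end-member.
    natural = np.array([0.0056, 0.0986, 0.0700, 0.8258])   # natural-like pattern
    spike = np.array([0.0010, 0.0020, 0.9500, 0.0470])     # enriched-tracer pattern

    # Simulated measurement: 70% natural + 30% spike, plus small noise.
    A = np.column_stack([natural, spike])
    measured = A @ np.array([0.7, 0.3])
    measured = measured + 1e-4 * np.random.default_rng(3).normal(size=4)

    # Multiple linear regression recovers the molar fractions without
    # knowing the absolute tracer quantity in the sample.
    frac, *_ = np.linalg.lstsq(A, measured, rcond=None)
    frac /= frac.sum()                        # normalize to fractions
    print(np.round(frac, 3))                  # close to [0.7, 0.3]
    ```

    The same regression extends to more end-members (e.g. double spikes, impurity patterns) by adding columns to `A`.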

  17. [Research on infrared safety protection system for machine tool].

    PubMed

    Zhang, Shuan-Ji; Zhang, Zhi-Ling; Yan, Hui-Ying; Wang, Song-De

    2008-04-01

    In order to ensure personal safety and prevent injury accidents during machine tool operation, an infrared machine tool safety system was designed with an infrared transmitting-receiving module, a memory self-locking relay, and a voice recording-playing module. As long as the operator does not enter the danger area, the system does not respond. Once the operator's body, in whole or in part, enters the danger area and blocks the infrared beam, the system sounds an alarm, outputs a control signal to the machine tool's executive element, and at the same time brings the machine tool to an emergency stop to prevent equipment damage and personal injury. The system has a modular framework and many advantages, including safety, reliability, general applicability, circuit simplicity, convenient maintenance, low power consumption, low cost, stable operation, easy debugging, vibration resistance, and interference resistance. It is suitable for installation and use in different machine tools such as punch presses, injection molding machines, numerically controlled machine tools, plate cutting machines, pipe bending machines, and hydraulic presses.

  18. Analytical combustion/emissions research related to the NASA high-speed research program

    NASA Technical Reports Server (NTRS)

    Nguyen, H. Lee

    1991-01-01

    Increasing the pressure and temperature of the engines of new generation supersonic airliners increases the emissions of nitrogen oxides to a level that would have an adverse impact on the Earth's protective ozone layer. In the process of implementing low emissions combustor technologies, NASA Lewis Research Center has pursued a combustion analysis program to guide combustor design processes, to identify potential concepts of greatest promise, and to optimize them at low cost, with short turn-around time. The approach is to upgrade and apply advanced computer programs for gas turbine applications. Efforts have been made to improve the code capabilities of modeling the physics. Test cases and experiments are used for code validation. To provide insight into the combustion process and combustor design, two-dimensional and three-dimensional codes such as KIVA-II and LeRC 3D have been used. These codes are operational and calculations have been performed to guide low emissions combustion experiments.

  19. Introduction to tools and techniques for ceramide-centered research.

    PubMed

    Kitatani, Kazuyuki; Luberto, Chiara

    2010-01-01

    Sphingolipids are important components of eukaryotic cells, many of which function as bioactive signaling molecules. As thoroughly discussed elsewhere in this volume, ceramide, the central metabolite of the sphingolipid pathway, plays key roles in a variety of cellular responses. Since the discovery of the bioactive function of ceramide, a growing number of tools and techniques have been and still are being developed in order to better decipher the complexity and implications of ceramide-mediated signaling. With this chapter it is our intention to provide newcomers to the sphingolipid arena with a short overview of tools and techniques currently available for the study of sphingolipid metabolism, with a focus on ceramide.

  20. Examination of the capability of the laser-induced breakdown spectroscopy (LIBS) technique as the emerging laser-based analytical tool for analyzing trace elements in coal

    NASA Astrophysics Data System (ADS)

    Idris, N.; Ramli, M.; Mahidin; Hedwig, R.; Lie, Z. S.; Kurniawan, K. H.

    2014-09-01

    Owing to its advantages over conventional analytical tools, laser-induced breakdown spectroscopy (LIBS) is emerging as a powerful analytical technique. The technique uses the optical emission from a laser-induced plasma for spectrochemical analysis of the constituents and content of the sampled object. Its capability is examined here for the analysis of trace elements in coal, a sample that is difficult to analyze because of its complex chemical composition and physical properties. Coal inherently contains trace elements, including heavy metals, so its mining, beneficiation, and utilization pose hazards to the environment and to human beings. The LIBS apparatus comprised a laser system (Nd-YAG: Quanta Ray; LAB SERIES; 1,064 nm; 500 mJ; 8 ns) and an optical detector (McPherson model 2061; 1,000 mm focal length; f/8.6 Czerny-Turner) equipped with an Andor I*Star intensified CCD of 1024×256 pixels. The laser beam was focused onto the coal sample with a +250 mm focusing lens. The plasma emission was collected by a fiber optic and sent to the spectrograph. The coal samples were taken from the Province of Aceh. Several trace elements, including heavy metals (As, Mn, Pb), were clearly observed, demonstrating the potential of the LIBS technique for analyzing trace elements in coal.

  1. Practical library research: a tool for effective library management.

    PubMed

    Schneider, E; Mankin, C J; Bastille, J D

    1995-01-01

    Librarians are being urged to conduct research as one of their professional responsibilities. Many librarians, however, avoid research because they believe it is beyond their capabilities or resources. This paper discusses the importance of conducting applied research, that is, research directed toward solving practical problems. The paper describes how one library conducted practical research projects, including use studies and surveys, over an eighteen-year period. These projects produced objective data that the library used to make management decisions that benefited both the library and its parent institution. This paper encourages other librarians to conduct practical research projects and to share the results with their colleagues through publication in the professional literature.

  2. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    PubMed

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, and the notebook can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system for investigating gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research.
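    The notebook analyses described above begin with spike detection before spike-train statistics. A minimal, stdlib-only Python sketch of that first step is shown below; the synthetic signal, threshold value, and refractory window are illustrative assumptions, not the authors' actual data or parameters.

    ```python
    import random

    def detect_spikes(signal, threshold, refractory=5):
        """Return sample indices where the signal crosses the threshold
        upward, enforcing a refractory gap (in samples) between detections."""
        spikes, last = [], -refractory
        for i in range(1, len(signal)):
            if signal[i - 1] < threshold <= signal[i] and i - last >= refractory:
                spikes.append(i)
                last = i
        return spikes

    def mean_firing_rate(spike_indices, n_samples, fs):
        """Average spikes per second over the whole recording."""
        return len(spike_indices) * fs / n_samples

    # Synthetic trace: Gaussian noise with a spike injected every 100 samples.
    random.seed(0)
    fs = 1000  # assumed sampling rate, Hz
    trace = [random.gauss(0.0, 0.1) for _ in range(1000)]
    for k in range(50, 1000, 100):
        trace[k] += 1.0

    spikes = detect_spikes(trace, threshold=0.5)
    print(len(spikes), mean_firing_rate(spikes, len(trace), fs))
    ```

    In a Jupyter notebook, each of these steps would sit in its own cell alongside markdown documentation, which is what makes the workflow shareable and reproducible.
    
    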

  3. PARAMO: A Parallel Predictive Modeling Platform for Healthcare Analytic Research using Electronic Health Records

    PubMed Central

    Ng, Kenney; Ghoting, Amol; Steinhubl, Steven R.; Stewart, Walter F.; Malin, Bradley; Sun, Jimeng

    2014-01-01

    Objective Healthcare analytics research increasingly involves the construction of predictive models for disease targets across varying patient cohorts using electronic health records (EHRs). To facilitate this process, it is critical to support a pipeline of tasks: 1) cohort construction, 2) feature construction, 3) cross-validation, 4) feature selection, and 5) classification. To develop an appropriate model, it is necessary to compare and refine models derived from a diversity of cohorts, patient-specific features, and statistical frameworks. The goal of this work is to develop and evaluate a predictive modeling platform that can be used to simplify and expedite this process for health data. Methods To support this goal, we developed a PARAllel predictive MOdeling (PARAMO) platform which 1) constructs a dependency graph of tasks from specifications of predictive modeling pipelines, 2) schedules the tasks in a topological ordering of the graph, and 3) executes those tasks in parallel. We implemented this platform using Map-Reduce to enable independent tasks to run in parallel in a cluster computing environment. Different task scheduling preferences are also supported. Results We assess the performance of PARAMO on various workloads using three datasets derived from the EHR systems in place at Geisinger Health System and Vanderbilt University Medical Center and an anonymous longitudinal claims database. We demonstrate significant gains in computational efficiency against a standard approach. In particular, PARAMO can build 800 different models on a 300,000 patient data set in 3 hours in parallel compared to 9 days if running sequentially. Conclusion This work demonstrates that an efficient parallel predictive modeling platform can be developed for EHR data. This platform can facilitate large-scale modeling endeavors and speed-up the research workflow and reuse of health information. This platform is only a first step and provides the foundation for our ultimate
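    The platform's core idea, building a dependency graph of pipeline tasks, ordering it topologically, and running independent tasks in parallel, can be sketched with Python's standard library. The five-stage pipeline and task names below are illustrative assumptions, not PARAMO's actual implementation.

    ```python
    from concurrent.futures import ThreadPoolExecutor
    from graphlib import TopologicalSorter

    # Toy dependency graph: each task maps to the set of tasks it depends on.
    graph = {
        "cohort": set(),
        "features": {"cohort"},
        "cv_split": {"cohort"},
        "feature_select": {"features", "cv_split"},
        "classify": {"feature_select"},
    }

    def run_task(name):
        # Placeholder for real work (cohort queries, feature extraction, model fits).
        return f"{name}:done"

    def run_pipeline(graph, max_workers=4):
        """Execute tasks in topological order, dispatching ready tasks in parallel."""
        ts = TopologicalSorter(graph)
        ts.prepare()
        results = {}
        with ThreadPoolExecutor(max_workers=max_workers) as pool:
            while ts.is_active():
                ready = list(ts.get_ready())          # all tasks with deps satisfied
                for name, res in zip(ready, pool.map(run_task, ready)):
                    results[name] = res
                    ts.done(name)                     # unblock dependents
        return results

    results = run_pipeline(graph)
    print(results["classify"])
    ```

    The same structure scales to many models by adding one subgraph per pipeline; in a cluster setting the thread pool would be replaced by a distributed executor such as Map-Reduce jobs.
    
    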

  4. Spatial and Temporal Oxygen Dynamics in Macrofaunal Burrows in Sediments: A Review of Analytical Tools and Observational Evidence

    PubMed Central

    Satoh, Hisashi; Okabe, Satoshi

    2013-01-01

    The availability of benthic O2 plays a crucial role in benthic microbial communities and regulates many important biogeochemical processes. Burrowing activities of macrobenthos in the sediment significantly affect O2 distribution and its spatial and temporal dynamics in burrows, followed by alterations of sediment microbiology. Consequently, numerous research groups have investigated O2 dynamics in macrofaunal burrows. The introduction of powerful tools, such as microsensors and planar optodes, to sediment analysis has greatly enhanced our ability to measure O2 dynamics in burrows at high spatial and temporal resolution with minimal disturbance of the physical structure of the sediment. In this review, we summarize recent studies of O2-concentration measurements in burrows with O2 microsensors and O2 planar optodes. This manuscript mainly focuses on the fundamentals of O2 microsensors and O2 planar optodes, and their application in the direct measurement of the spatial and temporal dynamics of O2 concentrations in burrows, which have not previously been reviewed, and will be a useful supplement to recent literature reviews on O2 dynamics in macrofaunal burrows. PMID:23594972

  5. Benchtop-NMR and MRI--a new analytical tool in drug delivery research.

    PubMed

    Metz, Hendrik; Mäder, Karsten

    2008-12-08

    During the last years, NMR spectroscopy and NMR imaging (magnetic resonance imaging, MRI) have been increasingly used to monitor drug delivery systems in vitro and in vivo. However, the high installation and running costs of the commonly used superconducting magnet technology limit the application range and prevent the further spread of this non-invasive technology. Benchtop-NMR (BT-NMR) relaxometry uses permanent magnets and is much less cost intensive. BT-NMR relaxometry is commonly used in the food and chemical industries, but so far scarcely in the pharmaceutical field. The paper shows with several examples that the application field of BT-NMR relaxometry can be extended into drug delivery, including the characterisation of emulsions and lipid ingredients (e.g. the amount and physicochemical state of the lipid) and the monitoring of adsorption characteristics (e.g. oil binding of porous ingredients). The most exciting possibilities of BT-NMR technology are linked with the new development of BT-instruments with imaging capability. BT-MRI examples on the monitoring of hydration and swelling of HPMC-based monolayer and double-layer tablets are shown. BT-MRI opens new opportunities for the non-invasive monitoring of drug delivery processes.

  6. Specially Made for Science: Researchers Develop Online Tools For Collaborations

    ERIC Educational Resources Information Center

    Guterman, Lila

    2008-01-01

    Blogs, wikis, and social-networking sites such as Facebook may get media buzz these days, but for scientists, engineers, and doctors, they are not even on the radar. The most effective tools of the Internet for such people tend to be efforts more narrowly aimed at their needs, such as software that helps geneticists replicate one another's…

  7. Preservice Teachers as Researchers: Using Ethnographic Tools To Interpret Practice.

    ERIC Educational Resources Information Center

    Christensen, Lois McFadyen

    The structures of meaning preservice teachers perceived and interpreted as a result of field placements in a methods course and through the use of ethnographic tools were studied in an ethnographic design. The study involved 11 preservice teachers. It described how they shaped each other's thinking about teaching and it examined how ethnographic…

  8. Chaos Modeling: Increasing Educational Researchers' Awareness of a New Tool.

    ERIC Educational Resources Information Center

    Bobner, Ronald F.; And Others

    Chaos theory is being used as a tool to study a wide variety of phenomena. It is a philosophical and empirical approach that attempts to explain relationships previously thought to be totally random. Although some relationships are truly random, many data appear to be random but reveal repeatable patterns of behavior under further investigation.…

  9. Exploiting the Brachypodium Tool Box in cereal and grass research

    Technology Transfer Automated Retrieval System (TEKTRAN)

    It is now a decade since Brachypodium distachyon was suggested as a model species for temperate grasses and cereals. Since then transformation protocols, large expressed sequence tag (EST) populations, tools for forward and reverse genetic screens, highly refined cytogenetic probes, germplasm coll...

  10. "Mythbusters": A Tool for Teaching Research Methods in Psychology

    ERIC Educational Resources Information Center

    Burkley, Edward; Burkley, Melissa

    2009-01-01

    "Mythbusters" uses multiple research methods to test interesting topics, offering research methods students an entertaining review of course material. To test the effectiveness of "Mythbusters" clips in a psychology research methods course, we systematically selected and showed 4 clips. Students answered questions about the clips, offered their…

  11. Tools for Monitoring Social Media: A Marketing Research Project

    ERIC Educational Resources Information Center

    Veeck, Ann; Hoger, Beth

    2014-01-01

    Knowledge of how to effectively monitor social media is an increasingly valued marketing research skill. This study tests an approach for adding social media content to an undergraduate marketing research class team project. The revised project maintains the expected objectives and parameters of a traditional research project, while integrating…

  12. Heuristics Diagrams as a Tool to Formatively Assess Teachers' Research

    ERIC Educational Resources Information Center

    Chamizo, J. A.; Garcia-Franco, A.

    2013-01-01

    Many teacher education programs include different forms of teachers doing research. Be it in the form of action research or general inquiries about their practice, it has been argued that when teachers do research on their own practice, they are able to take a more reflective stance towards their work which is necessary to bring about educational…

  13. Research Tool Patents--Rumours of their Death are Greatly Exaggerated

    ERIC Educational Resources Information Center

    Carroll, Peter G.; Roberts, John S.

    2006-01-01

    Using a patented drug during clinical trials is not infringement [35 U.S.C. 271(e)(1)]. Merck v Integra enlarged this "safe harbour" to accommodate preclinical use of drugs and patented "research tools" if "reasonably related" to FDA approval. The decision allowed lower courts, should they wish, to find any use of a research tool, except for…

  14. Analytical performance and comparability of the determination of cholesterol by 12 Lipid-Research Clinics.

    PubMed

    Lippel, K; Ahmed, S; Albers, J J; Bachorik, P; Cooper, G; Helms, R; Williams, J

    1977-09-01

    Twelve Lipid-Research Clinic laboratories performed automated cholesterol analyses on four control-serum pools of known cholesterol concentration, using the Liebermann-Burchard reaction. The analyses were done during a two-year period, with the same standards, methodology, and quality-control procedures. Estimates of analytical bias, variability, and short- and long-term trends for each instrument and for the entire group of LRC instruments are presented. High accuracy, precision, and interlaboratory comparability were achieved through rigorous standardization and control of the entire analytical procedure. Individual laboratory biases averaged from 0.5 to 2.0% below Abell-Kendall reference values. Between-run variability was about equal to within-run variability, and interlaboratory variation was substantially less than intralaboratory variation. The total standard deviation for all instruments was about 0.04 g/liter; only 8-15% of this variation was due to differences between instruments. The between-instrument standard deviation ranged from 0.011 to 0.015 g/liter; the between-run, within-instrument standard deviation from 0.023 to 0.030 g/liter; and the within-run standard deviation from 0.023 to 0.028 g/liter. The significance of these results for long-term collaborative studies is discussed.
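    The within-run and between-run components reported above come from a standard nested variance decomposition: replicate analyses within a run estimate the within-run variance, and the spread of run means, after subtracting its share of within-run noise, estimates the between-run component. A minimal stdlib-only sketch, with hypothetical control-pool values (g/liter) rather than the study's actual data:

    ```python
    import statistics

    def variance_components(runs):
        """One-way random-effects decomposition for a balanced design.

        `runs` is a list of runs, each a list of replicate measurements.
        Returns (within-run SD, between-run SD)."""
        n = len(runs[0])  # replicates per run (assumed equal across runs)
        within_var = statistics.mean(statistics.variance(r) for r in runs)
        run_means = [statistics.mean(r) for r in runs]
        # Variance of run means contains within-run noise shrunk by a factor n.
        between_var = max(statistics.variance(run_means) - within_var / n, 0.0)
        return within_var ** 0.5, between_var ** 0.5

    # Hypothetical replicate cholesterol results (g/liter) from four runs.
    runs = [
        [2.01, 2.05, 1.98],
        [2.10, 2.08, 2.12],
        [1.95, 1.99, 1.97],
        [2.04, 2.00, 2.03],
    ]
    sd_within, sd_between = variance_components(runs)
    print(round(sd_within, 3), round(sd_between, 3))
    ```
    
    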

  15. An overview of city analytics

    PubMed Central

    Higham, Desmond J.; Batty, Michael; Bettencourt, Luís M. A.; Greetham, Danica Vukadinović; Grindrod, Peter

    2017-01-01

    We introduce the 14 articles in the Royal Society Open Science themed issue on City Analytics. To provide a high-level, strategic, overview, we summarize the topics addressed and the analytical tools deployed. We then give a more detailed account of the individual contributions. Our overall aims are (i) to highlight exciting advances in this emerging, interdisciplinary field, (ii) to encourage further activity and (iii) to emphasize the variety of new, public-domain, datasets that are available to researchers. PMID:28386454

  16. Big Data analytics and cognitive computing - future opportunities for astronomical research

    NASA Astrophysics Data System (ADS)

    Garrett, M. A.

    2014-10-01

    The days of the lone astronomer with his optical telescope and photographic plates are long gone: Astronomy in 2025 will not only be multi-wavelength, but multi-messenger, and dominated by huge data sets and matching data rates. Catalogues listing detailed properties of billions of objects will in themselves require a new industrial-scale approach to scientific discovery, requiring the latest techniques of advanced data analytics and an early engagement with the first generation of cognitive computing systems. Astronomers have the opportunity to be early adopters of these new technologies and methodologies - the impact can be profound and highly beneficial to effecting rapid progress in the field. Areas such as SETI research might favourably benefit from cognitive intelligence that does not rely on human bias and preconceptions.

  17. Automating the Analytical Laboratories Section, Lewis Research Center, National Aeronautics and Space Administration: A feasibility study

    NASA Technical Reports Server (NTRS)

    Boyle, W. G.; Barton, G. W.

    1979-01-01

    The feasibility of computerized automation of the Analytical Laboratories Section at NASA's Lewis Research Center was considered. Since that laboratory's duties are not routine, the automation goals were set with that in mind. Four instruments were selected as the most likely automation candidates: an atomic absorption spectrophotometer, an emission spectrometer, an X-ray fluorescence spectrometer, and an X-ray diffraction unit. Two options for computer automation were described: a time-shared central computer and a system with microcomputers for each instrument connected to a central computer. A third option, presented for future planning, expands the microcomputer version. Costs and benefits for each option were considered. It was concluded that the microcomputer version best fits the goals and duties of the laboratory and that such an automated system is needed to meet the laboratory's future requirements.

  18. Methodological Challenges in Research on Sexual Risk Behavior: I. Item Content, Scaling, and Data Analytical Options

    PubMed Central

    Schroder, Kerstin E. E.; Carey, Michael P.; Vanable, Peter A.

    2008-01-01

    Investigation of sexual behavior involves many challenges, including how to assess sexual behavior and how to analyze the resulting data. Sexual behavior can be assessed using absolute frequency measures (also known as “counts”) or with relative frequency measures (e.g., rating scales ranging from “never” to “always”). We discuss these two assessment approaches in the context of research on HIV risk behavior. We conclude that these two approaches yield non-redundant information and, more importantly, that only data yielding information about the absolute frequency of risk behavior have the potential to serve as valid indicators of HIV contraction risk. However, analyses of count data may be challenging due to non-normal distributions with many outliers. Therefore, we identify new and powerful data analytical solutions that have been developed recently to analyze count data, and discuss limitations of a commonly applied method (viz., ANCOVA using baseline scores as covariates). PMID:14534027
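    The analytical challenge the authors describe, count data with non-normal distributions and many outliers, can be checked directly: for Poisson-distributed counts the variance equals the mean, so a variance-to-mean ratio well above 1 signals overdispersion and rules out standard normal-theory methods. A stdlib-only sketch with simulated behavior counts (the data are illustrative, not from the article):

    ```python
    import random
    import statistics

    def dispersion_index(counts):
        """Variance-to-mean ratio: about 1 for Poisson data, >1 if overdispersed."""
        m = statistics.mean(counts)
        return statistics.variance(counts) / m if m else float("nan")

    random.seed(42)

    # Poisson-like counts: every "participant" reports a count near a common rate.
    poisson_like = [sum(random.random() < 0.05 for _ in range(100))
                    for _ in range(500)]

    # Overdispersed counts: a minority of high-frequency outliers, as is typical
    # of self-reported sexual risk behavior data.
    overdispersed = poisson_like[:450] + [random.randint(40, 80) for _ in range(50)]

    print(round(dispersion_index(poisson_like), 2))   # near 1
    print(round(dispersion_index(overdispersed), 2))  # far above 1
    ```

    In practice an overdispersed outcome like the second sample would call for the count-data models the authors recommend (e.g. negative binomial regression) rather than ANCOVA on raw scores.
    
    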

  19. Recent and Potential Application of Engineering Tools to Educational Research.

    ERIC Educational Resources Information Center

    Taft, Martin I.

    This paper presents a summary of some recent engineering research in education and identifies some research areas with high payoff potential. The underlying assumption is that a school is a system with a set of subsystems which is potentially susceptible to analysis, design, and eventually some sort of optimization. This assumption leads to the…

  20. Somatic Sensitivity and Reflexivity as Validity Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Green, Jill

    2015-01-01

    Validity is a key concept in qualitative educational research. Yet, it is often not addressed in methodological writing about dance. This essay explores validity in a postmodern world of diverse approaches to scholarship, by looking at the changing face of validity in educational qualitative research and at how new understandings of the concept…

  1. Empirical-Analytical Methodological Research in Environmental Education: Response to a Negative Trend in Methodological and Ideological Discussions

    ERIC Educational Resources Information Center

    Connell, Sharon

    2006-01-01

    The purpose of this paper is to contribute to methodological discourse about research approaches to environmental education. More specifically, the paper explores the current status of the "empirical-analytical methodology" and its "positivist" (traditional- and post-positivist) ideologies, in environmental education research through the critical…

  2. A Review of Energy Dispersive X-Ray Fluorescence (EDXRF) as an Analytical Tool in Numismatic Studies.

    PubMed

    Navas, María José; Asuero, Agustín García; Jiménez, Ana María

    2016-01-01

    Energy dispersive X-ray fluorescence spectrometry (EDXRF) as an analytical technique in studies of ancient coins is summarized and reviewed. Specific EDXRF applications in historical studies, in studies of the corrosion of coins, and in studies of the optimal working conditions of some laser-based treatment for the cleaning of coins are described.

  3. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    SciTech Connect

    Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  4. Applying Web-Based Tools for Research, Engineering, and Operations

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    2011-01-01

    Personnel in the NASA Glenn Research Center Network and Architectures branch have performed a variety of research related to space-based sensor webs, network centric operations, security and delay tolerant networking (DTN). Quality documentation and communications, real-time monitoring and information dissemination are critical in order to perform quality research while maintaining low cost and utilizing multiple remote systems. This has been accomplished using a variety of Internet technologies often operating simultaneously. This paper describes important features of various technologies and provides a number of real-world examples of how combining Internet technologies can enable a virtual team to act efficiently as one unit to perform advanced research in operational systems. Finally, real and potential abuses of power and manipulation of information and information access is addressed.

  5. Introduction: new tools for enhancing collaborative endometriosis research.

    PubMed

    Casper, Robert F

    2014-11-01

    This issue of Fertility and Sterility contains four articles by the World Endometriosis Research Foundation whose present objective is global standardization of the collection of phenotypic data and biological samples, designated as the Endometriosis Phenome and Biobanking Harmonisation Project. The aim is to facilitate large-scale international, multicenter trials that are robust, and will result in biomarker and treatment targets to advance research in endometriosis.

  6. Catalogue of space objects and events as a powerful tool for scientific researches on space debris

    NASA Astrophysics Data System (ADS)

    Agapov, V.; Stepanyants, V.; Tuchin, A.; Khutorovsky, Z.

    Extensive work on developing and maintaining the Catalogue of scientific information on space objects and events is continuing at the Keldysh Institute of Applied Mathematics, in cooperation with the Russian company "Space information analytical systems" (KIA Systems). A powerful software tool has been developed, comprising: an informational core (a relational database in an RDBMS Oracle 8i environment) with special tools for automatic initial processing and systematization of data; a software complex for orbital modeling and maintenance of the dynamical catalogue of space objects and events; and special information-analytical software. The informational core covers the wide spectrum of data needed for: full-scale, high-quality modeling of object motion in near-Earth space (orbital and measurement data, solar flux and geomagnetic indices, Earth rotation parameters, etc.); determination of the parameters of various events (launches, manoeuvres, fragmentations, etc.); analysis of space debris sources; and the study of long-term orbital evolution (over several years or tens of years). The database stores a huge volume of data, including optical measurements, TLEs, information about all space launches since 1957, information about space missions and programs, manoeuvres, fragmentations, launch sequences for typical orbital insertions, various characteristics of orbital objects (payloads, stages, fragments), and officially released UN and ITU registration data. By now the informational core holds records for more than 28,000 orbital objects (both catalogued and not), all orbital launch attempts since 04.10.1957 (including failed ones), more than 30 million records of orbital information (TLEs, state vectors, polynomial data), more than 200,000 optical measurements (normal places) for GEO-region objects, calculated data on more than 14 million close approaches during the last five years, and other data. Software complex for orbital

  7. The Stuttering Treatment Research Evaluation and Assessment Tool (STREAT): Evaluating Treatment Research as Part of Evidence-Based Practice

    ERIC Educational Resources Information Center

    Davidow, Jason H.; Bothe, Anne K.; Bramlett, Robin E.

    2006-01-01

    Purpose: This article presents, and explains the issues behind, the Stuttering Treatment Research Evaluation and Assessment Tool (STREAT), an instrument created to assist clinicians, researchers, students, and other readers in the process of critically appraising reports of stuttering treatment research. Method: The STREAT was developed by…

  8. Big Data Analytics and Machine Intelligence Capability Development at NASA Langley Research Center: Strategy, Roadmap, and Progress

    NASA Technical Reports Server (NTRS)

    Ambur, Manjula Y.; Yagle, Jeremy J.; Reith, William; McLarney, Edward

    2016-01-01

    In 2014, a team of researchers, engineers, and information technology specialists at NASA Langley Research Center developed a Big Data Analytics and Machine Intelligence Strategy and Roadmap as part of Langley's Comprehensive Digital Transformation Initiative, with the goal of identifying the goals, objectives, initiatives, and recommendations needed to develop near-, mid-, and long-term capabilities for data analytics and machine intelligence in aerospace domains. Since that time, significant progress has been made in developing pilots and projects in several research, engineering, and scientific domains by following the original strategy of collaboration between mission support organizations, mission organizations, and external partners from universities and industry. This report summarizes the work to date in Data Intensive Scientific Discovery, Deep Content Analytics, and Deep Q&A projects, as well as the progress made in collaboration, outreach, and education. Recommendations for continuing this success into future phases of the initiative are also made.

  9. Utilizing Spectroscopic Research Tools and Software in the Classroom

    NASA Astrophysics Data System (ADS)

    Grubbs, G. S., II

    2015-06-01

    In today's technological age, it has become crucial to reach the student in a more "tech-savvy" way than traditional classroom methods afford. There is already a vast range of software packages available to the molecular spectroscopist that can easily be introduced to the classroom with success. This talk will highlight taking a few of these tools (Gaussian09, SPFIT/SPCAT, the AABS Package, LabVIEW, etc.) and implementing them in the classroom to teach subjects such as Quantum Mechanics and Thermodynamics, as well as to aid in the linkage between these subjects. Examples of project implementation with both undergraduate and graduate students will be presented, with a discussion of the successes and failures of such attempts.

  10. Intellectual Property: a powerful tool to develop biotech research

    PubMed Central

    Giugni, Diego; Giugni, Valter

    2010-01-01

    Today biotechnology is perhaps the most important technology field because of its strong implications for health and food. However, due to the nature of this technology, a huge amount of investment is needed to sustain experimentation costs. Consequently, investors aim to safeguard their investments as much as possible, and Intellectual Property, in particular patents, has been demonstrated to constitute a powerful tool for doing so. Moreover, patents represent an extremely important means of disclosing biotechnology inventions. Patentable biotechnology inventions involve products such as nucleotide and amino acid sequences and microorganisms, processes or methods for modifying said products, uses for the manufacture of medicaments, etc. There are several ways to protect inventions, but all follow the three main patentability requirements: novelty, inventive step, and industrial application. PMID:21255349

  11. Modelling turbulent boundary layer flow over fractal-like multiscale terrain using large-eddy simulations and analytical tools.

    PubMed

    Yang, X I A; Meneveau, C

    2017-04-13

    In recent years, there has been growing interest in large-eddy simulation (LES) modelling of atmospheric boundary layers interacting with arrays of wind turbines on complex terrain. However, such terrain typically contains geometric features and roughness elements reaching down to small scales that cannot be resolved numerically. Thus subgrid-scale models for the unresolved features of the bottom roughness are needed for LES. Such knowledge is also required to model the effects of the ground surface 'underneath' a wind farm. Here we adapt a dynamic approach to determine subgrid-scale roughness parametrizations and apply it to rough surfaces composed of cuboidal elements with broad size distributions, containing many scales. We first investigate the flow response to ground roughness of a few scales. LES with the dynamic roughness model, which accounts for the drag of unresolved roughness, is shown to provide resolution-independent results for the mean velocity distribution. Moreover, we develop an analytical roughness model that accounts for the sheltering effects of large-scale roughness elements on small-scale ones. Taking into account the sheltering effect, constraints from fundamental conservation laws, and assumptions of geometric self-similarity, the analytical roughness model is shown to provide predictions that agree well with roughness parameters determined from LES. This article is part of the themed issue 'Wind energy in complex terrains'.

  12. The Portable Usability Testing Lab: A Flexible Research Tool.

    ERIC Educational Resources Information Center

    Hale, Michael E.; And Others

    A group of faculty at the University of Georgia obtained funding for a research and development facility called the Learning and Performance Support Laboratory (LPSL). One of the LPSL's primary needs was a portable usability lab for software testing, so the facility acquired the "Luggage Lab 2000." The lab is transportable to…

  13. Close range photogrammetry--a clinical dental research tool.

    PubMed

    Chadwick, R G

    1992-08-01

    Photogrammetry is the art, science and technology of obtaining reliable information about physical objects through processes of recording and interpreting photographic images. This review outlines the principles of the technique and summarizes the various methodologies and applications in clinical dental research.

  14. Ready Reference Tools: EBSCO Topic Search and SIRS Researcher.

    ERIC Educational Resources Information Center

    Goins, Sharon; Dayment, Lu

    1998-01-01

    Discussion of ready reference and current events collections in high school libraries focuses on a comparison of two CD-ROM services, EBSCO Topic Search and the SIRS Researcher. Considers licensing; access; search strategies; viewing articles; currency; printing; added value features; and advantages of CD-ROMs. (LRW)

  15. New research and tools lead to improved earthquake alerting protocols

    USGS Publications Warehouse

    Wald, David J.

    2009-01-01

    What’s the best way to get alerted about the occurrence and potential impact of an earthquake? The answer to that question has changed dramatically of late, in part due to improvements in earthquake science and in part to the implementation of new research in the delivery of earthquake information.

  16. Reimagining Science Education and Pedagogical Tools: Blending Research with Teaching

    ERIC Educational Resources Information Center

    McLaughlin, Jacqueline S.

    2010-01-01

    The future of higher education in the sciences will be marked by programs that link skilled educators and research scientists from around the world with teachers for professional development and with students for high-impact learning--either virtually or physically in the field. These programs will use technology where possible to build new and…

  17. Friending Adolescents on Social Networking Websites: A Feasible Research Tool

    PubMed Central

    Brockman, Libby N.; Christakis, Dimitri A.; Moreno, Megan A.

    2014-01-01

    Objective Social networking sites (SNSs) are increasingly used for research. This paper reports on two studies examining the feasibility of friending adolescents on SNSs for research purposes. Methods Study 1 took place on www.MySpace.com where public profiles belonging to 18-year-old adolescents received a friend request from an unknown physician. Study 2 took place on www.Facebook.com where college freshmen from two US universities, enrolled in an ongoing research study, received a friend request from a known researcher’s profile. Acceptance and retention rates of friend requests were calculated for both studies. Results Study 1: 127 participants received a friend request; participants were 18 years old, 62.2% male and 51.8% Caucasian. 49.6% accepted the friend request. After 9 months, 76% maintained the online friendship, 12.7% defriended the study profile and 11% deactivated their profile. Study 2: 338 participants received a friend request; participants were 18 years old, 56.5% female and 75.1% Caucasian. 99.7% accepted the friend request. Over 12 months, 3.3% defriended the study profile and 4.1% deactivated their profile. These actions were often temporary; the overall 12-month friendship retention rate was 96.1%. Conclusion Friending adolescents on SNSs is feasible and friending adolescents from a familiar profile may be more effective for maintaining online friendship with research participants over time. PMID:25485226

  18. Miniature spinning as a tool for ginning research

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The cotton gin must balance efficient processing and cleaning with adversely affecting the quality of lint through damage and/or failure to remove sufficient material. Substantial research is conducted on all aspects of the cotton gin; however it is difficult to gauge the effect on fiber quality wi...

  19. Online Tools Allow Distant Students to Collaborate on Research Projects

    ERIC Educational Resources Information Center

    T.H.E. Journal, 2005

    2005-01-01

    The Wesleyan Academy and Moravian School in St. Thomas, Virgin Islands, recently joined forces with Evergreen Elementary in Fort Lewis, Wash., to collaborate on a research project using My eCoach Online (http://myecoach.com) as the primary medium to share information, post ideas and findings, and develop inquiry projects on 10 topics about water.…

  20. Some Tools and Techniques of Market Research for Students.

    ERIC Educational Resources Information Center

    Gaither, Gerald H.

    1979-01-01

    The importance of an effective, comprehensive marketing effort by higher education institutions is discussed in light of anticipated enrollment declines. A new professionalism in market research and techniques is called for and it is suggested that an effective marketing effort will provide primary and secondary benefits that can serve as guides…

  1. Administrative Data Linkage as a Tool for Child Maltreatment Research

    ERIC Educational Resources Information Center

    Brownell, Marni D.; Jutte, Douglas P.

    2013-01-01

    Linking administrative data records for the same individuals across services and over time offers a powerful, population-wide resource for child maltreatment research that can be used to identify risk and protective factors and to examine outcomes. Multistage de-identification processes have been developed to protect privacy and maintain…

  2. Fish as research tools: alternatives to in vivo experiments.

    PubMed

    Schaeck, Marlien; Van den Broeck, Wim; Hermans, Katleen; Decostere, Annemie

    2013-07-01

    The use of fish in scientific research is increasing worldwide, due to both the rapid expansion of the fish farming industry and growing awareness of questions concerning the humane use of mammalian models in basic research and chemical testing. As fish are lower on the evolutionary scale than mammals, they are considered to be less sentient. Fish models are providing researchers, and those concerned with animal welfare, with opportunities for adhering to the Three Rs principles of refinement, reduction and replacement. However, it should be kept in mind that fish, too, should be covered by the principles of the Three Rs. Indeed, various studies have shown that fish are capable of nociception, and of experiencing pain in a manner analogous to that in mammals. Thus, emphasis needs to be placed on the development of alternatives that replace, as much as possible, the use of all living vertebrate animals, including fish. This review gives the first comprehensive and critical overview of the existing alternatives to experimental studies on live fish. The alternative methods described range from cell and tissue cultures, organ and perfusion models, and embryonic models, to in silico computer and mathematical models. This article aspires to guide scientists in the adoption of the correct alternative methods in their research, and, whenever possible, to reduce the use of live fish.

  3. Incorporating ethical principles into clinical research protocols: a tool for protocol writers and ethics committees

    PubMed Central

    Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E

    2016-01-01

    A novel Protocol Ethics Tool Kit (‘Ethics Tool Kit’) has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. PMID:26811365

  4. Incorporating ethical principles into clinical research protocols: a tool for protocol writers and ethics committees.

    PubMed

    Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E

    2016-04-01

    A novel Protocol Ethics Tool Kit ('Ethics Tool Kit') has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval.

  5. Using metrology in early prehistoric stone tool research: further work and a brief instrument comparison.

    PubMed

    Evans, A A; Macdonald, D

    2011-01-01

    Early prehistoric research aims to discover the activities of our ancestors and piece together the process of evolution and sociocultural development. A key element in this process is the study of stone tools, particularly how these tools functioned in prehistory. Currently, there are no established quantitative methods that address stone tool function. This article provides a summary of previous studies using metrological methods in stone tool research and details the use of laser scanning confocal microscopy to conduct areal surface analysis using three-dimensional data sets. Research to date is preliminary but promising and shows that microscopic metrological approaches can provide a quantitative method to identify how stone tools were used. A limited comparison of two metrological systems is presented, the results of which highlight a need for caution and further investigation on the comparability of related data sets.
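
The areal surface analysis mentioned here computes standard surface-texture parameters from a three-dimensional height map. As a minimal sketch, the areal arithmetic-mean height Sa (one of the ISO 25178 parameters) is the mean absolute deviation of heights from the mean plane; the 3x3 height map below is hypothetical, not data from the study.

```python
def arithmetic_mean_height(height_map):
    """Areal arithmetic-mean height Sa: mean absolute deviation of
    surface heights from the mean height (a simple stand-in for the
    mean plane on a hypothetical, already-levelled surface)."""
    vals = [h for row in height_map for h in row]
    mean = sum(vals) / len(vals)
    return sum(abs(h - mean) for h in vals) / len(vals)

# Hypothetical 3x3 height map (micrometres) from a confocal scan
heights = [[0.0, 1.0, 0.0],
           [1.0, 2.0, 1.0],
           [0.0, 1.0, 0.0]]
print(arithmetic_mean_height(heights))
```

In practice the comparison of instruments hinges on whether such parameters, computed from each system's point cloud, agree within measurement uncertainty.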

  6. Research Tools for the Measurement of Pain and Nociception

    PubMed Central

    Johnson, Craig

    2016-01-01

    Simple Summary Pain is an integral aspect of many diseases and it is important to be able to measure it in the clinic so that the progression of disease and the animal’s response to treatment can be monitored. When research into pain is undertaken, it is also important to be able to measure the pain, but this time the aim is to provide meaningful results that will further our understanding of the mechanisms of pain or how it can be better treated. This change in emphasis between clinical and research measurement of pain means that the advantages and disadvantages of the many ways in which pain can be measured influence the choice of the most suitable technique and the way in which it is used. It is important to carefully select the most appropriate methodologies so that the data generated are relevant to the hypotheses being tested. Abstract There are many ways in which pain in animals can be measured and these are based on a variety of phenomena that are related to either the perception of pain or alterations in physical or behavioural features of the animal that are caused by that pain. The features of pain that are most useful for assessment in clinical environments are not always the best to use in a research environment. This is because the aims and objectives of the two settings are different and so whilst particular techniques will have the same advantages and disadvantages in clinical and research environments, these considerations may become more or less of a drawback when moving from one environment to the other. For example, a simple descriptive pain scale has a number of advantages and disadvantages. In a clinical setting the advantages are very useful and the disadvantages are less relevant, but in a research environment the advantages are less important and the disadvantages can become more problematic. This paper will focus on pain in the research environment and after a brief review of the pathophysiological systems involved will attempt to

  7. Analytical Microscopy

    SciTech Connect

    Not Available

    2006-06-01

    In the Analytical Microscopy group, within the National Center for Photovoltaics' Measurements and Characterization Division, we combine two complementary areas of analytical microscopy--electron microscopy and proximal-probe techniques--and use a variety of state-of-the-art imaging and analytical tools. We also design and build custom instrumentation and develop novel techniques that provide unique capabilities for studying materials and devices. In our work, we collaborate with you to solve materials- and device-related R&D problems. This sheet summarizes the uses and features of four major tools: transmission electron microscopy, scanning electron microscopy, the dual-beam focused-ion-beam workstation, and scanning probe microscopy.

  8. Fish in behavior research: unique tools with a great promise!

    PubMed

    Gerlai, Robert

    2014-08-30

    Fish represent the most diverse class of vertebrates on Earth and also an unprecedented, but as of yet still largely untapped, resource for comparative analyses that can illuminate answers to questions about both how organisms work and how they evolved. The current review is a general discussion of some of the basic principles of why adding new species such as fish to the short list of biomedical model organisms (mainly the house mouse and the rat) has merit. In addition to the general points, it also reviews some questions about a newcomer, the zebrafish, which is rapidly gaining popularity in brain and behavior research. It discusses some examples demonstrating the advantages and disadvantages of the zebrafish mainly in the context of biomedical research. It is followed by other articles that further elaborate on these questions.

  9. Gene Editing: Powerful New Tools for Nephrology Research and Therapy.

    PubMed

    Miyagi, Ayano; Lu, Aiwu; Humphreys, Benjamin D

    2016-10-01

    Biologic research is experiencing a transformation brought about by the ability of programmable nucleases to manipulate the genome. In the recently developed CRISPR/Cas system, short RNA sequences guide the endonuclease Cas9 to any location in the genome, causing a DNA double-strand break (DSB). Repair of DSBs allows the introduction of targeted genetic manipulations with high precision. Cas9-mediated gene editing is simple, scalable, and rapid, and it can be applied to virtually any organism. Here, we summarize the development of modern gene editing techniques and the biology of DSB repair on which these techniques are based. We discuss technical points in applying this technology and review its use in model organisms. Finally, we describe prospects for the use of gene editing to treat human genetic diseases. This technology offers tremendous promise for equipping the nephrology research community to better model and ultimately, treat kidney diseases.

  10. Developing a Research Tool to Gauge Student Metacognition

    NASA Astrophysics Data System (ADS)

    McInerny, Alistair; Boudreaux, Andrew; Rishal, Sepideh; Clare, Kelci

    2012-10-01

    Metacognition refers to the family of thought processes and skills used to evaluate and manage learning. A research and curriculum development project underway at Western Washington University uses introductory physics labs as a context to promote students' abilities to learn and apply metacognitive skills. A required ``narrative reflection'' has been incorporated as a weekly end-of-lab assignment. The goal of the narrative reflection is to encourage and support student metacognition while generating written artifacts that can be used by researchers to study metacognition in action. We have developed a Reflective Thinking Rubric (RTR) to analyze scanned narrative reflections. The RTR codes student writing for Metacognitive Elements, identifiable steps or aspects of metacognitive thinking at a variety of levels of sophistication. We hope to use the RTR to monitor the effect of weekly reflection on metacognitive ability and to search for correlations between metacognitive ability and conceptual understanding.

  11. Bayes' theorem: a paradigm research tool in biomedical sciences.

    PubMed

    Okeh, U M; Ugwu, A C

    2009-04-01

    One of the most interesting applications of the results of probability theory involves estimating unknown probabilities and making decisions on the basis of new (sample) information. Biomedical scientists often use Bayesian decision theory to compute diagnostic values such as sensitivity and specificity for a certain diagnostic test, from which positive or negative predictive values are obtained in order to make decisions concerning the well-being of the patient. Oftentimes error rates are encountered and estimated from the results of trials of the screening test with a view to calculating the overall case rate, for which an accurate estimate is rarely available. The concept of conditional probability takes into account information about the occurrence of one event to predict the probability of another event. It is on this premise that this article presents Bayes' theorem as a vital tool. A brief intuitive development of this theorem and its application in diagnosis is given with minimum proof and examples.
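
The diagnostic application described here is a direct use of Bayes' theorem: the positive predictive value is the probability of disease given a positive test, computed from sensitivity, specificity, and prevalence. A minimal sketch, with a hypothetical test and prevalence chosen for illustration:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Bayes' theorem for a diagnostic test: P(disease | positive result).
    Numerator: true positives; denominator: all positives."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# Hypothetical test: 95% sensitive, 90% specific, 1% disease prevalence
print(round(positive_predictive_value(0.95, 0.90, 0.01), 3))  # -> 0.088
```

The low result despite a seemingly good test illustrates why prevalence (the prior) cannot be ignored when interpreting a positive screen.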

  12. Performance Research Integration Tool (IMPRINT Pro) Maintenance Model Enhancements

    DTIC Science & Technology

    2009-09-01

    September 2009. Air Force Research Laboratory, 711th Human Performance Wing, Human Performance Integration Directorate, Brooks City-Base, TX. Approved for public release; distribution unlimited. Public Affairs Case file no. 09-485, 16 October 2009; approved through 311th Public Affairs Office, Brooks City-Base, Texas 78235.

  13. Using Focus Groups in the Refinement of a Research Tool

    DTIC Science & Technology

    2007-11-02

    were needed to obtain the different perspectives on palliative care services. In Willgerodt's (7) project, the research aim was to determine the… McLafferty (1) conducted a study to determine the attitudes of different skilled nurses and nurse lecturers towards working with older patients… Because this study focused on the attitudes of various types of skilled nurses, McLafferty's (1) focus groups were organized into…

  14. Conceptualising the Use of Facebook in Ethnographic Research: As Tool, as Data and as Context

    ERIC Educational Resources Information Center

    Baker, Sally

    2013-01-01

    This article proposes a three-part conceptualisation of the use of Facebook in ethnographic research: as a tool, as data and as context. Longitudinal research with young adults at a time of significant change provides many challenges for the ethnographic researcher, such as maintaining channels of communication and high rates of participant…

  15. CAMS as a tool for human factors research in spaceflight

    NASA Astrophysics Data System (ADS)

    Sauer, Juergen

    2004-01-01

    The paper reviews a number of research studies that were carried out with a PC-based task environment called Cabin Air Management System (CAMS) simulating the operation of a spacecraft's life support system. As CAMS was a multiple task environment, it allowed the measurement of performance at different levels. Four task components of different priority were embedded in the task environment: diagnosis and repair of system faults, maintaining atmospheric parameters in a safe state, acknowledgement of system alarms (reaction time), and keeping a record of critical system resources (prospective memory). Furthermore, the task environment permitted the examination of different task management strategies and changes in crew member state (fatigue, anxiety, mental effort). A major goal of the research programme was to examine how crew members adapted to various forms of sub-optimal working conditions, such as isolation and confinement, sleep deprivation and noise. None of the studies provided evidence for decrements in primary task performance. However, the results showed a number of adaptive responses of crew members to adjust to the different sub-optimal working conditions. There was evidence for adjustments in information sampling strategies (usually reductions in sampling frequency) as a result of unfavourable working conditions. The results also showed selected decrements in secondary task performance. Prospective memory seemed to be somewhat more vulnerable to sub-optimal working conditions than performance on the reaction time task. Finally, suggestions are made for future research with the CAMS environment.

  16. Positioning Mentoring as a Coach Development Tool: Recommendations for Future Practice and Research

    ERIC Educational Resources Information Center

    McQuade, Sarah; Davis, Louise; Nash, Christine

    2015-01-01

    Current thinking in coach education advocates mentoring as a development tool to connect theory and practice. However, little empirical evidence exists to evaluate the effectiveness of mentoring as a coach development tool. Business, education, and nursing precede the coaching industry in their mentoring practice, and research findings offered in…

  17. The "Metaphorical Collage" as a Research Tool in the Field of Education

    ERIC Educational Resources Information Center

    Russo-Zimet, Gila

    2016-01-01

    The aim of this paper is to propose a research tool in the field of education--the "metaphorical collage." This tool facilitates the understanding of concepts and processes in education through the analysis of metaphors in collage works that include pictorial images and verbal images. We believe the "metaphorical collage" to be…

  18. Modelling as an indispensable research tool in the information society.

    NASA Astrophysics Data System (ADS)

    Bouma, Johan

    2016-04-01

    Science and society would be well advised to develop a different relationship as the information revolution penetrates all aspects of modern life. Rather than produce clear answers to clear questions in a top-down manner, land-use issues related to the UN Sustainable Development Goals (SDGs) present "wicked" problems involving different, strongly opinionated stakeholders with conflicting ideas and interests, and risk-averse politicians. The Dutch government has invited its citizens to develop a "science agenda", defining future research needs, implicitly suggesting that the research community is unable to do so. Time, therefore, for a pro-active approach to more convincingly define our "societal license to research". For soil science this could imply a focus on the SDGs, considering soils as living, characteristically different, dynamic bodies in a landscape, to be mapped in ways that allow generation of suitable modelling data. Models allow a dynamic characterization of water and nutrient regimes and plant growth in soils for both actual and future conditions, reflecting e.g. effects of climate or land-use change or alternative management practices. Engaging modern stakeholders in a bottom-up manner implies continuous involvement and "joint learning" from project initiation to completion, where modelling results act as building blocks to explore alternative scenarios. Modern techniques allow very rapid calculations and innovative visualization. Everything is possible, but only modelling can articulate the economic, social and environmental consequences of each scenario, demonstrating in a pro-active manner the crucial and indispensable role of research. But choices are to be made by stakeholders and reluctant policy makers, and certainly not by scientists, who should carefully guard their independence. Only clear results in the end are convincing proof of the impact of science, requiring therefore continued involvement of scientists up to the very end of projects. To

  19. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  20. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Navidi, Fatemeh; Beard, Daniel A.; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  1. Conceptual Systems Model as a Tool for Hypothesis Generation and Testing in Ecotoxicological Research

    EPA Science Inventory

    Microarray, proteomic, and metabonomic technologies are becoming increasingly accessible as tools for ecotoxicology research. Effective use of these technologies will depend, at least in part, on the ability to apply these techniques within a paradigm of hypothesis driven researc...

  2. Applying Collaborative and e-Learning Tools to Military Distance Learning: A Research Framework

    DTIC Science & Technology

    2000-10-01

    the instructor in such approaches, and the increasing importance of learner-centered approaches to instruction. Appropriate quantitative and qualitative ... research methodologies are then described. A summary of relevant findings on collaborative tools, individual differences, and learning communities is

  3. Extensions of the Johnson-Neyman Technique to Linear Models with Curvilinear Effects: Derivations and Analytical Tools

    ERIC Educational Resources Information Center

    Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.

    2013-01-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…

  4. Research as a tool for the teaching of epidemiology.

    PubMed

    Soudarssanane, M B; Rotti, S B; Roy, G; Srinivasa, D K

    1994-01-01

    At a medical school in India, undergraduates have been given the opportunity to volunteer to conduct research as a means of improving their knowledge and understanding of epidemiology. First-year clinical students have conducted case-control studies with emphasis on methodological detail. Second-year students have been involved in community-based epidemiological studies. At the intern level, projects related to social factors in health and disease and to health administration have been encouraged. This initiative has been largely welcomed by the students and has yielded highly encouraging results.

  5. Electromagnetic Levitation: A Useful Tool in Microgravity Research

    NASA Technical Reports Server (NTRS)

    Szekely, Julian; Schwartz, Elliot; Hyers, Robert

    1995-01-01

    Electromagnetic levitation is one area of the electromagnetic processing of materials that has uses for both fundamental research and practical applications. This technique was successfully used on the Space Shuttle Columbia during the Spacelab IML-2 mission in July 1994 as a platform for accurately measuring the surface tensions of liquid metals and alloys. In this article, we discuss the key transport phenomena associated with electromagnetic levitation, the fundamental relationships associated with thermophysical property measurement that can be made using this technique, reasons for working in microgravity, and some of the results obtained from the microgravity experiments.

  6. Digital storytelling: an innovative tool for practice, education, and research.

    PubMed

    Lal, Shalini; Donnelly, Catherine; Shin, Jennifer

    2015-01-01

    Digital storytelling is a method of using storytelling, group work, and modern technology to facilitate the creation of 2-3 minute multi-media video clips to convey personal or community stories. Digital storytelling is being used within the health care field; however, there has been limited documentation of its application within occupational therapy. This paper introduces digital storytelling and proposes how it can be applied in occupational therapy clinical practice, education, and research. The ethical and methodological challenges in relation to using the method are also discussed.

  7. NASA Global Hawk: A New Tool for Earth Science Research

    NASA Technical Reports Server (NTRS)

    Hall, Phill

    2009-01-01

    This slide presentation reviews the Global Hawk, an unmanned aerial vehicle (UAV) that NASA plans to use for Earth science research. The Global Hawk is the world's first fully autonomous high-altitude, long-endurance aircraft, and is capable of conducting long-duration missions. Plans are being made for the use of the aircraft on missions in the Arctic, Pacific and Western Atlantic Oceans. There are slides showing the Global Hawk Operations Center (GHOC), Flight Control and Air Traffic Control Communications Architecture, and Payload Integration and Accommodations on the Global Hawk. The first science campaign, planned for a study of the Pacific Ocean, is reviewed.

  8. Fuzzy Analytic Hierarchy Process-based Chinese Resident Best Fitness Behavior Method Research.

    PubMed

    Wang, Dapeng; Zhang, Lan

    2015-01-01

    With the rapid development of China's economy, science, and technology, people's pursuit of health has intensified, and sports fitness activities among Chinese residents have expanded quickly. However, fitness events differ in popularity and in their effects on the body's energy consumption. On this basis, the paper studies fitness behaviors and derives an exercise guide for Chinese residents' sports fitness, providing guidance for implementing the national fitness plan and making resident fitness more scientific. Starting from the perspective of energy consumption, the study mainly adopts an empirical method: it determines the energy consumption of residents' favorite fitness events by observing the energy consumption of various fitness behaviors, and applies the fuzzy analytic hierarchy process to evaluate seven events: bicycle riding, shadowboxing practice, swimming, rope skipping, jogging, running, and aerobics. By calculating the memberships of the fuzzy rating model and comparing their magnitudes, it identifies the fitness behaviors that are most helpful for residents' health, most effective, and most popular. The study concludes that swimming is the best exercise mode, with the highest membership; the memberships of running, rope skipping, and shadowboxing practice are also relatively high. Residents should combine several of these fitness events according to their physical and living conditions to better achieve the purpose of fitness exercise.
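The membership comparison the abstract describes can be illustrated with a toy fuzzy evaluation. The sketch below is not the paper's model: the criteria, weights, and triangular fuzzy ratings are invented for illustration. Each event gets a triangular rating (low, mode, high) per criterion; scores are aggregated with criterion weights, defuzzified by the centroid (l + m + u) / 3, and the events are ranked.

```python
from typing import Dict, Tuple

Tri = Tuple[float, float, float]  # triangular fuzzy number (low, mode, high)

def weighted_centroid(ratings: Dict[str, Tri], weights: Dict[str, float]) -> float:
    """Weighted sum of centroid-defuzzified triangular ratings."""
    return sum(w * (sum(ratings[c]) / 3.0) for c, w in weights.items())

# Hypothetical criterion weights (must sum to 1)
criteria_weights = {"energy_use": 0.5, "popularity": 0.3, "accessibility": 0.2}

# Hypothetical fuzzy ratings on a 0-1 scale for three of the seven events
events = {
    "swimming": {"energy_use": (0.8, 0.95, 1.0),
                 "popularity": (0.7, 0.9, 1.0),
                 "accessibility": (0.4, 0.5, 0.7)},
    "running": {"energy_use": (0.6, 0.8, 0.9),
                "popularity": (0.7, 0.8, 1.0),
                "accessibility": (0.8, 0.9, 1.0)},
    "rope_skipping": {"energy_use": (0.5, 0.7, 0.9),
                      "popularity": (0.4, 0.6, 0.7),
                      "accessibility": (0.7, 0.9, 1.0)},
}

ranked = sorted(events,
                key=lambda e: weighted_centroid(events[e], criteria_weights),
                reverse=True)
```

A full fuzzy AHP would additionally derive the criterion weights themselves from fuzzy pairwise comparison matrices; here they are fixed by assumption to keep the ranking step visible.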

  9. Experimental and Analytical Research on Resonance Phenomena of Vibrating Head with MRE Regulating Element

    NASA Astrophysics Data System (ADS)

    Miedzińska, D.; Gieleta, R.; Osiński, J.

    2015-02-01

    A vibratory pile hammer (VPH) is a mechanical device used to drive steel piles as well as tube piles into soil to provide foundation support for buildings or other structures. In order to increase the stability and efficiency of VPH operation at over-resonance frequencies, a new VPH construction was developed at the Military University of Technology. The new VPH contains a system of counter-rotating eccentric weights, powered by hydraulic motors, and designed in such a way that horizontal vibrations cancel out, while vertical vibrations are transmitted into the pile. This system is suspended in the static parts by adaptive variable-stiffness pillows based on a smart material, magnetorheological elastomer (MRE), whose rheological and mechanical properties can be reversibly and rapidly controlled by an external magnetic field. The work presented in the paper is part of the modified VPH design process. It concerns the experimental research on vibrations during the piling process and the analytical analysis of the acquired signal. The results will be applied in the VPH control system.

  10. The NASA Human Research Wiki - An Online Collaboration Tool

    NASA Technical Reports Server (NTRS)

    Barr, Yael; Rasbury, Jack; Johnson, Jordan; Barstend, Kristina; Saile, Lynn; Watkins, Sharmi

    2012-01-01

    The Exploration Medical Capability (ExMC) element is one of six elements of the Human Research Program (HRP). ExMC is charged with decreasing the risk of "inability to adequately recognize or treat an ill or injured crew member" for exploration-class missions. In preparation for such missions, ExMC has compiled a large evidence base, previously available only to persons within the NASA community. ExMC has developed the "NASA Human Research Wiki" in an effort to make this information available to the general public and to increase collaboration within and outside of NASA. The ExMC evidence base comprises several types of data, including: (1) information on more than 80 medical conditions that could occur during space flight, derived from several sources and including data on incidence and potential outcomes, as captured in the Integrated Medical Model's (IMM) Clinical Finding Forms (CliFFs); and (2) approximately 25 gap reports that identify "gaps" in knowledge and/or technology that would need to be addressed in order to provide adequate medical support for these novel missions.

  11. Novel Applications of Lanthanoides as Analytical or Diagnostic Tools in the Life Sciences by ICP-MS-based Techniques

    NASA Astrophysics Data System (ADS)

    Müller, Larissa; Traub, Heike; Jakubowski, Norbert

    2016-11-01

    Inductively coupled plasma mass spectrometry (ICP-MS) is a well-established analytical method for multi-elemental analysis, in particular for elements at trace and ultra-trace levels. It has found acceptance in various application areas during the last decade and is increasingly applied for detection in the life sciences. For these applications, ICP-MS excels through high sensitivity that is independent of the molecular structure of the analyte, a wide linear dynamic range, and excellent multi-element capabilities. Furthermore, methods based on ICP-MS offer simple quantification concepts, for which usually (liquid) standards are applied, low matrix effects compared with other conventional bioanalytical techniques, relative limits of detection (LODs) in the low pg g-1 range, and absolute LODs down to the attomole range. In this chapter, we focus on new applications in which the multi-element capability of ICP-MS is used for detection of lanthanoids or rare earth elements, which are applied as elemental stains or tags for biomolecules, in particular antibodies.

  12. Using quality assessment tools to critically appraise ageing research: a guide for clinicians.

    PubMed

    Harrison, Jennifer Kirsty; Reid, James; Quinn, Terry J; Shenkin, Susan Deborah

    2016-12-07

    Evidence based medicine tells us that we should not accept published research at face value. Even research from established teams published in the highest impact journals can have methodological flaws, biases and limited generalisability. The critical appraisal of research studies can seem daunting, but tools are available to make the process easier for the non-specialist. Understanding the language and process of quality assessment is essential when considering or conducting research, and is also valuable for all clinicians who use published research to inform their clinical practice. We present a review written specifically for the practising geriatrician. This considers how quality is defined in relation to the methodological conduct and reporting of research. Having established why quality assessment is important, we present and critique tools which are available to standardise quality assessment. We consider five study designs: RCTs, non-randomised studies, observational studies, systematic reviews and diagnostic test accuracy studies. Quality assessment for each of these study designs is illustrated with an example of published cognitive research. The practical applications of the tools are highlighted, with guidance on their strengths and limitations. We signpost educational resources and offer specific advice for use of these tools. We hope that all geriatricians become comfortable with critical appraisal of published research and that use of the tools described in this review - along with awareness of their strengths and limitations - becomes a part of teaching, journal clubs and practice.

  13. Electrostatic Levitation: A Tool to Support Materials Research in Microgravity

    NASA Technical Reports Server (NTRS)

    Rogers, Jan; SanSoucie, Mike

    2012-01-01

    Containerless processing represents an important topic for materials research in microgravity. Levitated specimens are free from contact with a container, which permits studies of deeply undercooled melts, and high-temperature, highly reactive materials. Containerless processing provides data for studies of thermophysical properties, phase equilibria, metastable state formation, microstructure formation, undercooling, and nucleation. The European Space Agency (ESA) and the German Aerospace Center (DLR) jointly developed an electromagnetic levitator facility (MSL-EML) for containerless materials processing in space. The electrostatic levitator (ESL) facility at the Marshall Space Flight Center provides support for the development of containerless processing studies for the ISS. Apparatus and techniques have been developed to use the ESL to provide data for phase diagram determination, creep resistance, emissivity, specific heat, density/thermal expansion, viscosity, surface tension and triggered nucleation of melts. The capabilities and results from selected ESL-based characterization studies performed at NASA's Marshall Space Flight Center will be presented.

  14. Performance calculations for battery power supplies as laboratory research tools

    SciTech Connect

    Scanlon, J.J.; Rolader, G.E.; Jamison, K.A. ); Petresky, H. )

    1991-01-01

    Electromagnetic Launcher (EML) research at the Air Force Armament Laboratory, Hypervelocity Launcher Branch (AFATL/SAH), Eglin AFB, has focused on developing the technologies required for repetitively launching several-kilogram payloads to high velocities. Previous AFATL/SAH experiments have been limited by the available power supply, resulting in small muzzle energies on the order of hundreds of kilojoules. In an effort to advance the development of EMLs, AFATL/SAH has designed and constructed a battery power supply (BPS) capable of providing several megaamperes of current for several seconds. This system consists of six modules, each containing 2288 automotive batteries, which may be connected in two different series-parallel arrangements. In this paper the authors define the electrical characteristics of the AFATL battery power supply at the component level.
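As a back-of-the-envelope illustration of how such a series-parallel bank scales, the sketch below treats each battery as an ideal voltage source with a series internal resistance. The 16 x 143 arrangement and the 12 V / 5 mOhm cell figures are assumptions chosen only because 16 x 143 = 2288 matches the stated module size; they are not the AFATL design values.

```python
def bank_characteristics(cell_v, cell_r, n_series, n_parallel):
    """Open-circuit voltage, internal resistance, and short-circuit current
    of a bank built from n_parallel strings of n_series identical cells.
    Each cell is an ideal voltage source with series resistance."""
    v_oc = cell_v * n_series                  # cells in series add voltage
    r_int = cell_r * n_series / n_parallel    # parallel strings divide resistance
    i_peak = v_oc / r_int                     # worst-case (short-circuit) current
    return v_oc, r_int, i_peak

# Hypothetical module: 12 V, 5 mOhm cells, 16 in series x 143 strings = 2288 cells
v, r, i = bank_characteristics(12.0, 0.005, 16, 143)  # 192 V, ~0.56 mOhm, ~343 kA
```

Under these invented numbers one module alone can source a few hundred kiloamperes into a short, so six modules in parallel would plausibly reach the megaampere regime the abstract mentions; the real delivered current depends on the load and bus resistance, which this model ignores.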

  15. Nucleic Acid Aptamers: Research Tools in Disease Diagnostics and Therapeutics

    PubMed Central

    Yadava, Pramod K.

    2014-01-01

    Aptamers are short sequences of nucleic acid (DNA or RNA) or peptide molecules which adopt a conformation and bind cognate ligands with high affinity and specificity in a manner akin to antibody-antigen interactions. It has been globally acknowledged that aptamers promise a plethora of diagnostic and therapeutic applications. Although use of nucleic acid aptamers as targeted therapeutics or mediators of targeted drug delivery is a relatively new avenue of research, one aptamer-based drug “Macugen” is FDA approved and a series of aptamer-based drugs are in clinical pipelines. The present review discusses the aspects of design, unique properties, applications, and development of different aptamers to aid in cancer diagnosis, prevention, and/or treatment under defined conditions. PMID:25050359

  16. Nucleic acid aptamers: research tools in disease diagnostics and therapeutics.

    PubMed

    Santosh, Baby; Yadava, Pramod K

    2014-01-01

    Aptamers are short sequences of nucleic acid (DNA or RNA) or peptide molecules which adopt a conformation and bind cognate ligands with high affinity and specificity in a manner akin to antibody-antigen interactions. It has been globally acknowledged that aptamers promise a plethora of diagnostic and therapeutic applications. Although use of nucleic acid aptamers as targeted therapeutics or mediators of targeted drug delivery is a relatively new avenue of research, one aptamer-based drug "Macugen" is FDA approved and a series of aptamer-based drugs are in clinical pipelines. The present review discusses the aspects of design, unique properties, applications, and development of different aptamers to aid in cancer diagnosis, prevention, and/or treatment under defined conditions.

  17. Peak-bridges due to in-column analyte transformations as a new tool for establishing molecular connectivities by comprehensive two-dimensional gas chromatography-mass spectrometry.

    PubMed

    Filippi, Jean-Jacques; Cocolo, Nicolas; Meierhenrich, Uwe J

    2015-02-27

    Comprehensive two-dimensional gas chromatography-mass spectrometry (GC×GC-MS) has been shown to permit unprecedented chromatographic resolution of volatile analytes across various families of organic compounds. However, peak identification based only on retention time, two-dimensional mapping, and mass spectrometric fragmentation is not yet straightforward. The ability to establish molecular links between constituents is of crucial importance for understanding the overall chemistry of any sample, especially in natural extracts, where biogenetically related isomeric structures are often abundant. We present here a new way of using GC×GC that allows searching for these molecular connectivities. Analytical investigations of essential oil constituents by GC×GC-MS made it possible to observe in real time the thermally induced transformations of various sesquiterpenic derivatives. These transformations generated a series of well-defined two-dimensional peak bridges within the 2D chromatograms connecting parent and daughter molecules, permitting construction of a clear scheme of structural relationships between the different constituents. GC×GC-MS thus appears as a tool for investigating chromatographic phenomena and analyte transformations that cannot be understood with conventional GC-MS alone.

  18. Translating the Theoretical into Practical: A Logical Framework of Functional Analytic Psychotherapy Interactions for Research, Training, and Clinical Purposes

    ERIC Educational Resources Information Center

    Weeks, Cristal E.; Kanter, Jonathan W.; Bonow, Jordan T.; Landes, Sara J.; Busch, Andrew M.

    2012-01-01

    Functional analytic psychotherapy (FAP) provides a behavioral analysis of the psychotherapy relationship that directly applies basic research findings to outpatient psychotherapy settings. Specifically, FAP suggests that a therapist's in vivo (i.e., in-session) contingent responding to targeted client behaviors, particularly positive reinforcement…

  19. The Evaluation of the ESEA Title VII Spanish/English Bilingual Education Program: Research Design and Analytic Approaches.

    ERIC Educational Resources Information Center

    Coles, Gary J.

    The research design and analytic approaches used in the national evaluation of the Elementary Secondary Education Act (ESEA) Title VII Spanish/English Bilingual Education Program are described. This study evaluated the entire program, as opposed to individual projects, and had four goals: to (1) determine the cognitive and affective impact of…

  20. Advanced imaging microscope tools applied to microgravity research investigations

    NASA Astrophysics Data System (ADS)

    Peterson, L.; Samson, J.; Conrad, D.; Clark, K.

    1998-01-01

    The inability to observe and interact with experiments on orbit has been an impediment for both basic research and commercial ventures using the shuttle. In order to open the frontiers of space, the Center for Microgravity Automation Technology has developed a unique and innovative system for conducting experiments at a distance, the "Remote Scientist." The Remote Scientist extends laboratory automation capability to the microgravity environment. While the Remote Scientist conceptually encompasses a broad spectrum of elements and functionalities, the development approach taken is to: • establish a baseline capability that is both flexible and versatile • incrementally augment the baseline with additional functions over time. Since last year, the application of the Remote Scientist has changed from protein crystal growth to tissue culture, specifically, the development of skeletal muscle under varying levels of tension. This system includes a series of bioreactor chambers that allow for three-dimensional growth of muscle tissue on a membrane suspended between the two ends of a programmable force transducer that can provide automated or investigator-initiated tension on the developing tissue. A microscope objective mounted on a translation carriage allows for high-resolution microscopy along a large area of the tissue. These images will be mosaicked on orbit to detect features and structures that span multiple images. The use of fluorescence and pseudo-confocal microscopy will maximize the observational capabilities of this system. A series of ground-based experiments have been performed to validate the bioreactor, the force transducer, the translation carriage and the image acquisition capabilities of the Remote Scientist. • The bioreactor is capable of sustaining three dimensional tissue culture growth over time. • The force transducer can be programmed to provide static tension on cells or to simulate either slow or fast growth of underlying tissues in

  1. Citizen Science as a New Tool in Dog Cognition Research.

    PubMed

    Stewart, Laughlin; MacLean, Evan L; Ivy, David; Woods, Vanessa; Cohen, Eliot; Rodriguez, Kerri; McIntyre, Matthew; Mukherjee, Sayan; Call, Josep; Kaminski, Juliane; Miklósi, Ádám; Wrangham, Richard W; Hare, Brian

    2015-01-01

    Family dogs and dog owners offer a potentially powerful way to conduct citizen science to answer questions about animal behavior that are difficult to answer with more conventional approaches. Here we evaluate the quality of the first data on dog cognition collected by citizen scientists using the Dognition.com website. We conducted analyses to understand if data generated by over 500 citizen scientists replicates internally and in comparison to previously published findings. Half of participants participated for free while the other half paid for access. The website provided each participant a temperament questionnaire and instructions on how to conduct a series of ten cognitive tests. Participation required internet access, a dog and some common household items. Participants could record their responses on any PC, tablet or smartphone from anywhere in the world and data were retained on servers. Results from citizen scientists and their dogs replicated a number of previously described phenomena from conventional lab-based research. There was little evidence that citizen scientists manipulated their results. To illustrate the potential uses of relatively large samples of citizen science data, we then used factor analysis to examine individual differences across the cognitive tasks. The data were best explained by multiple factors in support of the hypothesis that nonhumans, including dogs, can evolve multiple cognitive domains that vary independently. This analysis suggests that in the future, citizen scientists will generate useful datasets that test hypotheses and answer questions as a complement to conventional laboratory techniques used to study dog psychology.
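As an illustration of the factor-analytic step described above, the sketch below fits a two-factor model to synthetic scores. The task names, sample size, and generating structure are invented for illustration and are not the Dognition data; the point is only that tasks driven by the same latent ability load together.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_dogs = 500

# Two independent latent abilities generating four noisy task scores
memory = rng.normal(size=n_dogs)
social = rng.normal(size=n_dogs)
tasks = np.column_stack([
    memory + 0.3 * rng.normal(size=n_dogs),   # hypothetical "cup memory" task
    memory + 0.3 * rng.normal(size=n_dogs),   # hypothetical "delayed search" task
    social + 0.3 * rng.normal(size=n_dogs),   # hypothetical "pointing cue" task
    social + 0.3 * rng.normal(size=n_dogs),   # hypothetical "gaze following" task
])

fa = FactorAnalysis(n_components=2).fit(tasks)
loadings = fa.components_   # shape (2 factors, 4 tasks)
```

With real citizen-science data the number of factors would be chosen empirically (e.g., by comparing model fit), rather than fixed at two as in this synthetic setup.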

  2. Genetic resources offer efficient tools for rice functional genomics research.

    PubMed

    Lo, Shuen-Fang; Fan, Ming-Jen; Hsing, Yue-Ie; Chen, Liang-Jwu; Chen, Shu; Wen, Ien-Chie; Liu, Yi-Lun; Chen, Ku-Ting; Jiang, Mirng-Jier; Lin, Ming-Kuang; Rao, Meng-Yen; Yu, Lin-Chih; Ho, Tuan-Hua David; Yu, Su-May

    2016-05-01

    Rice is an important crop and major model plant for monocot functional genomics studies. With the establishment of various genetic resources for rice genomics, the next challenge is to systematically assign functions to predicted genes in the rice genome. Compared with the robustness of genome sequencing and bioinformatics techniques, progress in understanding the function of rice genes has lagged, hampering the utilization of rice genes for cereal crop improvement. The use of transfer DNA (T-DNA) insertional mutagenesis offers the advantage of distribution throughout the rice genome, though preferentially in gene-rich regions, resulting in direct gene knockout or activation of genes within 20-30 kb up- and downstream of the T-DNA insertion site and high gene tagging efficiency. Here, we summarize the recent progress in functional genomics using the T-DNA-tagged rice mutant population. We also discuss important features of T-DNA activation- and knockout-tagging and promoter-trapping of the rice genome in relation to mutant and candidate gene characterizations, and how to more efficiently utilize rice mutant populations and datasets for high-throughput functional genomics and phenomics studies by forward and reverse genetics approaches. These studies may facilitate the translation of rice functional genomics research to improvements of rice and other cereal crops.

  3. Citizen Science as a New Tool in Dog Cognition Research

    PubMed Central

    Stewart, Laughlin; MacLean, Evan L.; Ivy, David; Woods, Vanessa; Cohen, Eliot; Rodriguez, Kerri; McIntyre, Matthew; Mukherjee, Sayan; Call, Josep; Kaminski, Juliane; Miklósi, Ádám; Wrangham, Richard W.; Hare, Brian

    2015-01-01

    Family dogs and dog owners offer a potentially powerful way to conduct citizen science to answer questions about animal behavior that are difficult to answer with more conventional approaches. Here we evaluate the quality of the first data on dog cognition collected by citizen scientists using the Dognition.com website. We conducted analyses to understand if data generated by over 500 citizen scientists replicates internally and in comparison to previously published findings. Half of participants participated for free while the other half paid for access. The website provided each participant a temperament questionnaire and instructions on how to conduct a series of ten cognitive tests. Participation required internet access, a dog and some common household items. Participants could record their responses on any PC, tablet or smartphone from anywhere in the world and data were retained on servers. Results from citizen scientists and their dogs replicated a number of previously described phenomena from conventional lab-based research. There was little evidence that citizen scientists manipulated their results. To illustrate the potential uses of relatively large samples of citizen science data, we then used factor analysis to examine individual differences across the cognitive tasks. The data were best explained by multiple factors in support of the hypothesis that nonhumans, including dogs, can evolve multiple cognitive domains that vary independently. This analysis suggests that in the future, citizen scientists will generate useful datasets that test hypotheses and answer questions as a complement to conventional laboratory techniques used to study dog psychology. PMID:26376443

  4. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide.

    PubMed

    Tawakkol, Shereen M; Farouk, M; Elaziz, Omar Abd; Hemdan, A; Shehata, Mostafa A

    2014-12-10

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to the ratio subtraction method (RSM) for determination of both drugs in commercial dosage form. The second and third methods are multivariate calibrations: Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines, and the standard curves were found to be linear in the ranges of 10-60 and 2-30 for MOX and HCTZ, respectively, in the EXRSM method, with acceptable mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.

  5. Medical technology assessment: the use of the analytic hierarchy process as a tool for multidisciplinary evaluation of medical devices.

    PubMed

    Hummel, J M; van Rossum, W; Verkerke, G J; Rakhorst, G

    2000-11-01

    Most types of medical technology assessment are performed only after the technology has been developed. Consequently, they have only minor effects on changes in clinical practice. Our study introduces a new method of constructive medical technology assessment that can change the development and diffusion of a medical device to improve its later clinical effectiveness. The method, based on Saaty's Analytic Hierarchy Process, quantitatively supports discussions between various parties involved in technological development and diffusion. We applied this method in comparing a new blood pump with two competitors based on technical, medical and social requirements. These discussions changed the evaluators' perspectives, reduced disagreements, and resulted in a reliable evaluation of the pump's performance. On the basis of these results, adaptations were derived that improved the design and diffusion of the blood pump. This application demonstrates the potential of our method to steer the technological development and diffusion of artificial organs.
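At its core, Saaty's Analytic Hierarchy Process derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks that the judgments are internally consistent. A minimal sketch follows; the 3 x 3 matrix is illustrative, not the blood-pump study's data.

```python
import numpy as np

# Pairwise comparison matrix: A[i][j] = how much more important criterion i
# is than criterion j on Saaty's 1-9 scale (reciprocal below the diagonal).
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 3.0],
    [1/5.0, 1/3.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                      # normalized priority vector

n = A.shape[0]
lam_max = eigvals[k].real
ci = (lam_max - n) / (n - 1)               # consistency index
ri = 0.58                                  # Saaty's random index for n = 3
cr = ci / ri                               # consistency ratio; < 0.10 is acceptable
```

In a multidisciplinary evaluation like the one described, each party fills in such matrices for the criteria and for the candidate devices under each criterion, and the weights are combined hierarchically into overall device scores.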

  6. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Tawakkol, Shereen M.; Farouk, M.; Elaziz, Omar Abd; Hemdan, A.; Shehata, Mostafa A.

    2014-12-01

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to the ratio subtraction method (RSM) for determination of both drugs in commercial dosage form. The second and third methods are multivariate calibrations: Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines, and the standard curves were found to be linear in the ranges of 10-60 and 2-30 for MOX and HCTZ, respectively, in the EXRSM method, with acceptable mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.

  7. Simple Tools to Facilitate Project Management of a Nursing Research Project.

    PubMed

    Aycock, Dawn M; Clark, Patricia C; Thomas-Seaton, LaTeshia; Lee, Shih-Yu; Moloney, Margaret

    2016-07-01

    Highly organized project management facilitates rigorous study implementation. Research involves gathering large amounts of information that can be overwhelming when organizational strategies are not used. We describe a variety of project management and organizational tools used in different studies that may be particularly useful for novice researchers. The studies were a multisite study of caregivers of stroke survivors, an Internet-based diary study of women with migraines, and a pilot study testing a sleep intervention in mothers of low-birth-weight infants. Project management tools were used to facilitate enrollment, data collection, and access to results. The tools included protocol and eligibility checklists, event calendars, screening and enrollment logs, instrument scoring tables, and data summary sheets. These tools created efficiency, promoted a positive image, minimized errors, and provided researchers with a sense of control. For the studies described, there were no protocol violations, there were minimal missing data, and the integrity of data collection was maintained.

  8. Emerging Imaging Tools for Use with Traumatic Brain Injury Research

    PubMed Central

    Wilde, Elisabeth A.; Tong, Karen A.; Holshouser, Barbara A.

    2012-01-01

    This article identifies emerging neuroimaging measures considered by the inter-agency Pediatric Traumatic Brain Injury (TBI) Neuroimaging Workgroup. This article attempts to address some of the potential uses of more advanced forms of imaging in TBI as well as highlight some of the current considerations and unresolved challenges of using them. We summarize emerging elements likely to gain more widespread use in the coming years, because of 1) their utility in diagnosis, prognosis, and understanding the natural course of degeneration or recovery following TBI, and potential for evaluating treatment strategies; 2) the ability of many centers to acquire these data with scanners and equipment that are readily available in existing clinical and research settings; and 3) advances in software that provide more automated, readily available, and cost-effective analysis methods for large scale data image analysis. These include multi-slice CT, volumetric MRI analysis, susceptibility-weighted imaging (SWI), diffusion tensor imaging (DTI), magnetization transfer imaging (MTI), arterial spin tag labeling (ASL), functional MRI (fMRI), including resting state and connectivity MRI, MR spectroscopy (MRS), and hyperpolarization scanning. However, we also include brief introductions to other specialized forms of advanced imaging that currently do require specialized equipment, for example, single photon emission computed tomography (SPECT), positron emission tomography (PET), electroencephalography (EEG), and magnetoencephalography (MEG)/magnetic source imaging (MSI). Finally, we identify some of the challenges that users of the emerging imaging CDEs may wish to consider, including quality control, performing multi-site and longitudinal imaging studies, and MR scanning in infants and children. PMID:21787167

  9. Tools for Educational Data Mining: A Review

    ERIC Educational Resources Information Center

    Slater, Stefan; Joksimovic, Srecko; Kovanovic, Vitomir; Baker, Ryan S.; Gasevic, Dragan

    2017-01-01

    In recent years, a wide array of tools have emerged for the purposes of conducting educational data mining (EDM) and/or learning analytics (LA) research. In this article, we hope to highlight some of the most widely used, most accessible, and most powerful tools available for the researcher interested in conducting EDM/LA research. We will…

  10. Satellite telemetry: A new tool for wildlife research and management

    USGS Publications Warehouse

    Fancy, Steven G.; Pank, Larry F.; Douglas, David C.; Curby, Catherine H.; Garner, Gerald W.; Amstrup, Steven C.; Regelin, Wayne L.

    1998-01-01

    operation, the UHF (ultra-high frequency) signal failed on three of 32 caribou transmitters and 10 of 36 polar bear transmitters.A geographic information system (GIS) incorporating other databases (e.g., land cover, elevation, slope, aspect, hydrology, ice distribution) was used to analyze and display detailed locational and behavioral data collected via satellite. Examples of GIS applications to research projects using satellite telemetry and examples of detailed movement patterns of caribou and polar bears are presented. This report includes documentation for computer software packages for processing Argos data and presents developments, as of March 1987, in transmitter design, data retrieval using a local user terminal, computer software, and sensor development and calibration.

  11. Using Modern Solid-State Analytical Tools for Investigations of an Advanced Carbon Capture Material: Experiments for the Inorganic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai

    2016-01-01

    A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…

  12. Ethical Review as a Tool for Enhancing Postgraduate Supervision and Research Outcomes in the Creative Arts

    ERIC Educational Resources Information Center

    Romano, Angela

    2016-01-01

    This article outlines the potential for Research Higher Degree (RHD) supervisors at universities and similar institutions to use ethical review as a constructive, dynamic tool in guiding RHD students in the timely completion of effective, innovative research projects. Ethical review involves a bureaucratized process for checking that researchers…

  13. Capacity-to-Consent in Psychiatric Research: Development and Preliminary Testing of a Screening Tool

    ERIC Educational Resources Information Center

    Zayas, Luis H.; Cabassa, Leopoldo J.; Perez, M. Carmela

    2005-01-01

    Objective: Assuring research participants' capacity to provide informed consent has become increasingly important in health and mental health research, and each study faces unique capacity-assessment challenges, possibly requiring its own screening tool. This article describes the development and preliminary testing of a capacity-to-consent tool…

  14. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    ERIC Educational Resources Information Center

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  15. Toward a quality guide to facilitate the transference of analytical methods from research to testing laboratories: a case study.

    PubMed

    Bisetty, Krishna; Gumede, Njabulo Joyfull; Escuder-Gilabert, Laura; Sagrado, Salvador

    2009-01-01

    At present, there is no single viewpoint that defines QA strategies in analytical chemistry. On the other hand, there are no unique protocols defining a set of analytical tasks and decision criteria to be performed during the method development phase (e.g., by a single research laboratory) in order to facilitate the transference to the testing laboratories intending to adapt, validate, and routinely use this method. This study proposes general criteria, a priori valid for any developed method, recommended as a provisional quality guide containing the minimum internal tasks necessary to publish new analytical method results. As an application, the selection of some basic internal quality tasks and the corresponding accepted criteria are adapted to a concrete case study: indirect differential pulse polarographic determination of nitrate in water samples according to European Commission requisites. Extra tasks to be performed by testing laboratories are also outlined.

  16. The ABCs of incentive-based treatment in health care: a behavior analytic framework to inform research and practice

    PubMed Central

    Meredith, Steven E; Jarvis, Brantley P; Raiff, Bethany R; Rojewski, Alana M; Kurti, Allison; Cassidy, Rachel N; Erb, Philip; Sy, Jolene R; Dallery, Jesse

    2014-01-01

    Behavior plays an important role in health promotion. Exercise, smoking cessation, medication adherence, and other healthy behavior can help prevent, or even treat, some diseases. Consequently, interventions that promote healthy behavior have become increasingly common in health care settings. Many of these interventions award incentives contingent upon preventive health-related behavior. Incentive-based interventions vary considerably along several dimensions, including who is targeted in the intervention, which behavior is targeted, and what type of incentive is used. More research on the quantitative and qualitative features of many of these variables is still needed to inform treatment. However, extensive literature on basic and applied behavior analytic research is currently available to help guide the study and practice of incentive-based treatment in health care. In this integrated review, we discuss how behavior analytic research and theory can help treatment providers design and implement incentive-based interventions that promote healthy behavior. PMID:24672264

  17. In situ protein secondary structure determination in ice: Raman spectroscopy-based process analytical tool for frozen storage of biopharmaceuticals.

    PubMed

    Roessl, Ulrich; Leitgeb, Stefan; Pieters, Sigrid; De Beer, Thomas; Nidetzky, Bernd

    2014-08-01

    A Raman spectroscopy-based method for in situ monitoring of secondary structural composition of proteins during frozen and thawed storage was developed. A set of reference proteins with different α-helix and β-sheet compositions was used for calibration and validation in a chemometric approach. Reference secondary structures were quantified with circular dichroism spectroscopy in the liquid state. Partial least squares regression models were established that enable estimation of secondary structure content from Raman spectra. Quantitative secondary structure determination in ice was accomplished for the first time and correlation with existing (qualitative) protein structural data from the frozen state was achieved. The method can be used in the presence of common stabilizing agents and is applicable in an industrial freezer setup. Raman spectroscopy represents a powerful, noninvasive, and flexibly applicable tool for protein stability monitoring during frozen storage.

  18. [Tools to enhance the quality and transparency of health research reports: reporting guidelines].

    PubMed

    Galvão, Taís Freire; Silva, Marcus Tolentino; Garcia, Leila Posenato

    2016-01-01

    Scientific writing is the cornerstone of publishing research results. Reporting guidelines are important tools for all those involved in the process of research production and report writing. These guidelines detail what is expected to be found in each section of a report for a given study design. The EQUATOR Network (Enhancing the QUAlity and Transparency Of health Research) is an international initiative that seeks to improve the reliability and the value of health research literature by promoting transparent and accurate reporting and wider use of robust reporting guidelines. The use of reporting guidelines has contributed to improved reports as well as increased quality of research methods. Reporting guidelines need to be publicized in order to increase knowledge about these essential tools among health researchers. Encouraging their use by journals is key to enhancing the quality of scientific publications.

  19. Dynamic 3D visual analytic tools: a method for maintaining situational awareness during high tempo warfare or mass casualty operations

    NASA Astrophysics Data System (ADS)

    Lizotte, Todd E.

    2010-04-01

    Maintaining Situational Awareness (SA) is crucial to the success of high tempo operations, such as war fighting and mass casualty events (bioterrorism, natural disasters). Modern computer and software applications attempt to provide command and control managers with situational awareness via the collection, integration, interrogation and display of vast amounts of analytic data in real-time from a multitude of data sources and formats [1]. At what point do the data volume and displays begin to erode the hierarchical distributive intelligence, command and control structure of the operation taking place? In many cases, people tasked with making decisions have insufficient experience in SA of high tempo operations and are easily overwhelmed as vast amounts of data are displayed in real-time while an operation unfolds. In these situations, where data is plentiful and its relevance changes rapidly, individuals may fixate on the data sources with which they are most familiar. Individuals who fall into this pitfall will exclude other data that might be just as important to the success of the operation. To counter these issues, it is important that computer and software applications provide a means of prompting users to take notice of adverse conditions or trends that are critical to the operation. This paper discusses a new method of displaying data, called a Crisis View™, that monitors dynamically changing critical variables and allows preset thresholds to be created that prompt the user when decisions need to be made and when adverse or positive trends are detected. The new method is explained in basic terms, with examples of its attributes and how it can be implemented.
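
    The threshold-and-trend prompting described in this abstract can be sketched generically. The following is a minimal, hypothetical illustration (not the Crisis View™ implementation, whose details are not given in the record): a monitor that flags preset threshold crossings and a sustained rising trend in one dynamically changing variable.

```python
from collections import deque


class ThresholdMonitor:
    """Watch one critical variable; flag threshold crossings and sustained trends."""

    def __init__(self, low, high, trend_window=5):
        self.low, self.high = low, high
        # Keep only the most recent values for trend detection.
        self.history = deque(maxlen=trend_window)

    def update(self, value):
        """Record a new reading and return any alerts it triggers."""
        self.history.append(value)
        alerts = []
        if value < self.low:
            alerts.append("below-threshold")
        elif value > self.high:
            alerts.append("above-threshold")
        h = list(self.history)
        # A strictly increasing full window counts as a rising trend.
        if len(h) == self.history.maxlen and all(a < b for a, b in zip(h, h[1:])):
            alerts.append("rising-trend")
        return alerts
```

    In a display layer, a non-empty alert list would drive the user prompt; the threshold values and window length here are placeholders for operator-configured presets.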

  20. Production Workers' Literacy and Numeracy Practices: Using Cultural-Historical Activity Theory (CHAT) as an Analytical Tool

    ERIC Educational Resources Information Center

    Yasukawa, Keiko; Brown, Tony; Black, Stephen

    2013-01-01

    Public policy discourses claim that there is a "crisis" in the literacy and numeracy levels of the Australian workforce. In this paper, we propose a methodology for examining this "crisis" from a critical perspective. We draw on findings from an ongoing research project by the authors which investigates production workers'…

  1. Research on the relationship between the working capacity of hard-alloy cutting tools and the fractal dimension of their wear

    NASA Astrophysics Data System (ADS)

    Arefiev, K.; Nesterenko, V.; Daneykina, N.

    2016-06-01

    The paper presents research on the relationship between the wear resistance of hard-alloy cutting tools of the K application group and the fractal dimension of the wear surface formed on the flank face of the cutting edge when machining materials with high adhesive activity. It has been established that the wear resistance of the tested cutting tool samples increases with the fractal dimension of their wear surface.

  2. Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Bartels, R. E.

    2008-01-01

    NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.

  3. A Tool for Measuring NASA's Aeronautics Research Progress Toward Planned Strategic Community Outcomes

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    A description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. For efficiency and speed, the tool takes advantage of a function developed in Excel's Visual Basic for Applications. The strategic planning process for determining the community Outcomes is also briefly discussed. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples of using the tool are also presented.

  4. Interactive Data Visualization for HIV Cohorts: Leveraging Data Exchange Standards to Share and Reuse Research Tools

    PubMed Central

    Blevins, Meridith; Wehbe, Firas H.; Rebeiro, Peter F.; Caro-Vega, Yanink; McGowan, Catherine C.; Shepherd, Bryan E.

    2016-01-01

    Objective To develop and disseminate tools for interactive visualization of HIV cohort data. Design and Methods If a picture is worth a thousand words, then an interactive video, composed of a long string of pictures, can produce an even richer presentation of HIV population dynamics. We developed an HIV cohort data visualization tool using open-source software (R statistical language). The tool requires that the data structure conform to the HIV Cohort Data Exchange Protocol (HICDEP), and our implementation utilized Caribbean, Central and South America network (CCASAnet) data. Results This tool currently presents patient-level data in three classes of plots: (1) Longitudinal plots showing changes in measurements viewed alongside event probability curves allowing for simultaneous inspection of outcomes by relevant patient classes. (2) Bubble plots showing changes in indicators over time allowing for observation of group level dynamics. (3) Heat maps of levels of indicators changing over time allowing for observation of spatial-temporal dynamics. Examples of each class of plot are given using CCASAnet data investigating trends in CD4 count and AIDS at antiretroviral therapy (ART) initiation, CD4 trajectories after ART initiation, and mortality. Conclusions We invite researchers interested in this data visualization effort to use these tools and to suggest new classes of data visualization. We aim to contribute additional shareable tools in the spirit of open scientific collaboration and hope that these tools further the participation in open data standards like HICDEP by the HIV research community. PMID:26963255

  5. Near infra red spectroscopy as a multivariate process analytical tool for predicting pharmaceutical co-crystal concentration.

    PubMed

    Wood, Clive; Alwati, Abdolati; Halsey, Sheelagh; Gough, Tim; Brown, Elaine; Kelly, Adrian; Paradkar, Anant

    2016-09-10

    The use of near infrared spectroscopy to predict the concentration of two pharmaceutical co-crystals, 1:1 ibuprofen-nicotinamide (IBU-NIC) and 1:1 carbamazepine-nicotinamide (CBZ-NIC), has been evaluated. A partial least squares (PLS) regression model was developed for both co-crystal pairs using sets of standard samples to create calibration and validation data sets with which to build and validate the models. Parameters such as the root mean square error of calibration (RMSEC), root mean square error of prediction (RMSEP) and correlation coefficient were used to assess the accuracy and linearity of the models. Accurate PLS regression models were created for both co-crystal pairs which can be used to predict the co-crystal concentration in a powder mixture of the co-crystal and the active pharmaceutical ingredient (API). The IBU-NIC model had smaller errors than the CBZ-NIC model, possibly due to the complex CBZ-NIC spectra, which could reflect the different arrangement of hydrogen bonding associated with the CBZ-NIC co-crystal compared to the IBU-NIC co-crystal. These results suggest that NIR spectroscopy can be used as a PAT tool during a variety of pharmaceutical co-crystal manufacturing methods, and the presented data will facilitate future offline and in-line NIR studies involving pharmaceutical co-crystals.
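
    The error metrics named in this record are standard chemometric quantities. Below is a minimal sketch of how RMSEC and RMSEP are computed from reference and model-predicted concentrations; the numeric values are hypothetical and not taken from the study.

```python
import numpy as np


def rmse(y_true, y_pred):
    """Root mean square error between reference and predicted concentrations."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))


# Hypothetical co-crystal concentrations (% w/w) for illustration only.
cal_ref, cal_pred = [10, 20, 30, 40, 50], [10.4, 19.5, 30.8, 39.2, 50.6]
val_ref, val_pred = [15, 25, 35, 45], [14.2, 26.1, 34.3, 45.9]

rmsec = rmse(cal_ref, cal_pred)                    # error over the calibration set
rmsep = rmse(val_ref, val_pred)                    # error over the validation set
r = float(np.corrcoef(val_ref, val_pred)[0, 1])    # linearity check
```

    In practice the predictions would come from a fitted PLS model (e.g. scikit-learn's `PLSRegression`) applied to the NIR spectra; only the metric computation is shown here.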

  6. From Data to Knowledge – Promising Analytical Tools and Techniques for Capture and Reuse of Corporate Knowledge and to Aid in the State Evaluation Process

    SciTech Connect

    Danielson, Gary R.; Augustenborg, Elsa C.; Beck, Andrew E.

    2010-10-29

    The IAEA is challenged with limited availability of human resources for inspection and data analysis while proliferation threats increase. PNNL has a variety of IT solutions and techniques (at varying levels of maturity and development) that take raw data closer to useful knowledge, thereby assisting with and standardizing the analytical processes. This paper highlights some PNNL tools and techniques which are applicable to the international safeguards community, including: • Intelligent in-situ triage of data prior to reliable transmission to an analysis center resulting in the transmission of smaller and more relevant data sets • Capture of expert knowledge in re-usable search strings tailored to specific mission outcomes • Image based searching fused with text based searching • Use of gaming to discover unexpected proliferation scenarios • Process modeling (e.g. Physical Model) as the basis for an information integration portal, which links to data storage locations along with analyst annotations, categorizations, geographic data, search strings and visualization outputs.

  7. An emerging micro-scale immuno-analytical diagnostic tool to see the unseen. Holding promise for precision medicine and P4 medicine.

    PubMed

    Guzman, Norberto A; Guzman, Daniel E

    2016-05-15

    Over the years, analytical chemistry and immunology have contributed significantly to the field of clinical diagnosis by introducing quantitative techniques that can detect crucial and distinct chemical, biochemical and cellular biomarkers present in biosamples. Currently, quantitative two-dimensional hybrid immuno-analytical separation technologies are emerging as powerful tools for the sequential isolation, separation and detection of protein panels, including those with subtle structural changes such as variants, isoforms, peptide fragments, and post-translational modifications. One such technique to perform this challenging task is immunoaffinity capillary electrophoresis (IACE), which combines the use of antibodies and/or other affinity ligands as highly selective capture agents with the superior resolving power of capillary electrophoresis. Since affinity ligands can be polyreactive, i.e., binding and capturing more than one molecule, they may generate false positive results when tested under mono-dimensional procedures; one such application is enzyme-linked immunosorbent assay (ELISA). IACE, on the other hand, is a two-dimensional technique that captures (isolation and enrichment), releases, separates and detects (quantification, identification and characterization) a single analyte or a panel of analytes from a sample, when coupled to one or more detectors simultaneously, without producing false positive or false negative data. This disruptive technique, capable of on-line preconcentration resulting in enhanced sensitivity even in the analysis of complex matrices, may change the traditional system of testing biomarkers to obtain more accurate diagnosis of diseases, ideally before symptoms of a specific disease manifest.
In this manuscript, we will present examples of the determination of biomarkers by IACE and the design of a miniaturized multi-dimensional IACE apparatus capable of improved sensitivity, specificity and throughput, with the potential of being used

  8. Immersive virtual environment technology: a promising tool for future social and behavioral genomics research and practice.

    PubMed

    Persky, Susan; McBride, Colleen M

    2009-12-01

    Social and behavioral research needs to get started now if scientists are to direct genomic discoveries to address pressing public health problems. Advancing social and behavioral science will require innovative and rigorous communication methodologies that move researchers beyond reliance on traditional tools and their inherent limitations. One such emerging research tool is immersive virtual environment technology (virtual reality), a methodology that gives researchers the ability to maintain high experimental control and mundane realism of scenarios; portray and manipulate complex, abstract objects and concepts; and implement innovative implicit behavioral measurement. This report suggests the role that immersive virtual environment technology can play in furthering future research in genomics-related education, decision making, test intentions, behavior change, and health-care provider behaviors. Practical implementation and challenges are also discussed.

  9. Basics, common errors and essentials of statistical tools and techniques in anesthesiology research.

    PubMed

    Bajwa, Sukhminder Jit Singh

    2015-01-01

    The statistical portion is a vital component of any research study. Research methodology and the application of statistical tools and techniques have evolved over the years and have significantly helped research activities throughout the globe. Accurate results and inferences are not possible without proper validation with various statistical tools and tests. Evidence-based anesthesia research and practice has to incorporate statistical tools in the methodology right from the planning stage of the study itself. Though the medical fraternity is well acquainted with the significance of statistics in research, there is a lack of in-depth knowledge of the various statistical concepts and principles among the majority of researchers. The clinical impact and consequences can be serious, as incorrect analysis, conclusions, and false results may construct an artificial platform on which future research activities are replicated. The present tutorial is an attempt to make anesthesiologists aware of the various aspects of statistical methods used in evidence-based research and to highlight the common areas where the most statistical errors are committed, so that better statistical practices can be adopted.

  10. Validation of the 4D NCAT simulation tools for use in high-resolution x-ray CT research

    NASA Astrophysics Data System (ADS)

    Segars, W. P.; Mahesh, Mahadevappa; Beck, T.; Frey, E. C.; Tsui, B. M. W.

    2005-04-01

    We validate the computer-based simulation tools developed in our laboratory for use in high-resolution CT research. The 4D NURBS-based cardiac-torso (NCAT) phantom was developed to provide a realistic and flexible model of the human anatomy and physiology. Unlike current phantoms in CT, the 4D NCAT has the advantage, due to its design, that its organ shapes can be changed to realistically model anatomical variations and patient motion. To efficiently simulate high-resolution CT images, we developed a unique analytic projection algorithm (including scatter and quantum noise) to accurately calculate projections directly from the surface definition of the phantom given parameters defining the CT scanner and geometry. The projection data are reconstructed into CT images using algorithms developed in our laboratory. The 4D NCAT phantom contains a level of detail that is close to impossible to produce in a physical test object. We, therefore, validate our CT simulation tools and methods through a series of direct comparisons with data obtained experimentally using existing, simple physical phantoms at different doses and using different x-ray energy spectra. In each case, the first-order simulations were found to produce comparable results (<12%). We reason that since the simulations produced equivalent results using simple test objects, they should be able to do the same in more anatomically realistic conditions. We conclude that, with the ability to provide realistic simulated CT image data close to that from actual patients, the simulation tools developed in this work will have applications in a broad range of CT imaging research.

  11. About Skinner and Time: Behavior-Analytic Contributions to Research on Animal Timing

    ERIC Educational Resources Information Center

    Lejeune, Helga; Richelle, Marc; Wearden, J. H.

    2006-01-01

    The article discusses two important influences of B. F. Skinner, and later workers in the behavior-analytic tradition, on the study of animal timing. The first influence is methodological, and is traced from the invention of schedules imposing temporal constraints or periodicities on animals in "The Behavior of Organisms," through the rate…

  12. Incorporating Students' Self-Designed, Research-Based Analytical Chemistry Projects into the Instrumentation Curriculum

    ERIC Educational Resources Information Center

    Gao, Ruomei

    2015-01-01

    In a typical chemistry instrumentation laboratory, students learn analytical techniques through a well-developed procedure. Such an approach, however, does not engage students in a creative endeavor. To foster the intrinsic motivation of students' desire to learn, improve their confidence in self-directed learning activities and enhance their…

  13. ANALYTIC ELEMENT GROUND WATER MODELING AS A RESEARCH PROGRAM (1980-2006)

    EPA Science Inventory

    Scientists and engineers who use the analytic element method (AEM) for solving problems of regional ground water flow may be considered a community, and this community can be studied from the perspective of history and philosophy of science. Applying the methods of the Hungarian...

  14. Smooth Pursuit in Schizophrenia: A Meta-Analytic Review of Research since 1993

    ERIC Educational Resources Information Center

    O'Driscoll, Gillian A.; Callahan, Brandy L.

    2008-01-01

    Abnormal smooth pursuit eye-tracking is one of the most replicated deficits in the psychophysiological literature in schizophrenia [Levy, D. L., Holzman, P. S., Matthysse, S., & Mendell, N. R. (1993). "Eye tracking dysfunction and schizophrenia: A critical perspective." "Schizophrenia Bulletin, 19", 461-505]. We used meta-analytic procedures to…

  15. ENVIRONMENTAL RESEARCH BRIEF : ANALYTIC ELEMENT MODELING OF GROUND-WATER FLOW AND HIGH PERFORMANCE COMPUTING

    EPA Science Inventory

    Several advances in the analytic element method have been made to enhance its performance and facilitate three-dimensional ground-water flow modeling in a regional aquifer setting. First, a new public domain modular code (ModAEM) has been developed for modeling ground-water flow ...

  16. Scientific Mobility and International Research Networks: Trends and Policy Tools for Promoting Research Excellence and Capacity Building

    ERIC Educational Resources Information Center

    Jacob, Merle; Meek, V. Lynn

    2013-01-01

    One of the ways in which globalization is manifesting itself in higher education and research is through the increasing importance and emphasis on scientific mobility. This article seeks to provide an overview and analysis of current trends and policy tools for promoting mobility. The article argues that the mobility of scientific labour is an…

  17. [Analysis of researchers' implication in a research-intervention in the Stork Network: a tool for institutional analysis].

    PubMed

    Fortuna, Cinira Magali; Mesquita, Luana Pinho de; Matumoto, Silvia; Monceau, Gilles

    2016-09-19

    This qualitative study is based on institutional analysis as the methodological theoretical reference with the objective of analyzing researchers' implication during a research-intervention and the interferences caused by this analysis. The study involved researchers from courses in medicine, nursing, and dentistry at two universities and workers from a Regional Health Department in follow-up on the implementation of the Stork Network in São Paulo State, Brazil. The researchers worked together in the intervention and in analysis workshops, supported by an external institutional analysis. Two institutions stood out in the analysis: the research, established mainly with characteristics of neutrality, and management, with Taylorist characteristics. Differences between researchers and difficulties in identifying actions proper to network management and research were some of the interferences that were identified. The study concludes that implication analysis is a powerful tool for such studies.

  18. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    PubMed

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage© guides users through the collection of their family health history by relative, generates a pedigree, completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine.

  19. Direct analysis in real time mass spectrometry, a process analytical technology tool for real-time process monitoring in botanical drug manufacturing.

    PubMed

    Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin

    2014-03-01

    A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed with a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R(2) and Q(2). The accuracy and diagnostic capability of the batch model were then validated on the remaining batches. Assisted by high performance liquid chromatography (HPLC) determination, process faults were explained by the corresponding variable contributions. Furthermore, a batch-level model was developed to compare and assess model performance. The present study demonstrates that DART-MS is very promising for process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers a particular account of effective compositions and can potentially be used to improve batch quality and process consistency of samples in complex matrices.

  20. Research-tool patents: issues for health in the developing world.

    PubMed Central

    Barton, John H.

    2002-01-01

    The patent system is now reaching into the tools of medical research, including gene sequences themselves. Many of the new patents can potentially preempt large areas of medical research and lay down legal barriers to the development of a broad category of products. Researchers must therefore consider redesigning their research to avoid use of patented techniques, or expending the effort to obtain licences from those who hold the patents. Even if total licence fees can be kept low, there are enormous negotiation costs, and one "hold-out" may be enough to lead to project cancellation. This is making it more difficult to conduct research within the developed world, and poses important questions for the future of medical research for the benefit of the developing world. Probably the most important implication for health in the developing world is the possible general slowing down and complication of medical research. To the extent that these patents do slow down research, they weaken the contribution of the global research community to the creation and application of medical technology for the benefit of developing nations. The patents may also complicate the granting of concessional prices to developing nations - for pharmaceutical firms that seek to offer a concessional price may have to negotiate arrangements with research-tool firms, which may lose royalties as a result. Three kinds of response are plausible. One is to develop a broad or global licence to permit the patented technologies to be used for important applications in the developing world. The second is to change technical patent law doctrines. Such changes could be implemented in developed and developing nations and could be quite helpful while remaining consistent with TRIPS. The third is to negotiate specific licence arrangements, under which specific research tools are used on an agreed basis for specific applications. These negotiations are difficult and expensive, requiring both scientific and

  1. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    SciTech Connect

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g., attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
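The definition above (risk = probability of a successful attack times the value of the loss) and the attack-tree engine the abstract mentions can be sketched in a few lines. Everything below is illustrative, not the prototype tool: the tree structure, probabilities, and loss value are invented.

```python
# Illustrative sketch of attack-tree risk roll-up (all numbers assumed).

def or_node(*p):
    """Probability that at least one child attack step succeeds."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_node(*p):
    """Probability that every child attack step succeeds (independence assumed)."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

# Hypothetical control-system attack tree: breach the network AND compromise the
# HMI, where the network can be breached via phishing OR an exposed service.
p_breach = or_node(0.30, 0.10)          # phishing, exposed service
p_attack = and_node(p_breach, 0.20)     # then compromise the HMI
loss = 5_000_000                        # consequence in dollars (assumed)

risk = p_attack * loss
# A mitigation that hardens the exposed service lowers that branch to 0.01:
risk_after = and_node(or_node(0.30, 0.01), 0.20) * loss
print(risk, risk_after, risk - risk_after)
```

The difference `risk - risk_after` is the quantitative risk reduction an owner would weigh against the mitigation's cost.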

  2. Dietary MicroRNA Database (DMD): An Archive Database and Analytic Tool for Food-Borne microRNAs.

    PubMed

    Chiang, Kevin; Shu, Jiang; Zempleni, Janos; Cui, Juan

    2015-01-01

    With the advent of high throughput technology, a huge amount of microRNA information has been added to the growing body of knowledge for non-coding RNAs. Here we present the Dietary MicroRNA Database (DMD), the first repository for archiving and analyzing the published and novel microRNAs discovered in dietary resources. Currently there are fifteen types of dietary species, such as apple, grape, cow milk, and cow fat, included in the database originating from 9 plant and 5 animal species. Annotation for each entry, a mature microRNA indexed as DM0000*, covers information of the mature sequences, genome locations, hairpin structures of parental pre-microRNAs, cross-species sequence comparison, disease relevance, and the experimentally validated gene targets. Furthermore, a few functional analyses including target prediction, pathway enrichment and gene network construction have been integrated into the system, enabling users to generate functional insights through viewing the functional pathways and building protein-protein interaction networks associated with each microRNA. Another unique feature of DMD is that it provides a feature generator where a total of 411 descriptive attributes can be calculated for any given microRNA based on its sequence and structure. DMD would be particularly useful for research groups studying microRNA regulation from a nutrition point of view. The database can be accessed at http://sbbi.unl.edu/dmd/.
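The feature-generator idea lends itself to a small sketch. The paper's 411 attributes are not enumerated in the abstract, so the function below computes only a few common sequence descriptors; it is an illustration, not DMD's actual code.

```python
# Hedged sketch of a sequence feature generator: length, GC content, and base
# frequencies for a mature microRNA sequence (a tiny subset of what a real
# 411-attribute generator would produce).

from collections import Counter

def sequence_features(seq):
    seq = seq.upper().replace("T", "U")   # accept DNA-style input
    counts = Counter(seq)
    n = len(seq)
    return {
        "length": n,
        "gc_content": (counts["G"] + counts["C"]) / n,
        **{f"freq_{b}": counts[b] / n for b in "ACGU"},
    }

# Example 22-nt sequence (for illustration only, not a DMD entry).
feats = sequence_features("UGAGGUAGUAGGUUGUAUAGUU")
print(feats["length"], round(feats["gc_content"], 3))
```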

  3. Dietary MicroRNA Database (DMD): An Archive Database and Analytic Tool for Food-Borne microRNAs

    PubMed Central

    Chiang, Kevin; Shu, Jiang; Zempleni, Janos; Cui, Juan

    2015-01-01

    With the advent of high throughput technology, a huge amount of microRNA information has been added to the growing body of knowledge for non-coding RNAs. Here we present the Dietary MicroRNA Database (DMD), the first repository for archiving and analyzing the published and novel microRNAs discovered in dietary resources. Currently there are fifteen types of dietary species, such as apple, grape, cow milk, and cow fat, included in the database originating from 9 plant and 5 animal species. Annotation for each entry, a mature microRNA indexed as DM0000*, covers information of the mature sequences, genome locations, hairpin structures of parental pre-microRNAs, cross-species sequence comparison, disease relevance, and the experimentally validated gene targets. Furthermore, a few functional analyses including target prediction, pathway enrichment and gene network construction have been integrated into the system, enabling users to generate functional insights through viewing the functional pathways and building protein-protein interaction networks associated with each microRNA. Another unique feature of DMD is that it provides a feature generator where a total of 411 descriptive attributes can be calculated for any given microRNA based on its sequence and structure. DMD would be particularly useful for research groups studying microRNA regulation from a nutrition point of view. The database can be accessed at http://sbbi.unl.edu/dmd/. PMID:26030752

  4. The Solid Earth Research and Teaching Environment, a new software framework to share research tools in the classroom and across disciplines

    NASA Astrophysics Data System (ADS)

    Milner, K.; Becker, T. W.; Boschi, L.; Sain, J.; Schorlemmer, D.; Waterhouse, H.

    2009-12-01

    The Solid Earth Teaching and Research Environment (SEATREE) is a modular and user-friendly software framework to facilitate the use of solid Earth research tools in the classroom and for interdisciplinary research collaboration. SEATREE is open source and community developed, distributed freely under the GNU General Public License. It is a fully contained package that lets users operate in a graphical mode, while giving more advanced users the opportunity to view and modify the source code. Top-level graphical user interfaces, which initiate the calculations and visualize results, are written in the Python programming language using an object-oriented, modern design. Results are plotted with either Matlab-like Python libraries or SEATREE’s own Generic Mapping Tools wrapper. The underlying computational codes used to produce the results can be written in any programming language and are accessed through Python wrappers. There are currently four fully developed science modules for SEATREE: (1) HC is a global geodynamics tool built around a semi-analytical mantle-circulation program based on work by B. Steinberger, Becker, and C. O'Neill. HC can compute velocities and tractions for global, spherical Stokes flow with radial viscosity variations. HC is fast enough to be used for classroom instruction, for example to let students interactively explore the role of radial viscosity variations for global geopotential (geoid) anomalies. (2) ConMan wraps Scott King’s 2D finite element mantle convection code, allowing users to quickly observe how modifications to input parameters affect heat flow over time. As seismology modules, SEATREE includes, (3), Larry, a global, surface wave phase-velocity inversion tool and, (4), Syn2D, a Cartesian tomography teaching tool for ray-theory wave propagation in synthetic, arbitrary velocity structure in the presence of noise. Both underlying programs were contributed by Boschi. Using Syn2D, students can explore, for example, how well a given
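The wrapper pattern described above (Python front end driving computational codes written in any language) can be sketched minimally. This is an assumed illustration of the pattern, not SEATREE's actual API.

```python
import os
import subprocess
import tempfile

# Assumed sketch of the wrapper pattern: a Python class writes an input file,
# runs an external solver via subprocess, and parses its text output, so the
# GUI layer never touches the solver directly.

class SolverWrapper:
    """Drive a command-line code: write input, run it, parse 'name value' lines."""

    def __init__(self, executable):
        self.executable = executable

    def run(self, parameters):
        with tempfile.TemporaryDirectory() as workdir:
            infile = os.path.join(workdir, "input.txt")
            with open(infile, "w") as f:
                for key, value in parameters.items():
                    f.write(f"{key} {value}\n")
            result = subprocess.run([self.executable, infile],
                                    capture_output=True, text=True, check=True)
        # Assume the solver prints one "name value" pair per line.
        return {k: float(v) for k, v in
                (line.split() for line in result.stdout.splitlines() if line)}

# Demo: with Unix `cat` standing in for a solver, the input file is echoed back
# verbatim and parses into the same parameters.
demo = SolverWrapper("cat").run({"viscosity": 1e21, "layers": 4})
print(demo)
```

A real module would swap `cat` for a compiled Fortran or C executable and add result plotting on top.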

  5. CorRECTreatment: A Web-based Decision Support Tool for Rectal Cancer Treatment that Uses the Analytic Hierarchy Process and Decision Tree

    PubMed Central

    Karakülah, G.; Dicle, O.; Sökmen, S.; Çelikoğlu, C.C.

    2015-01-01

    Summary Background The selection of appropriate rectal cancer treatment is a complex multi-criteria decision making process, in which clinical decision support systems might be used to assist and enrich physicians’ decision making. Objective The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. Methods The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priority of criteria, with a decision tree formed using these priorities, was updated and applied to 388 patients’ data collected retrospectively. A web-based decision support tool named corRECTreatment was then developed. The compatibility of the treatment recommendations by expert opinion and by the decision support tool was examined for consistency. Two surgeons were asked to recommend a treatment and an overall survival value for the treatment in 20 different cases that we selected and turned into scenarios from among the most common and rare treatment options in the patient data set. Results In the AHP analyses of the criteria, it was found that the matrices generated for both decision steps were consistent (consistency ratio < 0.1). Depending on the decisions of the experts, the consistency value for the most frequent cases was found to be 80% for the first decision step and 100% for the second decision step. Similarly, for rare cases consistency was 50% for the first decision step and 80% for the second decision step. Conclusions The decision model and corRECTreatment, developed by applying these on real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and facilitate them in making projections about treatment options
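The AHP step named above (criterion priorities from pairwise comparisons, checked with a consistency ratio below 0.1) follows Saaty's standard method and can be sketched directly. The comparison matrix below is hypothetical, not from the paper's 8- and 10-criterion models.

```python
import numpy as np

# Standard Saaty AHP computation: weights from the principal eigenvector of a
# reciprocal pairwise-comparison matrix, plus the consistency ratio (CR < 0.1
# is the usual acceptance threshold).

RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}

def ahp_weights(matrix):
    """Return (weights, consistency_ratio) for an n x n reciprocal matrix."""
    a = np.asarray(matrix, dtype=float)
    n = a.shape[0]
    eigvals, eigvecs = np.linalg.eig(a)
    k = int(np.argmax(eigvals.real))
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)          # consistency index
    ri = RANDOM_INDEX[n]                  # Saaty's random index
    return weights, (ci / ri if ri else 0.0)

# Hypothetical 3-criterion comparison for illustration.
weights, cr = ahp_weights([[1, 3, 5],
                           [1 / 3, 1, 3],
                           [1 / 5, 1 / 3, 1]])
print(np.round(weights, 3), round(cr, 3))
```

The resulting weights would then order the splits of the downstream decision tree, as the abstract describes.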

  6. The Research-Teaching Nexus: Using a Construction Teaching Event as a Research Tool

    ERIC Educational Resources Information Center

    Casanovas-Rubio, Maria del Mar; Ahearn, Alison; Ramos, Gonzalo; Popo-Ola, Sunday

    2016-01-01

    In principle, the research-teaching nexus should be seen as a two-way link, showing not only ways in which research supports teaching but also ways in which teaching supports research. In reality, the discussion has been limited almost entirely to the first of these practices. This paper presents a case study in which some student field-trip…

  7. Dancing on the Grid: using e-Science tools to extend choreographic research.

    PubMed

    Bailey, Helen; Bachler, Michelle; Buckingham Shum, Simon; Le Blanc, Anja; Popat, Sita; Rowley, Andrew; Turner, Martin

    2009-07-13

    This paper considers the role and impact of new and emerging e-Science tools on practice-led research in dance. Specifically, it draws on findings from the e-Dance project. This two-year project brings together an interdisciplinary team combining research in choreography, next-generation videoconferencing, and human-computer interaction analysis, incorporating hypermedia and nonlinear annotations for recording and documentation.

  8. Improving the Usefulness of Concept Maps as a Research Tool for Science Education

    ERIC Educational Resources Information Center

    Van Zele, Els; Lenaerts, Josephina; Wieme, Willem

    2004-01-01

    The search for authentic science research tools to evaluate student understanding in a hybrid learning environment with a large multimedia component has resulted in the use of concept maps as a representation of student's knowledge organization. One hundred and seventy third-semester introductory university-level engineering students represented…

  9. A Portfolio Analysis Tool for Measuring NASA's Aeronautics Research Progress toward Planned Strategic Outcomes

    NASA Technical Reports Server (NTRS)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.

  10. Community College Fundraising: The Voluntary Support of Education Survey as a Sampling Tool for Research

    ERIC Educational Resources Information Center

    Wagoner, Richard L.; Besikof, Rudolph J.

    2011-01-01

    This article describes the Voluntary Support for Education (VSE) Survey, an instrument created by the Council for Aid to Education. Our objective is to explain VSE's potential value as a tool to inform both institutional and academic research regarding fund-raising activities at community colleges. Of particular interest is how the data available…

  11. Family Myths, Beliefs, and Customs as a Research/Educational Tool to Explore Identity Formation

    ERIC Educational Resources Information Center

    Herman, William E.

    2008-01-01

    This paper outlines a qualitative research tool designed to explore personal identity formation as described by Erik Erikson and offers self-reflective and anonymous evaluative comments made by college students after completing this task. Subjects compiled a list of 200 myths, customs, fables, rituals, and beliefs from their family of origin and…

  12. Basic Reference Tools for Nursing Research. A Workbook with Explanations and Examples.

    ERIC Educational Resources Information Center

    Smalley, Topsy N.

    This workbook is designed to introduce nursing students to basic concepts and skills needed for searching the literatures of medicine, nursing, and allied health areas for materials relevant to specific information needs. The workbook introduces the following research tools: (1) the National Library of Medicine's MEDLINE searches, including a…

  13. Qualitative and Quantitative Management Tools Used by Financial Officers in Public Research Universities

    ERIC Educational Resources Information Center

    Trexler, Grant Lewis

    2012-01-01

    This dissertation set out to identify effective qualitative and quantitative management tools used by financial officers (CFOs) in carrying out their management functions of planning, decision making, organizing, staffing, communicating, motivating, leading and controlling at a public research university. In addition, impediments to the use of…

  14. Handbook of Research on Technology Tools for Real-World Skill Development (2 Volumes)

    ERIC Educational Resources Information Center

    Rosen, Yigel, Ed.; Ferrara, Steve, Ed.; Mosharraf, Maryam, Ed.

    2016-01-01

    Education is expanding to include a stronger focus on the practical application of classroom lessons in an effort to prepare the next generation of scholars for a changing world economy centered on collaborative and problem-solving skills for the digital age. "The Handbook of Research on Technology Tools for Real-World Skill Development"…

  15. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  16. DMPwerkzeug - A tool to support the planning, implementation, and organization of research data management.

    NASA Astrophysics Data System (ADS)

    Klar, Jochen; Engelhardt, Claudia; Neuroth, Heike; Enke, Harry

    2016-04-01

    Following the call to make the results of publicly funded research openly accessible, more and more funding agencies demand the submission of a data management plan (DMP) as part of the application process. These documents specify how the data management of the project is organized and which datasets will be published, and when. Of particular importance for European researchers is the Open Research Data Pilot of Horizon 2020, which requires data management plans for a set of 9 selected research fields from social sciences to nanotechnology. In order to assist researchers in creating these documents, several institutions have developed dedicated software tools. The most well-known are DMPonline by the Digital Curation Centre (DCC) and DMPtool by the California Digital Library (CDL), both extensive and well-received web applications. The core functionality of these tools is the assisted editing of the DMP templates provided by the particular funding agency. While this is certainly helpful, especially in an environment with a plethora of different funding agencies like the UK or the USA, these tools are somewhat limited to this particular task and do not utilise the full potential of DMPs. Beyond the purpose of fulfilling funder requirements, DMPs can be useful for a number of additional tasks. In the initial conception phase of a project, they can be used as a planning tool to determine which data management activities and measures are necessary throughout the research process, to assess which resources are needed, and which institutions (computing centers, libraries, data centers) should be involved. During the project, they can act as a constant reference or guideline for the handling of research data. They also determine where the data will be stored after the project has ended and whether it can be accessed by the public, helping to take into account the resulting requirements of the data center or the actions necessary to ensure re-usability by others from early on. Ideally, a DMP

  17. An experimental and analytical method for approximate determination of the tilt rotor research aircraft rotor/wing download

    NASA Technical Reports Server (NTRS)

    Jordon, D. E.; Patterson, W.; Sandlin, D. R.

    1985-01-01

    The XV-15 Tilt Rotor Research Aircraft download phenomenon was analyzed. This phenomenon is a direct result of the two rotor wakes impinging on the wing upper surface when the aircraft is in the hover configuration. For this study the analysis proceeded along two lines. The first was a method whereby results from actual hover tests of the XV-15 aircraft were combined with drag coefficient results from wind tunnel tests of a wing that was representative of the aircraft wing. The second was an analytical method that modeled the airflow caused by the two rotors. Formulas were developed so that a computer program could be used to calculate the axial velocities; these velocities were then used in conjunction with the aforementioned wind tunnel drag coefficient results to produce download values. An attempt was made to validate the analytical results by modeling a model rotor system for which direct download values were determined.
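The abstract does not reproduce the paper's formulas, so the sketch below uses textbook momentum theory instead: induced velocity v_i = sqrt(T / (2 rho A)), dynamic pressure q = (1/2) rho v^2, and download = q * S * C_D. All numbers are assumed round figures, not the paper's data.

```python
import math

# Textbook momentum-theory sketch (not the paper's method, all numbers assumed):
# estimate the rotor-wake download on the wing in hover.

rho = 1.225                            # air density, kg/m^3 (sea level)
thrust = 58_000.0                      # total rotor thrust, N (assumed hover weight)
disk_area = 2 * math.pi * 3.81**2      # two rotors of ~3.81 m radius, m^2
wing_area = 15.0                       # wing area immersed in the wakes, m^2 (assumed)
cd = 1.2                               # flat-plate-like wing drag coefficient (assumed)

# Induced velocity at the rotor disk, v_i = sqrt(T / (2*rho*A)); the wake speeds
# up toward 2*v_i downstream, so using v_i here is a deliberate low-side choice.
v_induced = math.sqrt(thrust / (2 * rho * disk_area))
q = 0.5 * rho * v_induced**2           # dynamic pressure of the impinging wake
download = q * wing_area * cd          # vertical drag force on the wing, N
fraction = download / thrust           # download as a fraction of thrust
print(round(v_induced, 2), round(fraction, 3))
```

With these assumed inputs the download comes out at a few percent of thrust; the paper's combined experimental-analytical approach refines exactly these velocity and drag-coefficient inputs.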

  18. Policy-Making Theory as an Analytical Framework in Policy Analysis: Implications for Research Design and Professional Advocacy.

    PubMed

    Sheldon, Michael R

    2016-01-01

    Policy studies are a recent addition to the American Physical Therapy Association's Research Agenda and are critical to our understanding of various federal, state, local, and organizational policies on the provision of physical therapist services across the continuum of care. Policy analyses that help to advance the profession's various policy agendas will require relevant theoretical frameworks to be credible. The purpose of this perspective article is to: (1) demonstrate the use of a policy-making theory as an analytical framework in a policy analysis and (2) discuss how sound policy analysis can assist physical therapists in becoming more effective change agents, policy advocates, and partners with other relevant stakeholder groups. An exploratory study of state agency policy responses to address work-related musculoskeletal disorders is provided as a contemporary example to illustrate key points and to demonstrate the importance of selecting a relevant analytical framework based on the context of the policy issue under investigation.

  19. Photovoice as participatory action research tool for engaging people with intellectual disabilities in research and program development.

    PubMed

    Jurkowski, Janine M

    2008-02-01

    People with intellectual disabilities have few opportunities to actively participate in research affecting programs and policies. Employment of participatory action research has been recommended. Although use of this approach with people who have intellectual disabilities is growing, articles on specific participatory research methods are rare. Photovoice is a participatory method often used with underrepresented groups and is effective for engaging people with intellectual disabilities in research or program development. A review of the literature on its use with this population is presented, as is a description of Photovoice as a participatory research tool for engaging people with intellectual disabilities. An example of a participatory study among people with intellectual disabilities is provided. Benefits and challenges of employing Photovoice with this population are discussed.

  20. Evaluation of analytical tools and multivariate methods for quantification of co-former crystals in ibuprofen-nicotinamide co-crystals.

    PubMed

    Soares, Frederico Luis Felipe; Carneiro, Renato Lajarim

    2014-02-01

    Co-crystals are multicomponent substances designed by the addition of two or more different molecules in the same crystallographic pattern, which differs from the crystallographic motifs of the co-formers. The addition of highly soluble molecules, like nicotinamide, to the crystallographic pattern of ibuprofen enhances its solubility more than 7.5 times, improving the properties of this widely used drug. Several analytical solid-state techniques are used to characterize the ibuprofen-nicotinamide co-crystal, the most used being mid-infrared spectroscopy (ATR-FTIR), differential scanning calorimetry (DSC), X-ray powder diffraction (XRPD) and Raman spectroscopy. These techniques were evaluated to quantify a mixture of the ibuprofen-nicotinamide co-crystal and its co-formers in order to develop a calibration model for evaluating co-crystal purity after synthesis. Raman spectroscopy combined with multivariate calibration tools showed better results than all other techniques, presenting lower calibration and prediction errors. The partial least squares regression model gave a mean error lower than 5% for all components present in the mixture. DSC and mid-infrared spectroscopy proved to be insufficient for quantification of the ternary mixture. XRPD presented good results for quantification of the co-formers, ibuprofen and nicotinamide, but fair results for the co-crystal. This is the first report of quantification of the ibuprofen-nicotinamide co-crystal among its co-formers. The quantification is of great importance for determining the yield of co-crystallization reactions and the purity of the product obtained.
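The core idea of the multivariate calibration above, that a mixture spectrum is a weighted sum of component contributions, can be shown with a simplified stand-in. The paper fits PLS to measured Raman spectra; this sketch instead uses classical least squares on synthetic "spectra", so everything below is illustrative.

```python
import numpy as np

# Simplified stand-in for the paper's PLS calibration: classical least squares
# on synthetic spectra, where mixture spectrum = mass fractions x pure spectra.

rng = np.random.default_rng(0)
n_channels = 200
pure = rng.random((3, n_channels))       # fake pure spectra: ibuprofen, nicotinamide, co-crystal

true_fracs = np.array([0.2, 0.3, 0.5])   # mass fractions in the ternary mixture
mixture = true_fracs @ pure + rng.normal(0, 0.005, n_channels)   # + measurement noise

# Recover the fractions by solving pure.T @ fracs = mixture in least squares.
est, *_ = np.linalg.lstsq(pure.T, mixture, rcond=None)
print(np.round(est, 3))
```

Real spectra overlap and drift far more than this toy case, which is why the paper needs PLS with proper calibration and prediction error assessment rather than a direct linear solve.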

  1. Information Technology Research Services: Powerful Tools to Keep Up with a Rapidly Moving Field

    NASA Technical Reports Server (NTRS)

    Hunter, Paul

    2010-01-01

    Many firms offer Information Technology Research reports, analyst calls, conferences, seminars, tools, leadership development, etc. These entities include Gartner, Forrester Research, IDC, The Burton Group, Society for Information Management, InfoTech Research, The Corporate Executive Board, and so on. This talk will cover how a number of such services are being used at the Goddard Space Flight Center to improve our IT management practices, workforce skills, approach to innovation, and service delivery. These tools and services are used across the workforce, from the executive leadership to the IT worker. The presentation will cover the types of services each vendor provides and their primary engagement model. The use of these services at other NASA Centers and Headquarters will be included. In addition, I will explain how two of these services are available now to the entire NASA IT workforce through enterprise-wide subscriptions.

  2. Building genetic tools in Drosophila research: an interview with Gerald Rubin

    PubMed Central

    2016-01-01

    Gerald (Gerry) Rubin, pioneer in Drosophila genetics, is Founding Director of the HHMI-funded Janelia Research Campus. In this interview, Gerry recounts key events and collaborations that have shaped his unique approach to scientific exploration, decision-making, management and mentorship – an approach that forms the cornerstone of the model adopted at Janelia to tackle problems in interdisciplinary biomedical research. Gerry describes his remarkable journey from newcomer to internationally renowned leader in the fly field, highlighting his contributions to the tools and resources that have helped establish Drosophila as an important model in translational research. Describing himself as a ‘tool builder’, his current focus is on developing approaches for in-depth study of the fly nervous system, in order to understand key principles in neurobiology. Gerry was interviewed by Ross Cagan, Senior Editor of Disease Models & Mechanisms. PMID:27053132

  3. The Association of Religion Data Archives (ARDA): Online Research Data, Tools, and References.

    PubMed

    Finke, Roger; Adamczyk, Amy

    2008-12-01

    The Association of Religion Data Archives (ARDA) currently archives over 400 local, national, and international data files, and offers a wide range of research tools to build surveys, preview data on-line, develop customized maps and reports of U.S. church membership, and examine religion differences across nations and regions of the world. The ARDA also supports reference and teaching tools that draw on the rich data archive. This research note offers a brief introduction to the quantitative data available for exploration or download, and a few of the website features most useful for research and teaching. Supported by the Lilly Endowment, the John Templeton Foundation, the Pennsylvania State University, and the Baylor Institute for Studies of Religion, all data downloads and online services are free of charge.

  4. Standardized nursing care plan: a case study on developing a tool for clinical research.

    PubMed

    Vizoso, Hector; Lyskawa, Meg; Couey, Paul

    2008-08-01

    The National Institutes of Health have developed a new organizational consortium through a funding mechanism called the Clinical and Translational Science Award. This program funds academic institutions to create a platform for research that expedites the development and delivery of new treatments through open interdisciplinary collaboration. As a result, the adult clinical research center at San Francisco General Hospital is now part of the Clinical and Translational Science Institute at the University of California San Francisco. Nurses on this research unit have begun to employ a standardized nursing care plan that focuses on the particular needs of the research participant, an advancement that if implemented nationwide among all adult clinical research centers will be of paramount importance in fostering a collaborative relationship within the new organizational structure. This standardized nursing care plan will provide research nurses with a tool that will enable them to provide safe and quality patient care.

  5. An Analytic Study of the Professional Development Research in Early Childhood Education

    ERIC Educational Resources Information Center

    Schachter, Rachel E.

    2015-01-01

    Research Findings: The goal of this study was to examine empirical research on the design, delivery, and measurement of the effects of professional development (PD) for early childhood educators in order to provide insight into what the field has accomplished as well as suggest directions for future PD programs and research. Through the use of…

  6. Psychometric evaluation of a short observational tool for small-scale research projects in dementia.

    PubMed

    Smallwood, J; Irvine, E; Coulter, F; Connery, H

    2001-03-01

    Dementia is a degenerative illness, and the lack of a reliable measure of self-report presents particular difficulties for research. Often in the later stages of dementia, behavioural measurement is the only tool available for the evaluation of treatment techniques. This paper describes and evaluates a short observational tool suitable for clinical assessment purposes. The scale has been shown to have the potential for adequate inter-rater reliability, test-retest reliability, and convergent and divergent validity, provided the study's limitations (statistical rather than ecological validity, and the small sample size) are borne in mind.

  7. Toward Rigorous Idiographic Research in Prevention Science: Comparison Between Three Analytic Strategies for Testing Preventive Intervention in Very Small Samples

    PubMed Central

    Pineo, Thomas Z.; Maldonado Molina, Mildred M.; Lich, Kristen Hassmiller

    2013-01-01

    Psychosocial prevention research lacks evidence from intensive within-person lines of research to understand idiographic processes related to development and response to intervention. Such data could be used to fill gaps in the literature and expand the study design options for prevention researchers, including lower-cost yet rigorous studies (e.g., for program evaluations), pilot studies, designs to test programs for low-prevalence outcomes, selective/indicated/adaptive intervention research, and understanding of differential response to programs. This study compared three competing analytic strategies designed for this type of research: autoregressive integrated moving average (ARIMA) modeling, mixed model trajectory analysis, and P-technique. Illustrative time series data were from a pilot study of an intervention for nursing home residents with diabetes (N=4) designed to improve control of blood glucose. A within-person, intermittent baseline design was used. Intervention effects were detected using each strategy for the aggregated sample and for individual patients. The P-technique model most closely replicated observed glucose levels. ARIMA and P-technique models were most similar in terms of estimated intervention effects and modeled glucose levels. However, ARIMA and P-technique also were more sensitive to missing data, outliers and number of observations. Statistical testing suggested that results generalize both to other persons as well as to idiographic, longitudinal processes. This study demonstrated the potential contributions of idiographic research in prevention science as well as the need for simulation studies to delineate the research circumstances when each analytic approach is optimal for deriving the correct parameter estimates. PMID:23299558
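The ARIMA-style strategy compared above can be illustrated on synthetic data: a short within-person series with an intervention-related level shift, fit as an AR(1) regression with an intervention dummy. The data, seed, and model below are invented for illustration, not the study's glucose series or exact models.

```python
import numpy as np

# Synthetic illustration of an intervention analysis on one person's series:
# an AR(1) process whose mean drops when the intervention switches on,
# estimated by ordinary least squares.

rng = np.random.default_rng(1)
n = 120
on = np.zeros(n)
on[60:] = 1.0                            # intervention switched on at t = 60
y = np.empty(n)
y[0] = 180.0                             # e.g., blood glucose in mg/dL
for t in range(1, n):
    mean_t = 180.0 - 25.0 * on[t]        # true mean drops by 25 under intervention
    mean_prev = 180.0 - 25.0 * on[t - 1]
    y[t] = mean_t + 0.5 * (y[t - 1] - mean_prev) + rng.normal(0, 5)

# OLS fit of y_t ~ 1 + y_{t-1} + intervention_t
X = np.column_stack([np.ones(n - 1), y[:-1], on[1:]])
(intercept, ar1, effect), *_ = np.linalg.lstsq(X, y[1:], rcond=None)
print(round(ar1, 2), round(effect, 1))
```

Note that `effect` is the immediate coefficient; the implied long-run shift is `effect / (1 - ar1)`, roughly twice as large here, which is one of the interpretive subtleties such comparisons of analytic strategies must handle.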

  8. DataUp 2.0: Improving On a Tool For Helping Researchers Archive, Manage, and Share Their Tabular Data

    NASA Astrophysics Data System (ADS)

    Strasser, C.; Borda, S.; Cruse, P.; Kunze, J.

    2013-12-01

    There are many barriers to data management and sharing among earth and environmental scientists; among the most significant are a lack of knowledge about best practices for data management, metadata standards, or appropriate data repositories for archiving and sharing data. Last year we developed an open source web application, DataUp, to help researchers overcome these barriers. DataUp helps scientists to (1) determine whether their file is CSV compatible, (2) generate metadata in a standard format, (3) retrieve an identifier to facilitate data citation, and (4) deposit their data into a repository. With funding from the NSF via a supplemental grant to the DataONE project, we are working to improve upon DataUp. Our main goal for DataUp 2.0 is to ensure organizations and repositories are able to adopt and adapt DataUp to meet their unique needs, including connecting to analytical tools, adding new metadata schema, and expanding the list of connected data repositories. DataUp is a collaborative project between the California Digital Library, DataONE, the San Diego Supercomputing Center, and Microsoft Research Connections.
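Steps (1) and (2) above, checking CSV compatibility and generating metadata, can be sketched in a few lines. The functions below are hypothetical illustrations, not DataUp's actual code, and the metadata dict is only a placeholder for whatever standard schema the target repository expects.

```python
import csv
import io
import json

# Hedged sketch of a CSV-compatibility check plus minimal metadata generation.

def check_csv(text):
    """Return (ok, rows): ok is True if every row has the same column count."""
    rows = list(csv.reader(io.StringIO(text)))
    widths = {len(r) for r in rows if r}
    return len(widths) == 1, rows

def make_metadata(rows, title, creator):
    # Placeholder record; a real tool would emit a standard metadata schema.
    return {"title": title, "creator": creator,
            "variables": rows[0], "n_records": len(rows) - 1}

ok, rows = check_csv("site,date,temp_c\nA,2013-06-01,21.5\nB,2013-06-01,19.0\n")
print(ok, json.dumps(make_metadata(rows, "Stream temperatures", "J. Doe")))
```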

  9. Integrated Decision-Making Tool to Develop Spent Fuel Strategies for Research Reactors

    SciTech Connect

    Beatty, Randy L; Harrison, Thomas J

    2016-01-01

    IAEA Member States operating or having previously operated a research reactor are responsible for the safe and sustainable management and disposal of associated radioactive waste, including research reactor spent nuclear fuel (RRSNF). This includes the safe disposal of RRSNF or the corresponding equivalent waste returned after spent fuel reprocessing. One key challenge to developing general recommendations lies in the diversity of spent fuel types, locations, and national/regional circumstances, rather than in mass or volume alone. This is especially true given that RRSNF inventories are relatively small, and research reactors are rarely operated at the high power levels or durations typical of commercial power plants. Presently, many countries lack an effective long-term policy for managing RRSNF. This paper presents results of the International Atomic Energy Agency (IAEA) Coordinated Research Project (CRP) #T33001 on Options and Technologies for Managing the Back End of the Research Reactor Nuclear Fuel Cycle, which includes an integrated decision-making tool called BRIDE (Back-end Research reactor Integrated Decision Evaluation). BRIDE is a multi-attribute decision-making tool that combines the total estimated cost of each life-cycle scenario with non-economic factors such as public acceptance and technical maturity, and ranks optional back-end scenarios specific to a Member State's situation in order to develop a strategic plan with a preferred or recommended option for managing spent fuel from research reactors.
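
    A multi-attribute evaluation of the kind BRIDE performs can be sketched as a weighted sum over min-max-normalized criteria. The scenarios, criteria, and weights below are invented for illustration and are not the actual BRIDE model:

```python
# Hypothetical back-end scenarios scored on cost (lower is better) and two
# non-economic criteria (higher is better).
scenarios = {
    "reprocess abroad":    {"cost": 100.0, "acceptance": 0.4, "maturity": 0.9},
    "interim dry storage": {"cost": 60.0,  "acceptance": 0.7, "maturity": 0.6},
    "direct disposal":     {"cost": 80.0,  "acceptance": 0.9, "maturity": 0.8},
}
weights = {"cost": 0.4, "acceptance": 0.4, "maturity": 0.2}

def normalize(values, lower_is_better=False):
    """Min-max normalize a {scenario: value} dict to [0, 1]."""
    lo, hi = min(values.values()), max(values.values())
    span = (hi - lo) or 1.0
    return {k: ((hi - v) if lower_is_better else (v - lo)) / span
            for k, v in values.items()}

norm = {c: normalize({s: d[c] for s, d in scenarios.items()},
                     lower_is_better=(c == "cost"))
        for c in weights}
scores = {s: sum(weights[c] * norm[c][s] for c in weights) for s in scenarios}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # ['direct disposal', 'interim dry storage', 'reprocess abroad']
```

    The weighted-sum form makes the trade-off explicit: changing the weight on public acceptance versus cost directly reorders the recommended options.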

  10. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-04-30

    …programs at key milestones. Prior to this, there was no formal requirement to reconcile program acquisition baselines with resource forecasts beyond the five years of…

  11. Analytical tools in accelerator physics

    SciTech Connect

    Litvinenko, V.N.

    2010-09-01

    This paper is a sub-set of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues in the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description starting from an arbitrary reference orbit, through explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with parameterization in action and angle variables. To a large degree the notes follow the logic developed in Theory of Cyclic Particle Accelerators by A.A. Kolomensky and A.N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature of these notes is the use of matrix functions and the Sylvester formula for calculating the matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory materials following Classical Theory of Fields by L.D. Landau and E.M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the Appendices.
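
    The Sylvester formula mentioned here expresses a function of a matrix with distinct eigenvalues through its Frobenius covariants; in its standard form:

```latex
f(\mathbf{A}) \;=\; \sum_{i=1}^{k} f(\lambda_i)\,\mathbf{A}_i,
\qquad
\mathbf{A}_i \;=\; \prod_{\substack{j=1 \\ j \neq i}}^{k}
\frac{\mathbf{A} - \lambda_j \mathbf{I}}{\lambda_i - \lambda_j},
```

    where the λ_i are the distinct eigenvalues of A. Once the eigenvalues of an element's transfer matrix are known, this gives closed-form expressions for arbitrary matrix functions of it without explicit diagonalization.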

  12. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-05-01

    …production costs over time, in the context of the cost and schedules of the other programs in the relevant acquisition portfolio… [chart residue: procurement, RDT&E, and O&M portfolio funding profiles (POM14-18…FY43), new-system total lifecycle costs, and recommended submission formats (DAG)]

  13. The Scottish Government's Rural and Environmental Science and Analytical Services Strategic Research Programme

    NASA Astrophysics Data System (ADS)

    Dawson, Lorna; Bestwick, Charles

    2013-04-01

    The Strategic Research Programme focuses on the delivery of outputs and outcomes within the major policy agenda areas of climate change, land use, and food security, and on impact against the 'Wealthier', 'Healthier' and 'Greener' strategic objectives of the Scottish Government. The research is delivered through two programmes, 'Environmental Change' and 'Food, Land and People', the core strength of which is the collaboration between the Scottish Government's Main Research Providers: The James Hutton Institute, the Moredun Research Institute, the Rowett Institute of Nutrition and Health (University of Aberdeen), Scotland's Rural College, Biomathematics and Statistics Scotland, and The Royal Botanic Gardens Edinburgh. The research actively seeks to inform and be informed by stakeholders from policy, farming, land use, water and energy supply, food production and manufacturing, non-governmental organisations, voluntary organisations, community groups and the general public. This presentation will provide an overview of the programme's interdisciplinary research, through examples from across the programme's themes. The examples illustrate impact within the Strategic Programme's priorities of supporting policy and practice, contributing to economic growth and innovation, enhancing collaborative and multidisciplinary research, growing scientific resilience and delivering scientific excellence. http://www.scotland.gov.uk/Topics/Research/About/EBAR/StrategicResearch/future-research-strategy/Themes/ http://www.knowledgescotland.org/news.php?article_id=295

  14. Enhancing the Analytic Utility of the Synthetic Theater Operations Research Model (STORM)

    DTIC Science & Technology

    2014-12-01

    Operations Research Model (STORM)… ENHANCING THE ANALYTIC UTILITY OF THE SYNTHETIC THEATER OPERATIONS RESEARCH MODEL (STORM) Mary L. McDonald, Stephen C. Upton, Christian N. Seymour, Thomas W. Lucas, Susan M. Sanchez… Force and Marine Corps, use the Synthetic Theater Operations Research Model (STORM) to assess risk in an integrated, campaign setting. Ultimately

  15. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    SciTech Connect

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.; Gastelum, Zoe N.; Kreyling, Sean J.; West, Curtis L.

    2014-05-13

    of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will never be able to replace traditional arms control verification measures, it does supply unique signatures that can augment existing analysis.

  16. Transdisciplinarity among tobacco harm-reduction researchers: a network analytic approach.

    PubMed

    Provan, Keith G; Clark, Pamela I; Huerta, Timothy

    2008-08-01

    Progress in tobacco control and other areas of health research is thought to be heavily influenced by the extent to which researchers are able to work with each other not only within, but also across disciplines. This study provides an examination of the extent to which researchers in the area of tobacco harm reduction work together. Specifically, data were collected in 2005 from a national group of 67 top tobacco-control researchers from eight broadly defined disciplines representing 17 areas of expertise. Network analysis was utilized to examine the extent to which these researchers were engaged in research that was interdisciplinary or transdisciplinary, based on the outcome or product attained. Findings revealed that interdisciplinary network ties were much denser than transdisciplinary ties, but researchers in some disciplines were more likely to work across disciplines than others, especially when synergistic outcomes resulted. The study demonstrates for the first time how tobacco-control researchers work together, providing direction for policy officials seeking to encourage greater transdisciplinarity. The study also demonstrates the value of network-analysis methods for understanding research relationships in one important area of health care.
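
    The density comparison reported here has a simple network-analytic form: tie density is the number of observed ties divided by the number of possible ties. A minimal sketch with invented collaboration data (not the study's 67-researcher network):

```python
from itertools import combinations

def density(nodes, ties):
    """Density of an undirected network: observed ties / possible ties."""
    possible = len(list(combinations(nodes, 2)))
    return len(ties) / possible if possible else 0.0

researchers = ["a", "b", "c", "d"]
interdisciplinary_ties = {("a", "b"), ("a", "c"), ("b", "c"), ("c", "d")}
transdisciplinary_ties = {("a", "c")}

print(density(researchers, interdisciplinary_ties))  # 4/6 ≈ 0.67
print(density(researchers, transdisciplinary_ties))  # 1/6 ≈ 0.17
```

    Comparing the two densities is the study's core contrast: interdisciplinary ties are far denser than transdisciplinary ones.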

  17. Evaluation of manometric temperature measurement (MTM), a process analytical technology tool in freeze drying, part III: heat and mass transfer measurement.

    PubMed

    Tang, Xiaolin Charlie; Nail, Steven L; Pikal, Michael J

    2006-01-01

    This article evaluates the procedures for determining the vial heat transfer coefficient and the extent of primary drying through manometric temperature measurement (MTM). The vial heat transfer coefficients (Kv) were calculated from the MTM-determined temperature and resistance and compared with Kv values determined by a gravimetric method. The differences between the MTM vial heat transfer coefficients and the gravimetric values are large at low shelf temperature but smaller when higher shelf temperatures were used. The differences also became smaller at higher chamber pressure and when higher-resistance materials were being freeze-dried. In all cases, using thermal shields greatly improved the accuracy of the MTM Kv measurement. With the use of thermal shields, the thickness of the frozen layer calculated from MTM is in good agreement with values obtained gravimetrically. The heat transfer coefficient "error" is largely a direct result of the error in the dry layer resistance (i.e., the MTM-determined resistance is too low). This problem can be minimized if thermal shields are used for freeze-drying. With suitable use of thermal shields, accurate Kv values are obtained by MTM, thus allowing accurate calculations of heat and mass flow rates. The extent of primary drying can be monitored by real-time calculation of the amount of remaining ice using MTM data, thus providing a process analytical tool that greatly improves freeze-drying process design and control.
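
    The vial heat transfer coefficient at issue is defined, in the standard freeze-drying literature, through the steady-state heat flow from shelf to product, coupled to sublimation; a commonly used form (notation assumed here, not taken from this paper):

```latex
\frac{dQ}{dt} \;=\; A_v\,K_v\,\bigl(T_s - T_b\bigr),
\qquad
\frac{dQ}{dt} \;=\; \Delta H_s\,\frac{dm}{dt},
```

    where A_v is the vial cross-sectional area, T_s the shelf temperature, T_b the product temperature at the vial bottom, and ΔH_s the heat of sublimation that couples the heat flow to the mass flow dm/dt. An error in the MTM dry-layer resistance propagates through dm/dt directly into the inferred K_v, which is the mechanism the abstract describes.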

  18. Effect of Percent Relative Humidity, Moisture Content, and Compression Force on Light-Induced Fluorescence (LIF) Response as a Process Analytical Tool.

    PubMed

    Shah, Ishan G; Stagner, William C

    2016-08-01

    The effect of percent relative humidity (16-84% RH), moisture content (4.2-6.5% w/w MC), and compression force (4.9-44.1 kN CF) on the light-induced fluorescence (LIF) response of 10% w/w active pharmaceutical ingredient (API) compacts is reported. The fluorescent response was evaluated using two separate central composite designs of experiments. The effect of % RH and CF on the LIF signal was highly significant, with an adjusted R² = 0.9436 and p < 0.0001. Percent relative humidity (p = 0.0022), CF (p < 0.0001), and % RH² (p = 0.0237) were statistically significant factors affecting the LIF response. The effects of MC and CF on LIF response were also statistically significant, with a p value < 0.0001 and an adjusted R² value of 0.9874. The LIF response was highly impacted by MC (p < 0.0001), CF (p < 0.0001), and MC² (p = 0.0022). At 10% w/w API, increased % RH, MC, and CF led to a nonlinear decrease in LIF response. The derived quadratic model equations explained more than 94% of the data. Awareness of these effects on LIF response is critical when implementing LIF as a process analytical tool.
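
    Quadratic models of this shape (main effects plus a squared term) can be reproduced in form with an ordinary least-squares fit and an adjusted R². A numpy-only sketch on synthetic data (the coefficients, noise level, and factor ranges are invented, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 40
rh = rng.uniform(16, 84, n)      # % relative humidity
cf = rng.uniform(4.9, 44.1, n)   # compression force, kN

# Hypothetical quadratic response: LIF signal falls with RH and CF,
# with curvature in RH (the RH^2 term).
lif = 1000 - 3.0 * rh - 5.0 * cf - 0.02 * rh**2 + rng.normal(0, 10, n)

X = np.column_stack([np.ones(n), rh, cf, rh**2])
beta, *_ = np.linalg.lstsq(X, lif, rcond=None)
resid = lif - X @ beta
ss_res = float(resid @ resid)
ss_tot = float(((lif - lif.mean()) ** 2).sum())
r2 = 1 - ss_res / ss_tot
p = X.shape[1] - 1               # number of predictors, excluding intercept
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
print(round(r2, 3), round(adj_r2, 3))
```

    The adjusted R² penalizes the extra squared term, which is why the paper reports it alongside the factor-level p values.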

  19. Real-time particle size analysis using focused beam reflectance measurement as a process analytical technology tool for a continuous granulation-drying-milling process.

    PubMed

    Kumar, Vijay; Taylor, Michael K; Mehrotra, Amit; Stagner, William C

    2013-06-01

    Focused beam reflectance measurement (FBRM) was used as a process analytical technology tool to perform inline real-time particle size analysis of a proprietary granulation manufactured using a continuous twin-screw granulation-drying-milling process. A significant relationship between D20, D50, and D80 length-weighted chord length and sieve particle size was observed, with a p value of <0.0001 and R² of 0.886. A central composite response surface statistical design was used to evaluate the effect of granulator screw speed and Comil® impeller speed on the length-weighted chord length distribution (CLD) and particle size distribution (PSD) determined by FBRM and nested sieve analysis, respectively. The effects of granulator speed and mill speed on bulk density, tapped density, Compressibility Index, and Flowability Index were also investigated. An inline FBRM probe, placed below the Comil®, generated chord lengths and CLD data at designated times. The collection of the milled samples for sieve analysis and PSD evaluation was coordinated with the timing of the FBRM determinations. Both FBRM and sieve analysis resulted in similar bimodal distributions for all ten manufactured batches studied. Within the experimental space studied, the granulator screw speed (650-850 rpm) and Comil® impeller speed (1,000-2,000 rpm) did not have a significant effect on CLD, PSD, bulk density, tapped density, Compressibility Index, or Flowability Index (p value > 0.05).
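
    The D20/D50/D80 chord-length statistics are percentiles of a weighted cumulative distribution. A sketch of one common convention (weighting each chord count by its length) on invented bin data, not the study's measurements:

```python
import numpy as np

def weighted_percentile(values, weights, q):
    """Interpolated percentile (q in [0, 1]) of values under the given weights."""
    order = np.argsort(values)
    v = np.asarray(values, dtype=float)[order]
    w = np.asarray(weights, dtype=float)[order]
    cum = np.cumsum(w) / w.sum()
    return float(np.interp(q, cum, v))

# Hypothetical chord-length bins (um) and raw counts from a probe.
lengths = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
counts = np.array([5, 10, 20, 10, 5])
lw = counts * lengths  # length-weighted counts

d20, d50, d80 = (weighted_percentile(lengths, lw, q) for q in (0.2, 0.5, 0.8))
print(round(d50, 1))  # 28.3
```

    Length-weighting shifts the percentiles toward longer chords relative to the raw counts, which is why the weighting convention must be stated when comparing CLD to sieve PSD.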

  20. LC/ESI-MS(n) and (1)H HR-MAS NMR analytical methods as useful taxonomical tools within the genus Cystoseira C. Agardh (Fucales; Phaeophyceae).

    PubMed

    Jégou, Camille; Culioli, Gérald; Kervarec, Nelly; Simon, Gaëlle; Stiger-Pouvreau, Valérie

    2010-12-15

    Species of the genus Cystoseira are particularly hard to discriminate, due to the complexity of their morphology, which can be influenced by their phenological state and ecological parameters. Our study emphasized the relevance of two kinds of analytical tools, (1) LC/ESI-MS(n) and (2) (1)H HR-MAS NMR, also called in vivo NMR, to identify Cystoseira specimens at the specific level and discuss their taxonomy. For these analyses, samples were collected at several locations in Brittany (France), where Cystoseira baccata, C. foeniculacea, C. humilis, C. nodicaulis and C. tamariscifolia were previously reported. To validate our chemical procedure, the sequence of the ITS2 was obtained for each species to investigate their phylogenetic relationships at the molecular level. Our study highlighted the consistency of the two physico-chemical methods, compared to the "classical" molecular approach, for studying taxonomy within the genus Cystoseira. In particular, LC/ESI-MS(n) and phylogenetic analyses converged on the discrimination of two taxonomical groups among the 5 species. The occurrence of some specific signals in the (1)H HR-MAS NMR spectra and/or some characteristic chemical compounds during LC/ESI-MS(n) analysis could be regarded as discriminating factors. LC/ESI-MS(n) and (1)H HR-MAS NMR turned out to be two relevant and innovative techniques for taxonomic discrimination within this complex genus.

  1. Conceptual framework for outcomes research studies of hepatitis C: an analytical review.

    PubMed

    Sbarigia, Urbano; Denee, Tom R; Turner, Norris G; Wan, George J; Morrison, Alan; Kaufman, Anna S; Rice, Gary; Dusheiko, Geoffrey M

    2016-01-01

    Hepatitis C virus infection is one of the main causes of chronic liver disease worldwide. Until recently, the standard antiviral regimen for hepatitis C was a combination of an interferon derivative and ribavirin, but a plethora of new antiviral drugs is becoming available. While these new drugs have shown great efficacy in clinical trials, observational studies are needed to determine their effectiveness in clinical practice. Previous observational studies have shown that multiple factors, besides the drug regimen, affect patient outcomes in clinical practice. Here, we provide an analytical review of published outcomes studies of the management of hepatitis C virus infection. A conceptual framework defines the relationships between four categories of variables: health care system structure, patient characteristics, process-of-care, and patient outcomes. This framework can provide a starting point for outcomes studies addressing the use and effectiveness of new antiviral drug treatments.

  2. Conceptual framework for outcomes research studies of hepatitis C: an analytical review

    PubMed Central

    Sbarigia, Urbano; Denee, Tom R; Turner, Norris G; Wan, George J; Morrison, Alan; Kaufman, Anna S; Rice, Gary; Dusheiko, Geoffrey M

    2016-01-01

    Hepatitis C virus infection is one of the main causes of chronic liver disease worldwide. Until recently, the standard antiviral regimen for hepatitis C was a combination of an interferon derivative and ribavirin, but a plethora of new antiviral drugs is becoming available. While these new drugs have shown great efficacy in clinical trials, observational studies are needed to determine their effectiveness in clinical practice. Previous observational studies have shown that multiple factors, besides the drug regimen, affect patient outcomes in clinical practice. Here, we provide an analytical review of published outcomes studies of the management of hepatitis C virus infection. A conceptual framework defines the relationships between four categories of variables: health care system structure, patient characteristics, process-of-care, and patient outcomes. This framework can provide a starting point for outcomes studies addressing the use and effectiveness of new antiviral drug treatments. PMID:27313473

  3. Application of nuclear analytical techniques to biological and environmental research in the Czech Republic

    SciTech Connect

    Kucera, J.

    1996-12-31

    There is a long tradition in radiochemistry in the present Czech Republic that dates from 1856. However, the modern history of nuclear analytical techniques (NAT) development started after installation of the first experimental nuclear reactor and Van de Graaff accelerator in the mid-sixties at Rez. Since then, the NAT, such as neutron activation analysis (NAA) both instrumental (INAA) and radiochemical (RNAA), gamma activation analysis, particle-induced X-ray and gamma-ray emission (PIXE and PIGE, respectively), Rutherford backscattering, and neutron depth profiling have been continuously developed and applied in various scientific and technological fields. The radiochemical approach has always had a strong position in these investigations and resulted in the discovery of the substoichiometry separation principle in NAA and isotope dilution techniques and extensive utilization of RNAA. The use of INAA and RNAA in the evaluation of biological and environmental materials as well as plants is described.

  4. About Skinner and time: behavior-analytic contributions to research on animal timing.

    PubMed

    Lejeune, Helga; Richelle, Marc; Wearden, J H

    2006-01-01

    The article discusses two important influences of B. F. Skinner, and later workers in the behavior-analytic tradition, on the study of animal timing. The first influence is methodological, and is traced from the invention of schedules imposing temporal constraints or periodicities on animals in The Behavior of Organisms, through the rate differentiation procedures of Schedules of Reinforcement, to modern temporal psychophysics in animals. The second influence has been the development of accounts of animal timing that have tried to avoid reference to internal processes of a cognitive sort, in particular internal clock mechanisms. Skinner's early discussion of temporal control is first reviewed, and then three recent theories-Killeen & Fetterman's (1988) Behavioral Theory of Timing; Machado's (1997) Learning to Time; and Dragoi, Staddon, Palmer, & Buhusi's (2003) Adaptive Timer Model-are discussed and evaluated.

  5. The need for novel informatics tools for integrating and planning research in molecular and cellular cognition.

    PubMed

    Silva, Alcino J; Müller, Klaus-Robert

    2015-09-01

    The sheer volume and complexity of publications in the biological sciences are straining traditional approaches to research planning. Nowhere is this problem more serious than in molecular and cellular cognition, since in this neuroscience field, researchers routinely use approaches and information from a variety of areas in neuroscience and other biology fields. Additionally, the multilevel integration process characteristic of this field involves the establishment of experimental connections between molecular, electrophysiological, behavioral, and even cognitive data. This multidisciplinary integration process requires strategies and approaches that originate in several different fields, which greatly increases the complexity and demands of this process. Although causal assertions, where phenomenon A is thought to contribute or relate to B, are at the center of this integration process and key to research in biology, there are currently no tools to help scientists keep track of the increasingly more complex network of causal connections they use when making research decisions. Here, we propose the development of semiautomated graphical and interactive tools to help neuroscientists and other biologists, including those working in molecular and cellular cognition, to track, map, and weight causal evidence in research papers. There is a great need for a concerted effort by biologists, computer scientists, and funding institutions to develop maps of causal information that would aid in integration of research findings and in experiment planning.
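
    A minimal version of the causal-evidence tracking proposed here is a weighted directed graph keyed by assertions. This sketch (the entity names and the additive weighting scheme are invented for illustration) stores per-paper evidence for each causal link and sums it:

```python
from collections import defaultdict

# cause -> effect -> list of (weight, source) evidence entries
causal_map = defaultdict(lambda: defaultdict(list))

def add_assertion(cause, effect, weight, source):
    """Record one paper's evidence that `cause` contributes to `effect`."""
    causal_map[cause][effect].append((weight, source))

def evidence(cause, effect):
    """Total accumulated evidence weight for a causal link."""
    return sum(w for w, _ in causal_map[cause][effect])

add_assertion("CREB activation", "LTP enhancement", 0.8, "paper-1")
add_assertion("CREB activation", "LTP enhancement", 0.5, "paper-2")
add_assertion("LTP enhancement", "memory improvement", 0.6, "paper-3")

print(evidence("CREB activation", "LTP enhancement"))
```

    Keeping the source alongside each weight is what lets such a map be interactive: a researcher can drill from an aggregate causal claim back to the individual papers supporting it.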

  6. The need for novel informatics tools for integrating and planning research in molecular and cellular cognition

    PubMed Central

    Müller, Klaus-Robert

    2015-01-01

    The sheer volume and complexity of publications in the biological sciences are straining traditional approaches to research planning. Nowhere is this problem more serious than in molecular and cellular cognition, since in this neuroscience field, researchers routinely use approaches and information from a variety of areas in neuroscience and other biology fields. Additionally, the multilevel integration process characteristic of this field involves the establishment of experimental connections between molecular, electrophysiological, behavioral, and even cognitive data. This multidisciplinary integration process requires strategies and approaches that originate in several different fields, which greatly increases the complexity and demands of this process. Although causal assertions, where phenomenon A is thought to contribute or relate to B, are at the center of this integration process and key to research in biology, there are currently no tools to help scientists keep track of the increasingly more complex network of causal connections they use when making research decisions. Here, we propose the development of semiautomated graphical and interactive tools to help neuroscientists and other biologists, including those working in molecular and cellular cognition, to track, map, and weight causal evidence in research papers. There is a great need for a concerted effort by biologists, computer scientists, and funding institutions to develop maps of causal information that would aid in integration of research findings and in experiment planning. PMID:26286658

  7. Automated Tools for Clinical Research Data Quality Control using NCI Common Data Elements.

    PubMed

    Hudson, Cody L; Topaloglu, Umit; Bian, Jiang; Hogan, William; Kieber-Emmons, Thomas

    2014-01-01

    Clinical research data generated by a federation of collection mechanisms and systems often produce highly dissimilar data of varying quality. Poor data quality can result in the inefficient use of research data or can even require the repetition of the performed studies, a costly process. This work presents two tools for improving the data quality of clinical research data, relying on the National Cancer Institute's Common Data Elements (CDEs) as a standard representation of possible questions and data elements, to (a) automatically suggest CDE annotations for already collected data, based on semantic and syntactic analysis using the Unified Medical Language System (UMLS) Terminology Services' Metathesaurus, and (b) annotate and constrain new clinical research questions through a simple-to-use "CDE Browser." These tools are built and tested on the open-source LimeSurvey software and on research data analyzed and identified to contain various data quality issues captured by the Comprehensive Research Informatics Suite (CRIS) at the University of Arkansas for Medical Sciences.
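
    The suggestion step, matching free-text questions against CDE names, can be approximated with plain string similarity. A stdlib sketch (the CDE names are invented, and this is simple fuzzy matching, not the UMLS Metathesaurus pipeline the paper uses):

```python
import difflib

def suggest_cde(question, candidates, cutoff=0.3):
    """Suggest the closest CDE name for a survey question, or None."""
    lowered = {c.lower(): c for c in candidates}
    hits = difflib.get_close_matches(question.lower(), lowered, n=1, cutoff=cutoff)
    return lowered[hits[0]] if hits else None

cdes = ["Patient Birth Date", "Tumor Grade", "Specimen Collection Date"]
print(suggest_cde("patient date of birth", cdes))     # Patient Birth Date
print(suggest_cde("qrs interval", cdes, cutoff=0.8))  # None
```

    A real pipeline would expand synonyms through a terminology service before matching; the cutoff here plays the role of a minimum-confidence threshold for auto-annotation.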

  8. A Research Analytics Framework-Supported Recommendation Approach for Supervisor Selection

    ERIC Educational Resources Information Center

    Zhang, Mingyu; Ma, Jian; Liu, Zhiying; Sun, Jianshan; Silva, Thushari

    2016-01-01

    Identifying a suitable supervisor for a new research student is vitally important for his or her academic career. Current information overload and information disorientation have posed significant challenges for new students. Existing research for supervisor identification focuses on quality assessment of candidates, but ignores indirect relevance…

  9. Homework and Academic Achievement: A Meta-Analytic Review of Research

    ERIC Educational Resources Information Center

    Bas, Gökhan; Sentürk, Cihad; Cigerci, Fatih Mehmet

    2017-01-01

    The main purpose of this study was to determine the effect of homework assignments on students' academic achievement. This meta-analysis sought an answer to the research question: "What kind of effect does homework assignment have on students' academic achievement levels?" In this research, meta-analysis was adopted to determine the…

  10. Identifying Key Priorities for Future Palliative Care Research Using an Innovative Analytic Approach

    PubMed Central

    Pillemer, Karl; Chen, Emily K.; Warmington, Marcus; Adelman, Ronald D.; Reid, M. C.

    2015-01-01

    Using an innovative approach, we identified research priorities in palliative care to guide future research initiatives. We searched 7 databases (2005–2012) for review articles published on the topics of palliative and hospice–end-of-life care. The identified research recommendations (n = 648) fell into 2 distinct categories: (1) ways to improve methodological approaches and (2) specific topic areas in need of future study. The most commonly cited priority within the theme of methodological approaches was the need for enhanced rigor. Specific topics in need of future study included perspectives and needs of patients, relatives, and providers; underrepresented populations; decision-making; cost-effectiveness; provider education; spirituality; service use; and interdisciplinary approaches to delivering palliative care. This review underscores the need for additional research on specific topics and methodologically rigorous research to inform health policy and practice. PMID:25393169

  11. Exploring Assessment Tools for Research and Evaluation in Astronomy Education and Outreach

    NASA Astrophysics Data System (ADS)

    Buxner, S. R.; Wenger, M. C.; Dokter, E. F. C.

    2011-09-01

    The ability to effectively measure knowledge, attitudes, and skills in formal and informal educational settings is an important aspect of astronomy education research and evaluation. Assessments may take the form of interviews, observations, surveys, exams, or other probes to help unpack people's understandings or beliefs. In this workshop, we discussed characteristics of a variety of tools that exist to assess understandings of different concepts in astronomy as well as attitudes towards science and science teaching; these include concept inventories, surveys, interview protocols, observation protocols, card sorting, reflection videos, and other methods currently being used in astronomy education research and EPO program evaluations. In addition, we discussed common questions in the selection of assessment tools including issues of reliability and validity, time to administer, format of implementation, analysis, and human subject concerns.

  12. A semi-automatic web based tool for the selection of research projects reviewers.

    PubMed

    Pupella, Valeria; Monteverde, Maria Eugenia; Lombardo, Claudio; Belardelli, Filippo; Giacomini, Mauro

    2014-01-01

    The correct evaluation of research proposals continues to be problematic, and in many cases grants and fellowships are subjected to this type of assessment. A web-based, semi-automatic tool to help in the selection of reviewers was developed. The core of the proposed system is the matching of the MeSH Descriptors of the publications submitted by the reviewers (for their accreditation) with the Descriptors linked to the selected research keywords. Moreover, a citation-related index was calculated and adopted in order to discard unsuitable reviewers. This tool was used to support a website for the evaluation of candidates applying for a fellowship in the oncology field.

  13. Process of formulating USDA's Expanded Flavonoid Database for the Assessment of Dietary intakes: a new tool for epidemiological research.

    PubMed

    Bhagwat, Seema A; Haytowitz, David B; Wasswa-Kintu, Shirley I; Pehrsson, Pamela R

    2015-08-14

    The scientific community continues to be interested in potential links between flavonoid intakes and beneficial health effects associated with certain chronic diseases such as CVD, some cancers and type 2 diabetes. Three separate flavonoid databases (Flavonoids, Isoflavones and Proanthocyanidins), developed by the USDA Agricultural Research Service since 1999 and frequently updated, have been used to estimate dietary flavonoid intakes and investigate their health effects. However, each of these databases contains only a limited number of foods. The USDA has constructed a new Expanded Flavonoids Database for approximately 2900 commonly consumed foods, using analytical values from their existing flavonoid databases (Flavonoid Release 3.1 and Isoflavone Release 2.0) as the foundation to calculate values for all twenty-nine flavonoid compounds included in these two databases. Thus, the new database provides full profiles of twenty-nine predominant dietary flavonoid compounds for every food in the database. Original analytical values in Flavonoid Release 3.1 and Isoflavone Release 2.0 for corresponding foods were retained in the newly constructed database. Proanthocyanidins are not included in the expanded database. The process of formulating the new database includes various calculation techniques. This article describes the process of populating values for the twenty-nine flavonoid compounds for every food in the dataset, along with challenges encountered and resolutions suggested. The new expanded flavonoid database, released on the Nutrient Data Laboratory's website, will provide uniformity in estimations of flavonoid content in foods and will be a valuable tool for epidemiological studies to assess dietary intakes.

  14. Using Digital Video as a Research Tool: Ethical Issues for Researchers

    ERIC Educational Resources Information Center

    Schuck, Sandy; Kearney, Matthew

    2006-01-01

    Digital video and accompanying editing software are increasingly becoming more accessible for researchers in terms of ease of use and cost. The rich, visually appealing and seductive nature of video-based data can convey a strong sense of direct experience with the phenomena studied (Pea, 1999). However, the ease of selection and editing of…

  15. Human Rights Education and the Research Process: Action Research as a Tool for Reflection and Change

    ERIC Educational Resources Information Center

    Tavares, Celma

    2016-01-01

    Human rights education (HRE) aims to achieve a change in mindsets and social attitudes that entails the construction of a culture of respect for the values it teaches. Although HRE is a recent field of study, its consolidation in Latin America is an established fact. In recent decades, several authors have carried out research related to HRE that…

  16. A New Tool for Identifying Research Standards and Evaluating Research Performance

    ERIC Educational Resources Information Center

    Bacon, Donald R.; Paul, Pallab; Stewart, Kim A.; Mukhopadhyay, Kausiki

    2012-01-01

    Much has been written about the evaluation of faculty research productivity in promotion and tenure decisions, including many articles that seek to determine the rank of various marketing journals. Yet how faculty evaluators combine journal quality, quantity, and author contribution to form judgments of a scholar's performance is unclear. A…

  17. Research on USMC Marksmanship Training Assessment Tools, Instructional Simulations, and Qualitative Field-Based Research

    DTIC Science & Technology

    2003-01-10

    delivery of remedial training through the instructional simulations. ONR is also funding CHI Systems, Inc. to conduct field-based qualitative research on Navy and Marine Corps DL implementations to develop practical guidelines and procedures to support effective DL employment in the Navy and Marine Corps.

  18. Involving children and young people in the development of art-based research tools.

    PubMed

    Coad, Jane; Plumridge, Gill; Metcalfe, Alison

    2009-01-01

    In this article, the authors describe how they worked with children and young people to develop art-based techniques and activities for use in a study exploring family communication about genetic conditions. The article highlights key methodological issues concerning children and young people's participation in research, the concept of what constitutes an art-based activity, and how this was applied in developing art-based data collection tools.

  19. Open Virtual Worlds as Pedagogical Research Tools: Learning from the Schome Park Programme

    NASA Astrophysics Data System (ADS)

    Twining, Peter; Peachey, Anna

    This paper introduces the term Open Virtual Worlds and argues that they are ‘unclaimed educational spaces’ which provide a valuable tool for researching pedagogy. Having explored these claims, the paper describes how the Teen Second Life® virtual world was used for pedagogical experimentation in the initial phases of the Schome Park Programme. Four sets of pedagogical dimensions that emerged are presented and illustrated with examples from the Schome Park Programme.

  20. DataUp: A tool to help researchers describe and share tabular data

    PubMed Central

    Strasser, Carly; Kunze, John; Abrams, Stephen; Cruse, Patricia

    2014-01-01

    Scientific datasets have immeasurable value, but they lose their value over time without proper documentation, long-term storage, and easy discovery and access. Across disciplines as diverse as astronomy, demography, archeology, and ecology, large numbers of small heterogeneous datasets (i.e., the long tail of data) are especially at risk unless they are properly documented, saved, and shared. One unifying factor for many of these at-risk datasets is that they reside in spreadsheets. In response to this need, the California Digital Library (CDL) partnered with Microsoft Research Connections and the Gordon and Betty Moore Foundation to create the DataUp data management tool for Microsoft Excel. Many researchers creating these small, heterogeneous datasets use Excel at some point in their data collection and analysis workflow, so we were interested in developing a data management tool that fits easily into those workflows and minimizes the learning curve for researchers. The DataUp project began in August 2011. We first formally assessed the needs of researchers by conducting surveys and interviews of our target research groups: earth, environmental, and ecological scientists. We found that, on average, researchers had very poor data management practices, were not aware of data centers or metadata standards, and did not understand the benefits of data management or sharing. Based on our survey results, we composed a list of desirable components and requirements and solicited feedback from the community to prioritize potential features of the DataUp tool. These requirements were then relayed to the software developers, and DataUp was successfully launched in October 2012. PMID:25653834
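    The general pattern a tool like DataUp supports can be sketched minimally: pair a tabular dataset with a small metadata record so the data remains self-describing once it leaves the spreadsheet. This is a hypothetical illustration, not DataUp's actual implementation or API.

```python
import csv
import io
import json

def export_with_metadata(rows, fieldnames, metadata):
    """Return (csv_text, metadata_json) for a list of row dicts."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    # Record basic structural facts alongside user-supplied metadata
    # so the file stays discoverable and interpretable later.
    record = dict(metadata, columns=fieldnames, n_rows=len(rows))
    return buf.getvalue(), json.dumps(record, indent=2)
```

    In practice, tools of this kind map the metadata record onto a community standard (e.g., a data-center schema) rather than ad hoc JSON, which is what makes deposited spreadsheets discoverable.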